The Autopilot system that controlled the vehicle during the crash was originally qualified as a driver assistance system. Its capability to take over control of the car in an autopilot manner was added only later, as an after-sales OTA update – an update to the software downloaded to the car. Legally, it is highly questionable whether such a car still has proper approval to operate on public roads, and authorities on both sides of the Atlantic are now investigating the case. Yes, Tesla warned its customers that this was a beta release and that drivers should remain alert at all times. But we all know how few customers actually read software license agreements, and Tesla should have known this as well – at the latest after numerous videos were posted on YouTube depicting drivers who explicitly did not have their hands on the wheel during the ride, some not even sitting in the driver's seat. So Tesla should have understood that it was its duty to act – and to deactivate the system in question whenever the driver did not pay appropriate attention.
This week we also learned that Mobileye, the supplier of the video system that failed to identify a large truck, warned that the software version in use was not qualified to detect crossing traffic; that capability was only planned for a software version scheduled for 2018. Tesla was aware of this limitation.
With its futuristic Autopilot features, Tesla created the impression that progress is easy and merely a matter of determination. The company even fueled a wave of technological euphoria where sound judgment and sobriety would have been more appropriate – even if it would have taken a bit more time to offer these features to the broad public. For good reasons, it takes years to bring new car designs from the development stage to the