A recent report into a fatal 2016 crash involving a Tesla Model S car and an articulated truck has found that both drivers and the company share the blame for the incident, reigniting debate about the future of self-driving cars.
Joshua Brown, 40, was travelling along the highway near Williston, Florida on May 7, 2016, when both he and his car’s autopilot system failed to notice a truck making a left turn across the highway ahead.
“System safeguards were lacking,” NTSB Chairman Robert Sumwalt said as cited by Fortune. “Tesla allowed the driver to use the system outside of the environment for which it was designed and the system gave far too much leeway to the driver to divert his attention.”
The Model S struck the right side of the trailer, smashing into its underside and shearing off the roof of the car in the process. The driver of the truck was uninjured, but Brown died in the crash.
Tesla’s autopilot suite contains multiple systems, including Traffic-Aware Cruise Control and Autosteer lane-keeping. However, the pressure sensors on the steering wheel that are used to monitor a driver’s attentiveness proved insufficient to prevent disaster.
Tesla did update its autopilot suite in the interim, but the latest findings contradict an investigation carried out by the National Highway Traffic Safety Administration, the government’s vehicle safety watchdog, which concluded in January that the crash was simply a case of human error.
The National Transportation Safety Board, an independent federal body that investigates plane, train, and vehicle crashes, concluded its own separate investigation Tuesday and found that Tesla bears some of the responsibility.
“The probable cause of the Williston, Florida, crash was the truck driver’s failure to yield the right of way to the car, combined with the car driver’s inattention due to overreliance on vehicle automation,” the board’s press release states.
Neither driver was fatigued or distracted by a cell phone. The truck driver did have traces of marijuana in his system, but not enough to impair his driving; in any case, there was sufficient line of sight for either driver to have prevented the crash.
Data from the autopilot module indicates the driver was over-reliant on the system and fundamentally misunderstood its limitations.
Tesla’s autopilot was not designed to operate in such circumstances, which highlights an inherent flaw in the system: if it is not designed in a way that prevents the car owner from relying on it exclusively, then it is open to misuse. Such misuse contravenes the agreement each Tesla user must sign before gaining access to the autopilot feature.
Audi, BMW, Infiniti, Mercedes-Benz, Tesla Inc., and Volvo vehicles in the US have been developed with Level 2 vehicle automation systems, which constitute ‘partial automation.’