In a preliminary report issued on Thursday, the US National Transportation Safety Board said that a Tesla Model 3 crash on March 1 in Delray Beach, Florida, occurred while the vehicle’s Autopilot system was active.
“The driver engaged the Autopilot about 10 seconds before the collision,” the report stated. “From less than 8 seconds before the crash to the time of impact, the vehicle did not detect the driver’s hands on the steering wheel. Preliminary vehicle data show that the Tesla was traveling about 68 mph when it struck the semitrailer. Neither the preliminary data nor the videos indicate that the driver or [Autopilot] executed evasive maneuvers.”
In other words, the car drove straight into the First Fleet semitrailer crossing the highway – a truck that arguably should have yielded, as it didn’t have right of way. The Tesla’s roof was torn off, and it appears neither the human driver nor the AI-powered Autopilot was paying any attention at the time. The report, we note, doesn’t assign blame to anyone in particular.
The accident is the third fatal crash in America in which Autopilot is confirmed to have been active. The first was in Williston, Florida, on May 7, 2016, and the circumstances were similar – a Tesla Model S drove into a truck without driver or system intervention. The second was on March 23, 2018, in Mountain View, California, and involved a Tesla Model X; earlier this month, the family of the victim of that crash announced a lawsuit against Tesla alleging Autopilot defects. There have also been accidents in which Autopilot was active that did not lead to any deaths.
“Shortly following the accident, we informed the National Highway Traffic Safety Administration and the National Transportation Safety Board that the vehicle’s logs showed that Autopilot was first engaged by the driver just 10 seconds prior to the accident, and then the driver immediately removed his hands from the wheel,” a Tesla spokesperson told The Register in an emailed statement. “Autopilot had not been used at any other time during that drive. We are deeply saddened by this accident and our thoughts are with everyone affected by this tragedy.”
The NTSB, an independent government agency that investigates accidents, and the NHTSA, a part of the Department of Transportation that regulates vehicles and has the power to demand a recall, are both looking into the Delray Beach crash.
Tesla’s spokesperson said drivers of the automaker’s cars have driven more than a billion miles with Autopilot engaged. “Our data shows that, when used properly by an attentive driver who is prepared to take control at all times, drivers supported by Autopilot are safer than those operating without assistance,” the spinner said. “For the past three quarters we have released quarterly safety data directly from our vehicles which demonstrates that.”
For all those miles driven, Autopilot, a form of super-cruise-control, is far from perfect. A research paper released earlier this year by MIT scientists studying Tesla’s driving-assistance system found that in the context of “tricky situations” – scenarios that may lead to property damage, injury or death – drivers disengaged Autopilot on average every 9.2 miles. Yet rather than pushing for perfection, the paper argues that the system’s imperfections may be what keeps drivers attentive. Drivers are supposed to keep their hands on the wheel, and remain prepared to take control, at all times when Autopilot is engaged.
The question now for Tesla and its customers is whether the potential consequences of flaws outweigh the alleged benefits.
Critics of the company’s self-driving software contend that Tesla’s means of assessing whether drivers have their hands on the steering wheel – as advised when Autopilot is active – is inadequate because it only measures torque, the force applied by the driver to turn the wheel. Therefore, they suggest, it’s not necessarily true that the driver removed his hands from the wheel 10 seconds before the accident. It may be that the driver was holding the wheel but not turning it.
The Register asked Tesla whether it has any data indicating that the driver’s hands had left the wheel, beyond the inference drawn from the lack of steering-wheel torque. The Elon Musk-run automaker has yet to respond.
In a statement made to The Washington Post, David Friedman, VP of advocacy for Consumer Reports, who served as the acting head of the NHTSA in 2014, criticized Tesla’s assistive driving tech for its inability to detect an 18-wheel truck on the highway, and suggested the company’s approach to sensing driver attention is inadequate.
“Tesla has for too long been using human drivers as guinea pigs,” he said. “This is tragically what happens.” ®