WhichCar

Tesla shoulders some of the blame for fatal Autopilot crash

Investigation into fatal Tesla Autopilot crash highlights driver error and software shortcomings

A Tesla Model S

TESLA’S Autopilot system contributed to a fatal crash last year in which a Model S ploughed at highway speed into the side of a truck, killing the electric car’s driver, Joshua Brown, the US roads watchdog has found.

The National Transportation Safety Board overnight announced the findings of its inquiry into a May 2016 crash in Florida. As well as laying some blame on the truck driver who had failed to yield to the Tesla, the “car driver’s inattention due to overreliance on vehicle automation” was also identified as potentially contributing to the crash.

“The NTSB also determined the operational design of the Tesla’s vehicle automation permitted the car driver’s overreliance on the automation, noting its design allowed prolonged disengagement from the driving task and enabled the driver to use it in ways inconsistent with manufacturer guidance and warnings,” it said.

NTSB chairman Robert Sumwalt said the Tesla crash showed that while automation in highway transportation had the potential to save "tens of thousands of lives", until that potential was fully realised "people still need to safely drive their vehicles".

The board determined that Autopilot failed to detect the cross traffic and that both Brown and the truck driver had at least 10 seconds to “observe and respond to each other” in order to avoid the collision.

“Smart people around the world are hard at work to automate driving, but systems available to consumers today, like Tesla’s Autopilot system, are designed to assist drivers with specific tasks in limited environments,” he said.

“These systems require the driver to pay attention all the time and to be able to take over immediately when something goes wrong.

“System safeguards that should have prevented the Tesla’s driver from using the car’s automation system on certain roadways were lacking, and the combined effects of human error and the lack of sufficient system safeguards resulted in a fatal collision that should not have happened.”

Tesla is yet to issue a statement in response to the NTSB’s inquiry.

The report found the Tesla was unable to identify the semi-trailer as it crossed into the Model S’s path. It added that Brown’s “pattern of use” indicated an over-reliance on the assist systems built into the car, and that he did not fully understand their limitations.

“The way in which the Tesla Autopilot system monitored and responded to the driver’s interaction with the steering wheel was not an effective method of ensuring driver engagement,” the report said.

“Tesla made design changes to its Autopilot system following the crash. The change reduced the period of time before the Autopilot system issues a warning/alert when the driver’s hands are off the steering wheel. The change also added a preferred road constraint to the alert timing sequence.”

The NTSB recommended car makers build in better safeguards to prevent drivers from using semi-autonomous systems beyond the conditions they were designed for, called on car makers to collect more data about what causes their semi-autonomous cars to crash, and urged manufacturers to notify the government when one of their vehicles is involved in a crash.

Barry Park
