
NTSB says Tesla Autopilot was partly to blame for 2018 crash

The National Transportation Safety Board (NTSB) cited both driver error and Tesla’s Autopilot design as the probable causes of a January 2018 crash in which a Model S slammed into a parked fire truck at about 31 mph. According to the report, the driver was distracted and did not see the fire truck. But the NTSB says that Tesla’s Autopilot was also at fault, as its design “permitted the driver to disengage from the driving task.”

The driver reportedly had Autopilot engaged and was following closely behind a large SUV or truck. The lead vehicle changed lanes to move around a fire truck that was parked in the lane ahead. The Tesla driver claimed he was drinking coffee and eating a bagel and did not see the fire truck. When the lead vehicle changed lanes, the Model S accelerated. About 0.49 seconds before the crash, the vehicle detected a stationary object in the road and displayed a warning, but it was too late.

The vehicle’s Autopilot didn’t detect driver-applied steering wheel torque for the last three minutes and 41 seconds before the crash. And given the driver’s admitted distractions, the NTSB says the driver was likely over-reliant on the vehicle’s driver assistance system.

“Driver errors, Advanced Driver Assistance system Design, led to Jan. 22, 2018, Culver City, CA, highway crash, according to NTSB Highway Accident Brief 19/07 issued Wednesday,” the NTSB said in a tweet: https://t.co/hozLB1zA7F

According to Reuters, the Center for Auto Safety, a consumer watchdog group, said the NTSB report should prompt the National Highway Traffic Safety Administration (NHTSA) to “do its job and recall these vehicles … A vehicle that enables a driver to not pay attention, or fall asleep, while accelerating into a parked fire truck is defective and dangerous.” Engadget has reached out to Tesla for comment.

Tesla’s Autopilot was engaged in at least three fatal US crashes, two of which are still under investigation by the NTSB and NHTSA. Oddly enough, the January 2018 crash wasn’t the only time an admittedly distracted driver crashed into the back of a fire truck while using Autopilot. The system does issue “hands on warnings” and Tesla advises drivers to keep their hands on the wheel. But as others have pointed out, calling the driver assist features “Autopilot” may be a bit misleading.


Author: Christine Fisher
Source: Engadget

