Contrary to what you might suppose from the name, Tesla’s Autopilot driver assistance system is not meant to replace the driver. You can’t “go on autopilot” and let the car take over — at least not safely.
In its driver materials, Tesla makes clear that drivers using Autopilot must stay engaged in the task of driving. You are expected to remain vigilant and ready to take over if something goes wrong.
Unfortunately, a lot can go wrong. The National Transportation Safety Board (NTSB) has investigated a number of crashes in which the Autopilot feature was engaged before the collision. At least three of those crashes were fatal, and they appear to have involved drivers who assumed they would be safe letting Autopilot handle the driving. In some cases, the cars’ sensors failed to detect obstacles like pedestrians and trucks, and the drivers didn’t react in time.
This has led some to ask whether Tesla’s Autopilot, unlike similar systems sold under different names, misleads drivers into believing it is safe to let the car take over. The name itself may contribute to the problem, but it is not the only issue.
NTSB: Autopilot doesn’t keep drivers engaged
The NTSB recently investigated a high-profile January 2018 Tesla crash in California. The agency found that the driver of a Model S had engaged Autopilot 13 minutes and 48 seconds before the car rammed into a fire truck blocking the carpool lane on Interstate 405. The driver, who was eating, drinking and listening to the radio, did not touch the steering wheel for the final 3 minutes and 41 seconds. Luckily, no one was injured in the crash.
In its report on the accident, the NTSB concluded that Autopilot makes it too easy for the driver to disengage from driving and does not monitor whether the driver’s eyes are on the road. It also faulted the driver, who over-relied on the system and did not pay sufficient attention to the road.
According to Consumer Reports, some driver-assist systems do a better job of keeping drivers’ attention on the road. GM’s Super Cruise, for example, uses cameras to confirm that the driver’s eyes are pointed in the direction of travel. If it senses the driver has looked away, it issues a series of escalating warnings, and if those fail to reengage the driver, the car will pull over.
By contrast, Tesla’s Autopilot senses only whether the driver’s hands are on the wheel. It has no sensors to confirm that the driver’s eyes are on the road.
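To make the contrast concrete, here is a minimal sketch, in Python, of the kind of escalating-warning logic Consumer Reports describes for Super Cruise. It is purely illustrative: the function name, alert levels and time thresholds are invented for this example and do not reflect GM’s or Tesla’s actual software.

```python
from enum import Enum, auto


class Alert(Enum):
    NONE = auto()             # driver engaged, no action needed
    VISUAL_WARNING = auto()   # e.g., a light flashing on the dash
    AUDIBLE_WARNING = auto()  # e.g., a chime or voice prompt
    PULL_OVER = auto()        # warnings failed; bring the car to a stop


def escalation_step(eyes_on_road: bool, seconds_inattentive: float) -> Alert:
    """Pick an alert level based on how long the driver has looked away.

    Thresholds are invented for illustration, not taken from any real system.
    """
    if eyes_on_road:
        return Alert.NONE
    if seconds_inattentive < 4:
        return Alert.VISUAL_WARNING
    if seconds_inattentive < 10:
        return Alert.AUDIBLE_WARNING
    return Alert.PULL_OVER


# Example: a driver who keeps looking away eventually forces a pull-over.
for t in (0, 2, 5, 12):
    print(t, escalation_step(eyes_on_road=False, seconds_inattentive=t))
```

A hands-on-wheel system like Autopilot’s would key the same kind of logic off steering-wheel input instead of gaze, which helps explain why a driver can keep a hand resting on the wheel while looking away and never trigger an alert.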
The agency also noted that, after a fatal Tesla crash in 2016, it recommended that Tesla and five other carmakers “develop applications to more effectively sense the driver’s level of engagement.” Since then, BMW, Mercedes-Benz, Nissan and Volkswagen have responded to the agency with explanations of how their technology keeps drivers engaged. Tesla still has not responded.
Tesla did respond to Consumer Reports, saying that the Autopilot “repeatedly reminds drivers of their responsibility to remain attentive and prohibits the use of Autopilot when warnings are ignored.” It also said it had introduced a number of updates.
Safety groups call for NHTSA to take corrective action
Both Consumer Reports and the Center for Auto Safety have called on the National Highway Traffic Safety Administration (NHTSA) to order Tesla to recall its vehicles with Autopilot until a fix can be developed.
“The time to allow an unregulated, unsafe experiment on our roads is over,” said a spokesperson for the Center. “NHTSA needs to do its job by issuing rules and removing unsafe vehicles from the road until they can meet minimum performance standards.”