Tesla needs to fix its deadly Autopilot problem

Tesla is facing heat from federal officials following an investigation into a fatal crash involving its Autopilot.

A Tesla electric car crashed into a highway barrier in Mountain View, California, on March 23, 2018. Investigators confirmed that Autopilot was partially to blame.
Rebecca Heilweil covered emerging technology, artificial intelligence, and the supply chain.

Tesla is facing heat from federal officials following another fatal accident involving Autopilot. The National Transportation Safety Board (NTSB) recently found that Tesla’s semi-autonomous driving feature was partially to blame in a 2018 fatal car crash, adding yet another accident to the technology’s already worrisome record. What’s even more concerning is that Tesla doesn’t appear particularly interested in addressing these problems.

That Tesla’s Autopilot has been implicated in a crash isn’t new. In fact, after this investigation, NTSB chairman Robert Sumwalt pointed out that in 2017 his agency called on Tesla and five other carmakers to limit self-driving features and to build better technology for monitoring drivers in semi-autonomous cars. Tesla is the only company that hasn’t formally responded to those recommendations, though it did start warning drivers more quickly after they take their hands off the wheel.

But the company seems unwilling to address its self-driving technology’s shortcomings, or to ensure that its drivers properly understand what the Autopilot feature can and can’t do. The NTSB’s findings serve as a stark reminder that the federal government has a role to play in regulating these technologies, and that its light-touch approach so far doesn’t seem to be working.

“We urge Tesla to continue to work on improving Autopilot technology and for NHTSA to fulfill its oversight responsibility to ensure that corrective action is taken when necessary,” Sumwalt told reporters. “It’s time to stop enabling drivers in any partially automated vehicle to pretend that they have driverless cars.”

Here’s the background: Two years ago, a 2017 Model X with its Autopilot feature engaged was driving along a highway in Mountain View, California, when it struck a concrete barrier at more than 70 miles per hour. The driver died of blunt force trauma injuries.

After a months-long investigation, the agency identified seven safety issues related to the crash, including limitations of Tesla’s crash avoidance system and driver distraction. Among the findings: the driver appears to have been playing a game on an iPhone provided by his employer, Apple, and didn’t notice when Autopilot steered the electric vehicle off course.

“The Tesla Autopilot system did not provide an effective means of monitoring the driver’s level of engagement with the driving task, and the timing of alerts and warnings was insufficient to elicit the driver’s response to prevent the crash or mitigate its severity,” reads the report. “Tesla needs to develop applications that more effectively sense the driver’s level of engagement and that alert drivers who are not engaged.”

The board also found that Tesla needed a better system for avoiding collisions. Like many semi-autonomous driving systems, Tesla’s Autopilot can only detect and respond to situations it has been programmed and trained to handle. In this case, the Model X’s software never detected a crash attenuator, a barrier meant to absorb impact that was itself damaged and out of service at the time, and the car accelerated into it.

Tesla didn’t respond to Recode’s request for comment by the time of publication.

So what happens now? Tesla has argued that its cars are safer than average vehicles, but these crashes keep happening, and fatal crashes involving Autopilot seem increasingly common. Meanwhile, Consumer Reports has continued to find issues with these semi-autonomous driving features. Last year, the organization reported that Autopilot’s Navigate feature could lag “far behind a human driver’s skills.”

Security researchers have also said that it wouldn’t take much to trick these vehicles. Researchers have shown how placing stickers on the road could coax a Tesla into dangerously switching lanes while the Autopilot system was engaged. And last week, the computer security company McAfee released findings that a Tesla using the intelligent cruise control feature could be tricked into speeding by placing a small strip of electrical tape on speed limit signs.

Shortcomings like these are why it’s so important for drivers to pay attention. Nearly three years ago, the NTSB called on carmakers deploying semi-autonomous systems like Autopilot to create better mechanisms for monitoring drivers while these tools are turned on, in part to alert them when they need to take control of the vehicle. Tesla is the only one of the six companies that hasn’t formally responded to the federal agency.

Meanwhile, research from the Insurance Institute for Highway Safety, a nonprofit that’s supported by car insurance companies, found that drivers can misunderstand the autonomous capabilities of their vehicles, including Tesla’s Autopilot.

And Tesla is known for overstating its vehicles’ abilities. On and off in recent years, the company has described its cars as having “full self-driving capabilities” or has advertised that the vehicles have “full self-driving hardware,” despite the need for drivers to stay engaged while on the road. Whenever criticism of this sort of marketing language has reached a breaking point, Tesla has removed it. The Tesla website currently paints a confusing picture of its cars’ capabilities:

Screenshot from Tesla’s site.

All that marketing copy aside, a Tesla using the Autopilot feature is nowhere near a fully autonomous car. The issues that have cropped up around Autopilot have raised concerns about the new safety issues that self-driving vehicles could introduce. More importantly, these issues have bolstered demands for regulators to test this technology more stringently — and hold carmakers accountable when they build dangerous tech.

Whether or not that will actually happen is unclear. The Trump administration has, in fact, encouraged federal agencies not to “needlessly hamper” innovation in artificial intelligence-based technology, and, earlier this year at the Consumer Electronics Show (CES) in Las Vegas, Transportation Secretary Elaine Chao announced new rules meant to standardize and propel the development of self-driving cars. Those rules won’t do much good if the companies leading the charge toward this futuristic technology, like Tesla, refuse to follow or even acknowledge them.

So it’s time for Tesla to do something different. At the very least, the company could answer government regulators’ calls to develop better ways to monitor drivers as it continues to improve its self-driving technology. Autopilot clearly doesn’t live up to its name quite yet, so either the company fixes it, or it risks endangering the lives of its drivers.

For now, please don’t text and drive. It’s dangerous. And if you own a Tesla, definitely don’t text and drive — or play a mobile game — when you’re using Autopilot. That’s potentially even more dangerous, since you might feel a false sense of security. Overestimating the abilities of technology like Autopilot puts your life and the lives of others at risk.

Open Sourced is made possible by Omidyar Network. All Open Sourced content is editorially independent and produced by our journalists.