
A federal agency says an overreliance on Tesla’s Autopilot contributed to a fatal crash

The National Transportation Safety Board met on Tuesday to determine the cause of the fatal May 2016 Tesla crash.

Images of the Tesla and the truck after the fatal collision | NTSB

The National Transportation Safety Board said that its investigation into a fatal Tesla crash showed that the limitations of the company’s automated driving system, Autopilot, played a role in the May 2016 collision.

The board voted unanimously to accept that the probable cause of the crash was a combination of factors: the failure of the truck driver to yield to the Tesla, and the Tesla driver’s inattention, which stemmed from his overreliance on Autopilot.

The proposal for probable cause reads in part:

“Contributing to the car driver’s overreliance on the vehicle automation was its operational design, which permitted his prolonged disengagement from the driving task and his use of the automation in ways inconsistent with guidance and warnings from the manufacturer.”

This was the first known fatal crash that involved a car that was using automated driving control systems.

The NTSB also voted to approve a series of seven safety recommendations directed at the U.S. Department of Transportation, the National Highway Traffic Safety Administration and car makers with Autopilot-like features. At the core of those recommendations was an emphasis on ensuring that these features can only be used within their set limitations. (The full list of recommendations is below.)

According to the NTSB, Tesla’s Autopilot operated within its limitations but was being used in ways and situations it was not designed for. Moreover, the driver engaged with the steering wheel only seven times, for a total of 25 seconds, in the 40 minutes before the crash.

“The pattern of the use of Autopilot, including the lack of steering wheel interaction and lack of response prior to the crash shows over-reliance on the automation,” NTSB member Dr. Ensar Becic said during the meeting. “The driver’s engagement during operation of a level 2 system is critical for safety.”

For that reason, Tesla’s system issues escalating visual and then auditory warnings to ensure drivers are paying attention; after three warnings, the car slows to a stop with its flashers on. In this case, however, the driver put his hands back on the wheel each time the car issued an initial warning, so the escalation never ran its course.
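As a rough illustration of the escalation logic described above (a minimal sketch, not Tesla’s actual code; the class name, alert labels and reset behavior are assumptions based on the article’s description):

```python
# Illustrative sketch only -- not Tesla's implementation. The names, alert
# labels, and reset behavior are assumptions based on the description above.
from dataclasses import dataclass
from typing import Optional

MAX_WARNINGS = 3  # per the article: three warnings before the car slows to a stop


@dataclass
class HandsOnMonitor:
    warnings_issued: int = 0

    def tick(self, hands_on_wheel: bool) -> Optional[str]:
        """Escalate warnings while the driver's hands stay off the wheel."""
        if hands_on_wheel:
            self.warnings_issued = 0  # driver responded, so escalation resets
            return None
        self.warnings_issued += 1
        if self.warnings_issued == 1:
            return "visual_warning"             # first, a visual alert
        if self.warnings_issued <= MAX_WARNINGS:
            return "auditory_warning"           # then, auditory alerts
        return "slow_to_stop_with_flashers"     # finally, slow to a stop
```

Because any wheel contact resets the counter in a scheme like this, a driver who touches the wheel only at each initial warning never triggers the stronger interventions.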

But Dr. Becic said that simply putting your hands on the steering wheel is a “poor surrogate” for determining driver attention.

In fact, the driver’s last interaction with the car came one minute and 51 seconds before his Model S collided with the truck. He had set his cruising speed at 74 mph; while Autopilot automatically caps the set speed at five miles per hour over the posted speed limit, the driver can still raise the set speed beyond that cap.

The system’s cameras and radar also did not detect the semi-truck that was crossing the driver’s path before the fatal collision. However, the system was not designed to detect crossing traffic; it is intended only for use on limited-access freeways and highways, where crossing or entering traffic is restricted.

Still, Autopilot continued to operate on that road with cross-traffic.

“The system really relies, even if the instructions are very clear, it relies on the driver to adhere to the design limitations of Autopilot,” Dr. Becic said during the meeting. “As I mentioned earlier, it would be much easier for a system to automatically establish that.”

"At Tesla, the safety of our customers comes first, and one thing is very clear: Autopilot significantly increases safety, as NHTSA has found that it reduces accident rates by 40%,” a Tesla spokesperson said. “We appreciate the NTSB’s analysis of last year’s tragic accident and we will evaluate their recommendations as we continue to evolve our technology. We will also continue to be extremely clear with current and potential customers that Autopilot is not a fully self-driving technology and drivers need to remain attentive at all times.”

The problem the board kept returning to was that, though the Autopilot system operated within its limits, it was being used in ways it was not designed to be used. This, Dr. Becic argued, is something that can be remedied by developing the software so that it does not operate outside of its design limits.
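As a sketch of what such a software safeguard could look like (the road types, speed cap and function name here are assumptions for illustration, not any manufacturer’s actual logic):

```python
# Hypothetical design-limit check -- an assumption about how software could
# refuse to engage outside its design limits, not real Autopilot code.
ALLOWED_ROAD_TYPES = {"limited_access_freeway", "limited_access_highway"}
MAX_OVER_LIMIT_MPH = 5  # assumed cap: five mph over the posted limit


def may_engage(road_type: str, posted_limit_mph: int, set_speed_mph: int) -> bool:
    """Return True only when conditions fall within the system's design limits."""
    if road_type not in ALLOWED_ROAD_TYPES:
        return False  # roads with cross traffic are outside the design domain
    if set_speed_mph > posted_limit_mph + MAX_OVER_LIMIT_MPH:
        return False  # refuse set speeds beyond the designed cap
    return True
```

Under a check like this, the crash scenario (74 mph on a road with cross traffic) would fail both conditions, rather than relying on the driver to honor the manual’s warnings.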

The intention of the meeting was to determine the cause of the collision and learn from it.

Here’s the full list of safety recommendations the board proposed and approved on Tuesday:

To U.S. DOT:

Find the data parameters needed to understand [if] the automated vehicle control system [was] involved in a crash. The parameters must reflect the vehicle’s control status and the frequency and duration of control actions to adequately characterize driver and vehicle performance before and during a crash.

To the National Highway Traffic Safety Administration (which did not participate in the investigation, though the agency was asked to):

Develop a method to verify that manufacturers of vehicles equipped with level 2 vehicle automation systems incorporate safeguards that limit the use of automated vehicle control systems to those conditions for which they were designed.

Use the data parameters defined by the U.S. DOT in response to safety recommendation number one as a benchmark for new vehicles equipped with automated vehicle control systems, so that they can capture data that reflects the vehicle’s control status and the frequency and duration of control actions. This captured data should be readily available to, at minimum, NTSB investigators and NHTSA regulators.

Define a standard format for reporting automated vehicle control system data, and require manufacturers of vehicles to report incidents, crashes and vehicle miles operated with such systems enabled.

To car makers with automated vehicle control systems, including the U.S. arms of Audi, BMW, Infiniti, Mercedes, Tesla and Volvo:

Incorporate system safeguards that limit the use of automated vehicle control systems to those conditions for which they were designed.

Develop applications to more effectively sense the driver’s level of engagement, and alert the driver when engagement is lacking while automated vehicle control systems are in use.

To the Alliance of Automobile Manufacturers and Global Automakers:

Notify your members of the importance of incorporating system safeguards that limit the use of automated vehicle control systems to those conditions for which they were designed.


This article originally appeared on Recode.net.