Tesla’s no good, very bad few weeks continue. After three crashes involving Autopilot prompted probes from the National Highway Traffic Safety Administration, the National Transportation Safety Board, and the Securities and Exchange Commission, the safety advocacy group Consumer Reports has called on the company to disable and rename Autopilot, its semi-autonomous driving feature.
"By marketing their feature as ‘Autopilot,’ Tesla gives consumers a false sense of security," Laura MacCleery, vice president of consumer policy and mobilization for Consumer Reports, said in a Consumer Reports blog post. "In the long run, advanced active safety technologies in vehicles could make our roads safer. But today, we're deeply concerned that consumers are being sold a pile of promises about unproven technology. 'Autopilot' can't actually drive the car, yet it allows consumers to have their hands off the steering wheel for minutes at a time. Tesla should disable automatic steering in its cars until it updates the program to verify that the driver's hands are on the wheel."
But the company isn’t budging, though a spokesperson said they appreciate the “well-meaning advice.”
“Tesla is constantly introducing enhancements, proven over millions of miles of internal testing, to ensure that drivers supported by Autopilot remain safer than those operating without assistance,” a Tesla spokesperson wrote in response to Consumer Reports’ concerns. “We will continue to develop, validate, and release those enhancements as the technology grows. While we appreciate well-meaning advice from any individual or group, we make our decisions on the basis of real-world data, not speculation by media.”
The crux of Consumer Reports’ concerns lies in the confusion the company’s messaging around Autopilot might cause for Tesla owners. In Tesla’s marketing of Autopilot, the company wrote that the feature (which is a notch above adaptive cruise control) can “help the car avoid hazards and reduce the driver’s workload” but that the driver “is still responsible for, and ultimately in control of, the car.”
“Consumer Reports experts believe that these two messages — your vehicle can drive itself, but you may need to take over the controls at a moment’s notice — create potential for driver confusion. It also increases the possibility that drivers using Autopilot may not be engaged enough to react quickly to emergency situations.”
Indeed, a 2015 study conducted by NHTSA found that people took 3 to 17 seconds to take back control of a car when prompted. Google, which has long been a proponent of a completely hands-free approach to self-driving cars in which there is no confusion over who is in control, found that people quickly come to rely heavily on technology once they see that it works, even in a limited capacity.
“As a result, it’s difficult for them to dip in and out of the task of driving when they are encouraged to switch off and relax,” Google, which has been developing fully self-driving cars since 2009, wrote in a report. “There’s also the challenge of context — once you take back control, do you have enough understanding of what’s going on around the vehicle to make the right decision?”
This is a concern that is echoed by some industry experts who have warned against rolling out semi-autonomous technology incrementally because humans tend to become too reliant on it.
"The safest thing to do would be [to produce] a system where either the car is in control under all conditions or humans are completely in control," Missy Cummings, director of Duke’s Humans and Autonomy lab, told Recode in a previous interview. "It’s called unambiguous role allocation, because there is no question who is doing what. The trickier part is the technology isn’t there. If the technology is not there, you can’t guarantee that kind of confidence in the operation under all foreseeable conditions."
This article originally appeared on Recode.net.