
Tesla says its self-driving feature was not at fault in deadly crash

The company calls Fortune’s story “fundamentally incorrect.”

Tesla debuts its new crossover SUV, the Model X. | Justin Sullivan / Getty

Tesla said it did nothing wrong by waiting to disclose the fatal May accident involving a car running its semi-autonomous software. And it’s doubling down on its claim that, despite the crash, driving with the system remains safer than driving without it.

Fortune, however, published a story critical of Tesla’s decision to sit on the news for eight weeks, spotlighting the fact that CEO Elon Musk sold more than $2 billion in company shares in the interim.

Musk disputed the story (penned by award-winning Warren Buffett chronicler Carol Loomis) and on Wednesday his company released a lengthy rebuttal, calling it “fundamentally incorrect.”

Tesla says it knew less about the crash than Fortune claims, but the crux of its argument is even broader: Even if we knew more, that doesn’t matter; Autopilot is working just like we said it would.

That leads to Tesla’s bolder claim that Autopilot, its semi-autonomous driving software, wasn’t at fault:

In the moments leading up to the collision, there is no evidence to suggest that Autopilot was not operating as designed and as described to users: specifically, as a driver assistance system that maintains a vehicle's position in lane and adjusts the vehicle's speed to match surrounding traffic.

Tesla’s defense here is that Autopilot is not a fully driverless system — an important distinction in the critical field of autonomous driving.

Tesla nonetheless insists the software improves on existing driving safety. The company calls a collision involving the software a “statistical inevitability,” yet argues the system’s safety record has crossed the “‘better-than-human’ threshold.” (Some experts in the field of autonomy question that logic.)

Since Autopilot was released last year, many Tesla owners have very publicly demonstrated how they push the system well beyond Tesla’s description.

By Tesla’s definition, Autopilot is a souped-up cruise control. Its argument places the feature squarely in the realm of product liability: Drivers know what they are getting into. And so do shareholders — so Tesla was not obliged to disclose the crash to them. As evidence, the blog post points to the fact that Tesla’s shares “traded up” the day the regulatory investigation was announced. (In fact, they sank before recovering.)

Still, that does not change the fact that Tesla is the first company to widely test beta software in vehicles without professional drivers — and with very real consequences.

Earlier on Wednesday, regulators announced they were investigating whether a recent Tesla accident involved Autopilot.

This article originally appeared on Recode.net.
