Silicon Valley likes to "move fast and break things." What happens when it makes cars?


Mark Zuckerberg popularized the slogan “move fast and break things” to describe the breakneck pace of innovation at Facebook, and the phrase has become popular across Silicon Valley. So what happens when technology companies start to build a technology — self-driving cars — that can literally move fast and break not just things but people?

It’s a crucial question not only for the major Silicon Valley companies working on self-driving technology — including Google, Uber, and Tesla — but also for regulators. The balance is tricky: If regulators are too lax, people could die from malfunctioning self-driving vehicles. But overregulation could delay the introduction of cars that drive themselves much better than a human driver, costing many more lives in the long run.

So is it time for federal and state regulators to crack down on the self-driving free-for-all? There’s definitely more that officials could be doing. At a minimum, the federal agency in charge of car safety, the National Highway Traffic Safety Administration, could use additional resources to hire more technologists and build the infrastructure they’ll need to adequately monitor the industry’s evolution.

“I've felt for a while that things have been a little bit out of control” on the regulatory front, industry analyst Edward Niedermeyer told me in an interview.

A lot of self-driving car regulations are enforced at the state level, and technology companies have become adept at playing states against each other. In much of the US, “autonomous driving is a free-for-all,” writes Backchannel’s Mark Harris. “Uber taxis are transporting passengers in Pittsburgh, Google’s self-driving prototypes are criss-crossing Texas, and Tesla’s cars are taking over the wheel nationwide, with little official testing or licensing of the technology beforehand.”

But tightening regulation too much could quite literally cost lives. Human-driven cars kill 100 Americans per day, on average. Once self-driving technology is perfected, it could dramatically reduce that death toll, saving thousands of lives every year.

“Am I concerned about self-driving cars? Yes,” says Bryant Walker Smith, a law professor at the University of South Carolina. “But I'm terrified about today's drivers.”

Uber has repeatedly clashed with regulators

Uber CEO Travis Kalanick. Photo by Mike Windle/Getty Images for Vanity Fair

Right now this debate is largely about Uber, a company that has a history of pushing the law to its limits and even misleading regulators.

Uber’s culture is “Silicon Valley on steroids,” Niedermeyer told me. Uber has a reputation as a company with little concern for coloring inside the lines. For years, it has flouted municipal taxi regulations and dared local officials to do anything about it. A recent New York Times story reported that Uber even created a secret, fake version of its app to mislead city officials trying to enforce taxi laws against Uber drivers.

Uber has taken this same confrontational approach when it comes to self-driving car regulations.

In December, the San Francisco Examiner reported that a self-driving Uber car ran a red light in San Francisco. Uber blamed the car’s human driver, telling the New York Times that “this is why we believe so much in making the roads safer by building self-driving Ubers.”

But two anonymous sources at Uber told the Times that the car had actually been under software control when it ran the red light. A software bug, not human error, was responsible, the Times reported.

San Francisco cyclists also complained that Uber’s cars were making unsafe right turns across bike lanes, cutting off cyclists and potentially putting their lives in danger.

Legally speaking, Uber’s cars probably shouldn’t have been on California streets at all. At least that’s Smith’s view. California law requires Uber to obtain a license before it operates self-driving cars on the streets. Uber didn’t get one.

Uber argued that since it planned to have a human driver behind the wheel at all times, it wasn’t affected by the law. But California officials disagreed, and in December Uber was forced to ship its cars out of the state to Arizona, where state officials were more welcoming. The company finally patched things up with California regulators earlier this month.

“These companies hold all the cards”

An Otto self-driving truck delivers beer during a test run in Colorado. (Photo: Otto)

These questions, in many ways, outstrip the current framework for how America regulates auto safety. Federal and state regulators share jurisdiction over car safety regulations. Traditionally, federal regulators oversaw car design while state officials licensed drivers. But in a self-driving world, there’s no longer a clear distinction between the two: The “driver” is a computer program running inside the car.

So far, the National Highway Traffic Safety Administration, the lead federal regulator, has taken a hands-off approach, effectively punting many important questions to the states. Uber and other technology companies have learned to play states against one another to get more favorable regulation.

Niedermeyer is troubled by this dynamic. “You have states and cities competing with each other in order to become a hub for this new technology,” he said. “These companies hold all the cards, the local governments are doing whatever they can to attract them.”

Otto, a self-driving truck startup that Uber bought last year, tested a prototype self-driving truck in Nevada around its first public announcement last May without getting the testing license required by Nevada law. When Nevada pressured Otto to come into compliance, the company left the state and performed its next public demonstration in Colorado instead.

As the largest state and the home of a number of technology companies, California seems to be the only state with the clout to insist that companies follow its safety regulations. Other states worry that if they demand too much, the self-driving revolution will pass them by.

Too much regulation would be worse than too little


It seems inevitable that lax regulation of self-driving cars will lead to some preventable deaths. Still, there’s a good argument that today’s permissive regulatory environment is the best approach.

The reason: While self-driving cars are potentially dangerous, human drivers are definitely dangerous.

“It's so easy to immediately focus on self-driving cars as the new and the scary and forget that every day 100 people die on the road,” Smith said. He says that about 90 percent of those fatalities are caused by human error — errors that self-driving cars could someday avoid.

We don’t have enough data yet to say how today’s self-driving cars compare with today’s human drivers. What does seem likely, however, is that self-driving software will steadily get better over time, while human drivers won’t. And the sooner we reach the point where computers are safer than human drivers, the sooner we can start saving lives.

In a sense, then, one of the big dangers of introducing self-driving cars before they’re ready is that early fatalities could trigger a broader public backlash against the technology. If that happens, regulators could be forced to institute a far more draconian testing and licensing regime, delaying the introduction of self-driving cars by months or even years.

And this could be a particularly big problem if a fatal crash is caused by a company like Uber with a reputation for cutting corners. A string of scandals focused on everything from workplace sexism to callous treatment of drivers has trained people to expect the worst from the company. So if a self-driving Uber car kills someone, it’s likely to trigger a public backlash with negative consequences not only for Uber but for the industry as a whole.