Last week, Matt McFarland of the Washington Post relayed an amusing anecdote about a cyclist on a fixie and one of Google's self-driving cars.
As the Austin, Texas, cyclist wrote on a biking forum, he recently came to a four-way stop sign a moment after the Google autonomous car. To let the car go first, he did a track stand: a maneuver riders of fixed-gear bikes often do to stand in place without dismounting, which requires turning the front wheel back and forth. This can cause the bike to slightly rock forward and back.
The robotic car began to go, but as it did, the cyclist moved forward about an inch. The car interpreted this as the cyclist proceeding, and its algorithm forced it to stop abruptly. The cyclist stopped, and after a moment, the car began to move again, but then another subtle movement from the cyclist froze it in its tracks.
"We repeated this little dance for about two full minutes and the car never made it past the middle of the intersection," the cyclist wrote. "The two guys inside were laughing and punching stuff into a laptop, I guess trying to modify some code to 'teach' the car something about how to deal with the situation."
It's tempting to interpret all this as a sign of the steep learning curve Google's cars will encounter as they drive more in the complex conditions of the real world. But I think it shows something quite different.
Engineers will probably be able to teach the cars to distinguish between track stands and real movement fairly easily. But the cars will continue to drive with extreme caution and sensitivity, which is absolutely great news for cyclists and pedestrians.
Humans are utterly terrible drivers
To date, Google's cars have traveled nearly 2 million miles in California and Texas but have been involved in only 14 minor collisions, all of which were other drivers' fault. Google has detailed them, and they reveal the many alarming tendencies of human drivers.
Most of these crashes involve Google's cars being rear-ended through no fault of their own. As Google's Chris Urmson writes:
The most recent collision, during the evening rush hour on July 1, is a perfect example. One of our Lexus vehicles was driving autonomously towards an intersection in Mountain View, CA. The light was green, but traffic was backed up on the far side, so three cars, including ours, braked and came to a stop so as not to get stuck in the middle of the intersection. After we’d stopped, a car slammed into the back of us at 17 mph — and it hadn’t braked at all.
It's pretty clear what probably happened here: The driver was distracted, possibly looking down at a cellphone. If Google's car had been a cyclist, that cyclist might be dead.
Cellphones, of course, are a particularly big problem for safe driving. But the truth is that humans are pretty bad drivers in all sorts of ways.
As another Google post detailed, its cars frequently spot people driving on the wrong side of the road, dangerously turning across several lanes of traffic, and proceeding through intersections when there are still cars or cyclists in them. Human drivers zone out, miss cars in their blind spots, and often fail to spot bikes and pedestrians. This is part of why more than 30,000 people die in traffic crashes in the US each year.
Put simply, if a tech company introduced the human as a product engineered to drive cars, it'd go out of business.
Google's cars don't get bored or distracted
Google's cars aren't perfect yet. The real world is a very complicated driving environment, and the track stand story shows they still have a long way to go.
But when it comes to safety, they have the potential to outstrip human drivers in every way imaginable. Their algorithms don't get bored, tired, or angry, and their 360-degree laser sensors mean they don't have blind spots. Just as importantly, they're seemingly programmed to always err on the side of excessive caution.
As a Mountain View, California, resident — who frequently interacts with these cars on the roads near Google's headquarters — wrote in June, "Google cars drive like your grandma — they're never the first off the line at a stop light, they don't accelerate quickly, they don't speed, and they never take any chances with lane changes (cut people off, etc.)." The resident described how the cars wait a few seconds after a pedestrian has completely cleared a crosswalk before beginning to turn through it.
And Google is clearly taking cyclists and pedestrians into account in the design of its cars' algorithms. It has specifically upgraded its software to navigate chaotic city streets, and earlier this year it patented a way for its cars to interpret cyclists' hand signals.
Anyone who's spent much time walking or biking in the US is familiar with the danger human drivers pose when you don't have the protection of a big metal shell around you. As someone who travels mainly by bike, and who has experienced countless uncomfortably close passes and near misses when drivers failed to see me, I find the idea of riding next to self-driving cars far more appealing.