This summer, the fatal shootings of unarmed black men (in St. Paul, Minnesota, Baton Rouge, and elsewhere) and of police officers (in Dallas and Baton Rouge) reenergized the national debate surrounding race relations and the use of force by police in the United States. Yet amid the now all-too-familiar characters (unarmed black men, armed police officers, bystanders with cameras) emerged a new one: a robot with the power to kill.
Micah Xavier Johnson, the Dallas shooter, will go down in history as a domestic terrorist who killed five officers and wounded nine others. He will also go down in history as the first person killed by an armed police robot. Johnson was killed in a standoff when Dallas police sent in a remotely operated Remotec bomb disposal robot that had been jury-rigged to carry a pound of C-4 plastic explosive.
It is easy to Monday-morning-quarterback the SWAT team's judgment call in Dallas. On one hand, there was an active shooter who had just murdered five police officers protecting a peaceful protest. "Other options would have exposed our officers to grave danger," Dallas Police Chief David Brown said at a subsequent news conference.
On the other hand, the shooter was contained and potentially could have been waited out or incapacitated — possibly even by a robot carrying a nonlethal weapon, such as a "flash bang" grenade.
But to get caught up in that narrow discussion about the tactics in one encounter is not just to second-guess those on the scene but to miss the bigger issue. Robotics is here to stay, and the questions over its use will only grow more pressing. What's more, the discussions we need to have tie back to the societal debates over policing and race relations that got us here in the first place.
Robocops aren’t the future; they’re here
Robocops and robosoldiers may seem like science fiction, but they are already a reality. Some 10,000 unmanned aerial systems (a.k.a. drones) and another 12,000 unmanned ground vehicles now serve in the US military. The US is hardly alone; more than 80 other countries also rely on robotics in their militaries.
Ground-based robots can be car-size, or they can be beer can–size "throw bots" that can be tossed through a window; upon landing, these bots then crawl about, inspecting their surroundings. The system used in Dallas, a Remotec Mark V, weighs just under 800 pounds and is about the size of a large lawnmower.
Despite this diversity, these robots primarily take on two roles: surveillance and explosive-ordnance disposal. That is, they either gather information or help defeat bombs (or both). A system like the Remotec robot might creep up close to a suspicious package. If it turns out to be an explosive, the human operators might defeat the bomb by having the robot tear it apart, shoot high-pressure water into it, or even place a small explosive beside it to blow it up in a controlled manner.
As this tech proved itself in Iraq and Afghanistan, it also spread to police forces. Hundreds of robotic systems are now used in much the same way by the bomb squads and SWAT teams of nearly every major city and many smaller police departments, just as these departments also use surveillance drones.
This technology is becoming more and more common in the civilian world outside of police departments, too. Driverless vehicles are under development at traditional car companies like Ford and Toyota as well as at tech companies like Google, Uber, and Tesla, while an autonomous delivery robot was just authorized to drive on Washington, DC's sidewalks. Designed by the Estonian robotics company Starship Technologies, the robot can carry the equivalent of roughly two shopping bags full of anything you want.
As with drones, technology is outpacing public policy and the law
In all these areas, however, the tech has moved faster than the public policy. Without any congressional debate, we began using robotics to carry out strikes in countries with which we were not at war: more than 500 drone strikes into Pakistan, under rules of engagement that the White House released only years after the fact. So, too, in Dallas, the decision to use the robot to kill the gunman didn't reflect any set policy or doctrine but was made in the midst of the standoff.
On the civilian side, our streets have begun to be treated as sites for "beta tests," where robotic tech gets pushed out for customers to vet and improve, just as with any other app. The difference in the case of robotics is that we're talking about physical objects, not just software. If things go wrong, people can get hurt or even killed, as when a Tesla driving in "autopilot" mode crashed into a truck in Florida. (The cause of that crash remains under investigation.)
There is no reason to reject robotics out of hand in any of these areas. Robots can save money, time, and lives. Military drones mean we don’t have to put pilots in harm’s way; the Dallas Police Department’s robot meant officers weren’t exposed to a heavily armed gunman; the same system involved in the fatal Tesla crash has no doubt helped scores of other drivers avoid potentially deadly accidents.
So the question is less whether we want robotics involved across various areas of life than how we want them involved. And where police and robots are concerned, the debate cannot be separated from the heated discussions about what role we want police to play in society.
Think of the debate over robots as a variant on the controversy surrounding the program through which police forces received surplus military gear. This gear ranged from fax machines and ammunition to up-armored mine-resistant, ambush-protected (MRAP) trucks originally designed for the streets of Iraq. These massive military vehicles were not designed for civilian patrolling, and, as we saw in Ferguson, Missouri, their intimidating presence can be counterproductive, stirring up animosity rather than quelling unrest.
Prior to the incident in Dallas, the robotic systems that police forces had acquired, on their own or through the military, had not been lethally armed. Much as the police don't get surplus armed Apache helicopters, they weren't given systems like the prototype Modular Advanced Armed Robotic System (MAARS), a robot armed with a machine gun.
Where there has been a push to arm robotic policing systems, it has been with what are called "less than lethal" weapons. At trade shows, manufacturers show off prototype policing robots armed with everything from tear gas to Tasers to shotguns firing rubber bullets to — in one case — an electric cattle prod (this was a Chinese riot control robot).
The paradox of "less than lethal" technology
One irony of the Dallas incident is that a frequently cited advantage of these robots is that, with no officer's life at risk, less force may be needed to subdue a suspect. It didn't turn out that way in Dallas: the police department applied the same lethal force rules it would have used had a SWAT team moved in.
"Less than lethal" technology can also be prone to misuse and abuse; it can even affect the psychology of those who wield it, inducing them to turn to force too quickly.
This is where the policy vacuum over robotics and the broader challenges of policing in America are going to come crashing together, connecting the story of new technology to the older debate over community policing. Just as there are a wide variety of ways robotics might be used, there is also a disturbing variability in how police departments around the nation are handling issues that range from training of officers to use of force to civilian relations in general, especially with the African-American community.
In Minnesota, where 32-year-old Philando Castile was killed during a traffic stop, law enforcement experts have found it notable that the officer who shot him had taken a training course that emphasizes a "warrior" mentality as opposed to the philosophy of deescalation that the Department of Justice advocates.
In Baton Rouge, where Alton Sterling was shot, police reacted to protests by donning so much riot gear that Brandon Friedman, a military veteran of the 101st Airborne, commented, "Baton Rouge PD looks ridiculous. I never wore so much armor in combat. This is their own community." Police then repeatedly charged protesters, making an already angry dynamic worse.
In contrast, Dallas’s police department has been a national model for reform and best practices. The year before David Brown took over as police chief, the department had 147 excessive force complaints. Last year it had 13.
Unlike in cities where a confrontational and aggressive approach has turned police-civilian relations poisonous, at the peaceful protest in Dallas, police moved freely and jovially among the crowd. And when bullets rang out, police moved to protect the protesters and, in turn, the protesters aided the police in finding the shooter.
This is why the images that went viral in the aftermath there included Black Lives Matter protesters saluting the bravery of fallen police, rather than, say, an unarmed woman in a dress being grabbed by police in riot gear.
This variability in culture and training across police departments reflects everything from the differing regimens and points of emphasis of police academies around the nation to departments' differing budgets and the legacies of their local histories.
And this same dynamic is precisely why we need to be concerned with the absence of policy in the robotics space. There are all sorts of looming questions: What kinds of robotic systems should be available to law enforcement? Should they be armed, and if so, with what level of weaponry? What kind of training should then be required? Under what conditions should they be used?
There is no simple answer to any of these questions. And yet we are now setting ourselves up for the same dynamic, playing out in police training, use-of-force policy, and police-civilian interactions, that helped spark the events of this summer that so shocked and divided us. Simply put, left to their own devices, some police departments are going to handle these tough questions well, and some are not.
Robots are here to stay. They’re only going to get smarter, and they’re going to be used more often. And now they’ve been used to kill. We can seize this moment by working to establish national policies and legal norms on robotics and their use in policing. The discussion will have to involve the multiple entities that weigh in on both technology policy and policing in our country — including the Department of Justice, Congress, police academy and training organizations, the courts, and the public that will be protected and served (or not) by such systems.
The issues that loom with robotics and policing are complex. But when it comes to the big picture, we do have a simple choice. We can start to wrestle with the questions of what we as a society are willing to accept.
Or we can just let the Ferguson Police Department figure it out on its own.
Peter Warren Singer is a strategist and senior fellow at New America and the author of multiple books on technology and security. Emefa Addo Agawu is a program associate at New America.