Robotics and artificial intelligence (AI) promise to revolutionize almost every field of human endeavor, and law enforcement is certainly no exception. And not just in the labs and interrogation rooms, but on the front line of policing. With the prospect of robot police officers hitting our streets, it's worth taking a moment to look at the current state of affairs, where we're headed, and whether it's a good idea to give robots guns.
Thirty years ago, the science fiction film RoboCop featured a memorable scene in which a giant mega-corporation unveils a law-enforcement droid called ED-209. Introduced as "the future of law enforcement," the intimidating machine's boardroom debut went disastrously wrong when it failed to recognize that its identified threat had dropped his gun, resulting in a violently effective demonstration of why lethal, autonomous police robots are potentially a terrible idea.
Fast forward to 2017 and we see fiction slowly becoming reality, albeit in a classically incremental way. In recent months we've seen the first ever robotic police officers deployed in China and Dubai. These initial robocops are essentially glorified security guards or mobile information touch screens, tasked with passing general information on to citizens.
The Chinese robot, called AnBot, is a bit like a cross between R2-D2 and a Dalek. Initially rolled out into airports and train stations, the robot has facial recognition technology that can track potential criminals and forward information to a central command staffed by human operators.
Over in Dubai they went for a more humanoid-looking robot, although it is still restricted to similar menial tasks. These machines are the first generation of robotic police, and they most certainly will not be the last. The Dubai Police has already announced its intention to develop a more mobile robot by 2020, and it aims to have a police station staffed entirely by robots, with no humans at all, by 2030.
With robots increasingly replacing humans in jobs that involve menial or repetitive tasks, it is not surprising to see this first wave of police robots handling the more banal administrative work.
In 2015, the Democratic Republic of Congo set up a series of bizarre traffic-directing robots. These bulky, Transformer-like robots were ultimately just supercharged traffic lights with surveillance cameras that could send real-time data to police stations. The idea was that by incorporating surveillance into a humanoid shape, motorists would feel more pressure to obey road rules. Strangely enough, the initiative appears to have worked, with locals responding positively to their new giant traffic-directing overlords.
"There are certain drivers who don't respect the traffic police," one taxi driver told AFP. "But with the robot it will be different. We should respect the robot."
Are we giving robots guns now?
In 2016, Dallas police strapped a pound of C-4 to a bomb-disposal robot and sent it toward a sniper who had been targeting officers. The military had been improvising along similar lines for some time, but this was the first time a police force in the United States had used a lethally armed robot to take down a criminal.
Needless to say, the move was controversial, opening up a Pandora's box of "slippery slope" commentaries questioning how far this kind of robotically controlled lethal force could be taken. Are we giving robots guns now? Well… yes.
Russia's Deputy Prime Minister Dmitry Rogozin recently posted a startling video on social media showing a gun-toting robot being trained to fire two pistols simultaneously. Rogozin followed up the admittedly impressive footage with the claim that Russia wasn't trying to create a Terminator, but guns were being used as a "way of teaching machines to allocate priorities and instantly make decisions."
At the risk of sounding paranoid, anyone raised on a healthy volume of dystopian science fiction knows that the moment someone insists they are not trying to create a Terminator is exactly the moment they create a Terminator.
Less hyperbolically, a more modest gun-toting bot has been developed by an Israel-based robotics firm. Called Dogo, this is a small, Roomba-like robot with a built-in 9 mm Glock pistol. Marketed to SWAT teams, law-enforcement agencies and first responders, Dogo offers live video surveillance and a trademarked "Point and Shoot" interface that can "quickly acquire the target with a simple touch of the screen."
The introduction of a bot like Dogo into the law enforcement robotics market pushes us well past the question of whether we should be arming robots at all. The question that needs to be asked now is whether an armed robot should ever be deployed autonomously.
Autonomous sentry robots
At this very moment, a series of sentry robots dubbed SGR-A1 are posted along the South Korean border facing North Korea. Initial prototypes of these machine-gun-wielding robots, developed jointly by Samsung and Korea University, were produced in 2006. The SGR-A1 has sophisticated object and pattern recognition systems and is programmed to project a loud audio warning when it detects any unidentified person approaching.
As these robots are stationed facing the DMZ on the border between North and South Korea they are set to target any human approaching. If the human cannot provide a voice access code, the robot can sound an alarm or shoot its machine guns. Unsurprisingly, there has been a great deal of controversy surrounding just how much autonomy the robot has over discharging its weapons.
South Korean officials have claimed that the robot cannot fire its weapons without human intervention, although there are several media reports that dispute this, claiming that the system does have an autonomous mode where the machine can decide to fire without human approval.
The SGR-A1 feels like a robot right out of the ED-209 playbook. These types of gun-toting autonomous robots could be operating right now in tense stand-offs like the Korean DMZ, but it's hard to see them appearing on our urban streets anytime soon. The autonomous, weapon-wielding robocop may be a little way off, but if we take autonomy out of the equation, the picture changes dramatically.
In 2012, a lieutenant commander in the U.S. Navy Reserves joined forces with a team of researchers at Florida International University (FIU). The plan was to create a telerobotics system that would allow disabled law enforcement officers back onto the streets, after a fashion. Much like in the film Avatar, the idea was to design a custom-built robot that a disabled officer could control remotely. The so-called "PatrolBot" would mimic the movements of its controller, who directs it via a VR headset from the safety of the police station.
It's a remarkable idea and obviously quite some time from being realistically rolled out into the real world, but the FIU team did create a working, albeit boxy, prototype. At this stage it looks more like the robot from Lost in Space than a sophisticated RoboCop, but as a proof of concept it is fascinating: all the benefits of a robocop, while keeping human police officers out of danger yet still responsible for making decisions.
Of course, we would hope the control mechanisms of such robots are not easily hackable. System security is the huge elephant in the room if we are talking about giving robots guns and having them patrol our streets.
The future of policing is undoubtedly going to be as touched by technology as every other field of human endeavor, but in this area it is worth treading lightly. Unlike that factory bot slipping onto a production line, these robocops will be wandering our neighborhoods and potentially wielding weapons.
Police forces all over the world are also being equipped with drones: flying surveillance platforms that, in some cases, double as weapons platforms.
Drones are already half-human, half-robot. Yes, a human sort-of flies them, but a complex mix of computers, AI and object recognition already handles much of the actual control. When a drone follows or fires, the AI is doing the targeting; the human merely selects the target... so far, anyhow.
It is worth studying, instead, the police forces in countries that use techniques grounded in the psychology of conflict de-escalation to achieve much more peaceful outcomes.
Now there are those who say this approach is namby-pamby and wimpy. However, the countries employing these techniques have jail recidivism rates of 12 to 15 percent, versus a US rate of over 60 percent. So why choose a system with a failure rate roughly four to five times higher?