The problems: Police shootings, racial profiling, excessive force, corruption.
The solution: Robot cops.
At least, that’s what some people think. As self-driving cars become more common on the roadways, Ford has filed a patent for a law-enforcement version – a self-driving police car that could pull over speeders.
Robot cop champions say artificial intelligence (AI) will make policing less violent, more just, and less vulnerable to human error. The truth is more likely that today's AI is nowhere near sophisticated enough to handle the nuances of modern policing.
And shouldn’t human emotion be a part of effective policing?
Robot Cops Are Your Friend!
In theory, robot cops could eliminate many of these problems. A properly programmed robot wouldn’t decide which suspects to pull over – or shoot – based on the color of their skin. AI would be impervious to emotional appeals to get out of speeding tickets.
Robots don’t need money, so bribes wouldn’t lead them astray. And a bad night’s sleep wouldn’t make a robot grumpy or more likely to use unnecessary force.
A robot would use an algorithm to decide whether to pull over a driver based solely on their driving behavior – not on the color of their skin, their fashion choices, or whether they “belong” in a particular neighborhood.
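To see what "based solely on driving behavior" would mean in practice, here is a minimal sketch of such a decision rule. The function name, inputs, and the 10-mph tolerance are all hypothetical choices for illustration, not anything Ford or any police agency has published:

```python
# Hypothetical sketch of a rules-based traffic-stop decision that
# looks only at observed driving behavior, never at who the driver is.

def should_stop(speed_mph: float, limit_mph: float, ran_red_light: bool) -> bool:
    """Flag a vehicle for a stop based solely on driving behavior."""
    over_limit = speed_mph - limit_mph
    # Assumed tolerance: stop only if more than 10 mph over the limit,
    # or if the driver ran a red light.
    return ran_red_light or over_limit > 10

# The decision is identical for every driver with the same behavior:
print(should_stop(72, 55, False))  # True: 17 mph over the limit
print(should_stop(58, 55, False))  # False: within tolerance
```

The point of a rule like this is that nothing about the driver – race, clothing, neighborhood – appears anywhere in the inputs, so it cannot influence the outcome.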
Robots could make traffic stops safer for everyone involved. Anytime police officers approach a suspect, they may experience an understandable fear of violence. That fear makes officers more likely to make poor decisions, use unnecessary force, and fall back on preexisting biases when deciding how to interact with suspects.
At the same time, the civilian being pulled over would not have to worry about whether the approaching officer is a racist or a bully. The cop would be just a machine, devoid of emotion and free of biases … in theory.
Wait, Robot Cops Are Not Your Friend!
As computer scientists like to say, “Garbage in, garbage out.”
If the people programming robot cops have racial biases, then the machine could be designed to pull over minorities more often. In fact, any biases or weaknesses that humans suffer could easily be passed on to AI – intentionally or unintentionally.
A report by ProPublica in 2016 put a spotlight on exactly that kind of embedded bias in AI. In some U.S. courtrooms, for example, judges are using AI to predict which defendants are more likely to commit another crime.
If the software says you are likely to re-offend, you get a harsher sentence. While this AI is meant to eliminate racial bias, ProPublica found that it still often recommends stiffer sentences for black defendants, because the bias is baked into the software and the historical data it relies on. Robot cops are almost certainly going to have similar issues in the data that they rely on.
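A toy example makes "garbage in, garbage out" concrete. The records below are entirely made up, and "neighborhood" stands in as a proxy for a protected attribute. A naive model that simply learns the historical stop rate will reproduce whatever bias the historical decisions contained, even though race never appears in its inputs:

```python
# Made-up historical records: (neighborhood, was_stopped).
# "neighborhood" acts as a proxy for a protected attribute.
history = ([("north", True)] * 80 + [("north", False)] * 20
           + [("south", True)] * 20 + [("south", False)] * 80)

def learned_stop_rate(records, neighborhood):
    """Predict stop likelihood the naive way: copy the historical rate."""
    outcomes = [stopped for (n, stopped) in records if n == neighborhood]
    return sum(outcomes) / len(outcomes)

print(learned_stop_rate(history, "north"))  # 0.8 - bias in, bias out
print(learned_stop_rate(history, "south"))  # 0.2
```

If the past stops were skewed, the "objective" model dutifully predicts skewed stops going forward – which is exactly the pattern ProPublica described in the sentencing software.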
SC Criminal Defense Lawyers in Myrtle Beach, Conway, Charleston, and Columbia
Robot policing is coming, although there is no telling how far off it is. When it arrives, it will bring a slew of legal issues, including Fourth Amendment issues, due process issues, right to counsel issues, and other unanticipated problems that will almost certainly arise.
In the meantime, if you have been arrested by an old-fashioned human police officer, your Myrtle Beach criminal defense attorney at Coastal Law is here to help. Schedule a free consultation today to discuss the facts of your case by calling (843) 488-5000 or filling out our online form.