Democrats push back against AI crime reduction technology

In case you aren’t already familiar with it, ShotSpotter is a California company that produces gunfire detection systems for law enforcement agencies. Its equipment includes sensitive microphones placed around primarily urban areas, feeding audio back to a centralized computer that uses artificial intelligence to pick out gunshots from all the other noises of the city and alert the police to the location of the shots. These systems are not cheap, so Congress previously allocated funding to the Department of Justice to award grants to cities seeking financial aid to install their own ShotSpotter systems. But now, Oregon Democratic Senator Ron Wyden is calling for an investigation into those grants. Why? Because, like everything else these days, ShotSpotter is supposedly racist. Insert your own facepalm meme here. (Associated Press)

A Democratic senator said the U.S. Justice Department needs to look into whether the algorithm-powered police technologies it funds contribute to racial bias in law enforcement and lead to wrongful arrests.

Sen. Ron Wyden, an Oregon Democrat, was responding to an investigation by The Associated Press published Thursday about the possibility of bias in courtroom evidence produced by an algorithm-powered gunshot detection technology called ShotSpotter. The system, which can be funded by Justice Department grants, is used by law enforcement in more than 110 U.S. communities to detect gunfire and respond to crime scenes faster.

Wyden stated, “algorithms and technologies used during investigations, like ShotSpotter, can further racial biases and increase the potential for sending innocent people to prison.”

Before addressing the silliness of these claims, let’s revisit a few basic facts about the technology behind ShotSpotter and similar systems. There are no cameras involved and no facial recognition of any sort. The system uses microphones that record sounds. ShotSpotter has no idea who might be in the area where it detects gunshots, nor what color their skin might be. It’s just listening for gunfire. And given how densely populated and developed most of our larger cities are, and the ubiquitous laws against discharging a firearm within 500 feet of a building (more, in some cases), if you’re hearing gunfire in a city that isn’t coming from a licensed firing range, there’s probably a need for the police to show up.
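For readers curious how a network of microphones can pinpoint a bang with no camera involved, the basic idea is time-difference-of-arrival (TDOA): the same sound reaches each sensor at a slightly different time, and those differences narrow down where it came from. The following is a minimal toy sketch of that geometry, not ShotSpotter’s proprietary classifier or solver; the sensor layout, speed-of-sound constant, and brute-force grid search are all invented for illustration.

```python
# Toy TDOA localization sketch (illustrative only, NOT ShotSpotter's algorithm).
# Four hypothetical microphones hear the same bang at slightly different times;
# we search a grid for the point whose predicted arrival-time differences best
# match the observed ones.
import math

SPEED_OF_SOUND = 343.0  # meters per second, roughly, at room temperature


def simulate_arrivals(source, sensors, t0=0.0):
    """Arrival time at each sensor for a bang at `source` emitted at time t0."""
    return [t0 + math.dist(source, s) / SPEED_OF_SOUND for s in sensors]


def locate(sensors, arrivals, grid_step=1.0, extent=500):
    """Brute-force grid search over an extent x extent area (meters).

    Returns the grid point minimizing the squared error between predicted
    and observed arrival-time differences (sensor 0 is the reference).
    """
    observed = [t - arrivals[0] for t in arrivals]
    best, best_err = None, float("inf")
    coords = [i * grid_step for i in range(int(extent / grid_step))]
    for gx in coords:
        for gy in coords:
            travel = [math.dist((gx, gy), s) / SPEED_OF_SOUND for s in sensors]
            predicted = [t - travel[0] for t in travel]
            err = sum((p - o) ** 2 for p, o in zip(predicted, observed))
            if err < best_err:
                best, best_err = (gx, gy), err
    return best


if __name__ == "__main__":
    # Hypothetical 400 m square of sensors; a "shot" fired at (120, 250).
    sensors = [(0.0, 0.0), (400.0, 0.0), (0.0, 400.0), (400.0, 400.0)]
    arrivals = simulate_arrivals((120.0, 250.0), sensors)
    print(locate(sensors, arrivals, grid_step=5.0))
```

Note what is absent from even this toy version: there is no input describing people at all, only sound arrival times, which is the point the paragraph above is making.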

Wyden and his colleagues point to cases such as that of 65-year-old Chicago resident Michael Williams, who was charged with murder in a case where ShotSpotter alerted police to gunfire. He spent nearly a year in jail before his case was dismissed when prosecutors said they had insufficient evidence. But that wasn’t the fault of ShotSpotter. The system can’t issue arrest warrants; it alerted the Chicago PD to the fact that shots were detected. If they got the wrong guy, that’s on the investigating officers. Or perhaps they had the right guy but couldn’t build a strong enough case.

These complaints are yet another case of people crying racism against the criminal justice system because it produces results they don’t like. In most major cities, the highest concentration of illegal gunfire is going to show up in the neighborhoods with the most gang activity and the highest crime levels. Sadly, those tend to be majority-minority neighborhoods in most cities, so obviously you’re going to see disproportionate numbers along racial lines.

I was just discussing this during a segment I participated in on a Washington, D.C. radio station yesterday. We were talking about the judge who declared a federal deportation law unconstitutional. Her reasoning was that a disproportionate number of Mexican and Hispanic suspects were being deported. That’s a shocking revelation when the vast majority of people coming into the country illegally are coming from Mexico, right?

To get our growing crime problems under control, we need to focus on making sure that we’re finding the correct, guilty parties and putting them on trial. Focusing on how many of them tick which demographic boxes only weakens the system and makes the country less safe.
