
Algorithmic Discrimination in Predictive Policing 
#NEIGHBORSNOTNUMBERS

Algorithms are increasingly being used by police departments for law enforcement purposes. Put simply, these programs use historical crime data to try to predict where future crimes may occur.
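To make that idea concrete, here is a minimal sketch of how a hotspot-style predictor might work. This is our own toy illustration, not any department's actual system: the grid cells, incident log, and function name are all hypothetical. The core pattern, though, is the one described above: historical records go in, predicted locations come out.

```python
from collections import Counter

# Toy illustration only: a hypothetical hotspot predictor that ranks
# map grid cells by how many crimes were *recorded* there in the past.
# Real deployed systems are far more complex, but the basic shape --
# historical data in, future predictions out -- is the same.

# Hypothetical historical incident log: (grid_cell, offense) pairs.
historical_incidents = [
    ("cell_12", "robbery"), ("cell_12", "assault"),
    ("cell_12", "robbery"), ("cell_07", "burglary"),
    ("cell_03", "theft"),   ("cell_12", "theft"),
]

def predict_hotspots(incidents, top_k=2):
    """Flag the top_k grid cells with the most recorded incidents."""
    counts = Counter(cell for cell, _ in incidents)
    return [cell for cell, _ in counts.most_common(top_k)]

print(predict_hotspots(historical_incidents))  # ['cell_12', 'cell_07']
```

Notice what the predictor actually measures: not where crime happens, but where crime was recorded. That distinction is the crux of the problem discussed next.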


But the historical data can reflect racist policing practices. Algorithms trained on that data may direct police to target the same communities they have harassed in the past. And what happens when the predictions are wrong?
But we're getting ahead of ourselves: even before we ask about these tools' failure and success rates, are these algorithms even constitutional?
Below, IDH Student Researcher Maddie Britto explores the constitutionality of an algorithm implemented in Chicago.

Predictive Policing in Chicago: Targeting Innocents?

In Chicago, police have tried to curb gun violence by using an algorithm called the Strategic Subject List, or S.S.L. The algorithm was developed by the Illinois Institute of Technology in 2013 in an attempt to make the most efficient use of limited police resources. However, recent research and data suggest that the S.S.L. has been a tool for the Chicago Police Department, and for police departments across the country, to target individuals who have not yet committed a crime. The 4th Amendment protects individuals from unreasonable searches and seizures, and cases of predictive policing will have significant influence on these rights for three key reasons.
First, predictive policing assigns risk scores to individuals and neighborhoods, which may effectively lower the burden of reasonable suspicion and justify more police stops. Second, predictive policing systems digitally redline certain neighborhoods as "hotspots" for crime. Third, the algorithms inherit discrimination from their inputs: data derived from or influenced by corrupt, biased, and unlawful practices, including data that has been intentionally manipulated or "juked," as well as data distorted by individual and societal biases. All of this raises a further question: should being a high-risk individual be grounds for police surveillance? Many scholars, and many citizens, argue no, because it violates numerous elements of the 4th and 5th Amendments. For these reasons, many believe that the new era of predictive policing interferes with constitutional rights and threatens the justice system.
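The digital-redlining critique can itself be sketched in a few lines of code. In this toy model, which is our own illustration and not the S.S.L. or any real system, two hypothetical neighborhoods have the same true crime rate, but neighborhood A starts with more recorded incidents because it was policed more heavily in the past. Patrols are then allocated in proportion to each neighborhood's record count, and every patrol generates new records. The starting numbers and allocation rule are assumptions chosen only to show the shape of the feedback loop.

```python
# Toy feedback-loop model (illustrative assumptions, not the S.S.L.):
# two neighborhoods with the SAME underlying crime rate, but neighborhood A
# starts with more *recorded* incidents because it was policed more heavily.
recorded = {"A": 60, "B": 40}    # biased historical records, not true crime
TRUE_RATE = 0.5                  # identical actual crime rate in both places
PATROLS_PER_ROUND = 100

for round_num in range(1, 6):
    total = sum(recorded.values())
    for hood in recorded:
        # Patrols are allocated in proportion to each neighborhood's score...
        patrols = PATROLS_PER_ROUND * recorded[hood] / total
        # ...and more patrols mean more incidents observed and recorded,
        # even though the underlying rate is the same everywhere.
        recorded[hood] += patrols * TRUE_RATE
    share_a = recorded["A"] / sum(recorded.values())
    print(f"round {round_num}: A's share of records = {share_a:.2f}")

# A's share stays at 0.60 every round: the historical disparity never
# washes out, because each stop it triggers becomes the 'evidence'
# that justifies the next round of stops.
```

Even in this deliberately simple model, the bias never self-corrects: the system keeps sending a disproportionate share of patrols to neighborhood A indefinitely, despite both neighborhoods having identical true crime rates. Critics argue that real deployments behave the same way, only at scale.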
