Why Predictive Policing Empowers Mass Incarceration in the 21st Century

By: Nadia Chung

2nd place

In a legal system predicated on due process of law and the presumption of innocence, choosing expediency over justice inherently threatens individual liberties. This has been the case since the onset of the War on Drugs in the 1970s. It was far easier to advance a crusade against Black communities than to confront America’s racially prejudiced underpinnings. Thus, mass incarceration rose swiftly with the creation of policies such as the Comprehensive Crime Control Act of 1984 and the Anti-Drug Abuse Act. [1] The history of mass incarceration spans injustices from the subjugation of minority communities to oppressive laws that worsened sentencing and prison conditions. Yet perhaps most troubling to citizens’ civil liberties and civil rights is an issue that has surfaced with the 21st century: the incipient use of AI technology to further expedite an already exploitative system.

By using algorithms to analyze data on prior crimes and identify targets for police intervention, predictive policing aims to prevent crime by forecasting when, where, and by whom future offenses might be committed. Police departments use this output, and its designation of “hotspots,” to determine where to send officers and, often, whom the algorithm expects to commit crimes.
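In its simplest form, this amounts to ranking map cells by historical incident counts and dispatching patrols to the top-ranked cells. The toy sketch below illustrates that input-output loop; all data and names are invented for the example, and deployed systems use far more elaborate statistical models:

```python
# Toy "hotspot" ranking: count past incidents per grid cell and patrol the
# cells with the highest counts. Purely illustrative -- the pipeline is the
# same as in real systems (past reports in, patrol targets out), but the
# model here is deliberately minimal. All data is invented.
from collections import Counter

# (x, y) grid cell of each recorded incident -- hypothetical history
past_incidents = [(2, 3), (2, 3), (2, 3), (5, 1), (2, 3), (5, 1), (0, 7)]

def top_hotspots(incidents, k=2):
    """Return the k cells with the most recorded incidents."""
    return [cell for cell, _ in Counter(incidents).most_common(k)]

print(top_hotspots(past_incidents))  # [(2, 3), (5, 1)] -> where patrols go
```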

Though predictive policing algorithms are frequently defended on the grounds that computers are less prejudiced than humans, and can therefore mitigate some element of bias in determining which areas to police, the reality is that the algorithms amplify systemic biases under a guise of objectivity. Such software relies upon gender, race, socioeconomic status, and location to determine which people or areas meet “the profile of a criminal.” This methodology creates feedback loops that reinforce themselves over time. [2] When police are constantly sent to the same neighborhoods, those will be the neighborhoods where police see the most crime, simply because they happen to be present. Thus, the algorithm learns to send more police to those specific neighborhoods. Under-policed regions, by comparison, will never generate enough recorded crime for the algorithm to self-correct or to recognize comparable crime rates. [3] In United States v. Curry, 965 F.3d 313, the Fourth Circuit found that this disparity both encroaches upon the Fourth Amendment rights of those who live in hotspots and furthers racial bias and profiling in the criminal justice system. [4] When predictive policing algorithms target marginalized communities in this manner, using the software means encroaching upon both the Equal Protection Clause and the Due Process Clause of the Fourteenth Amendment.
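The self-reinforcing dynamic is easy to demonstrate. In the hypothetical simulation below, two neighborhoods have identical true crime rates, but crime is only recorded where officers are dispatched; a single early arrest is enough to make one neighborhood look dangerous and the other invisible (all rates and counts are invented for illustration):

```python
# Hypothetical feedback-loop simulation: the dispatcher always patrols the
# neighborhood with the most *recorded* crime, and crime is only recorded
# where police are present. Numbers are invented.
import random

random.seed(0)
true_rate = {"A": 0.3, "B": 0.3}   # identical underlying crime rates
recorded = {"A": 1, "B": 0}        # one early arrest in A seeds the bias

for day in range(1000):
    patrolled = max(recorded, key=recorded.get)  # follow the data
    if random.random() < true_rate[patrolled]:
        recorded[patrolled] += 1   # observed only because police were there

print(recorded)  # A accumulates hundreds of records; B stays at zero
```

Because neighborhood B generates no records, the data never contradicts the algorithm’s choice, which is precisely why under-policed regions cannot trigger a self-correction.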

Beyond even its discriminatory impact, predictive policing threatens to exacerbate mass incarceration because of its inaccuracy. Using the risk scores assigned to more than 7,000 people arrested in Broward County, Florida, an evaluation of the county’s risk-assessment algorithm found that “Only 20 percent of the people predicted to commit violent crimes actually went on to do so.” [5] Further, as a predictor of recidivism, “the algorithm was somewhat more accurate than a coin flip.” [5] Despite being statistically wrong roughly as often as it is correct, the tool continues to be used to implicate individuals in crimes and to target innocent individuals for arrest.
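The arithmetic behind those figures is worth making explicit. With hypothetical counts chosen only to match the rates quoted above (not drawn from the Broward County data itself), a 20 percent hit rate means four of every five people flagged as future violent offenders were flagged wrongly:

```python
# Hypothetical confusion-matrix arithmetic matching the quoted rates.
flagged = 1000        # people the tool labels likely to commit violent crime
actually_did = 200    # of those, how many went on to do so (20%)

precision = actually_did / flagged
print(f"precision: {precision:.0%}")  # 20% -- 4 of every 5 flags are wrong

# ProPublica put the tool's accuracy for general recidivism near 61%,
# against the 50% a coin flip would achieve.
coin_flip, reported = 0.50, 0.61
print(f"edge over chance: {reported - coin_flip:.0%}")  # about 11 points
```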

In 2011, a Santa Cruz Police Department crime analyst claimed, “The worst-case scenario is that it doesn’t work and we’re no worse off.” [6] That sentiment no longer holds. After years of biased algorithms directing police to unjustly target Black and Hispanic communities, predictive policing has effectively normalized racially motivated over-policing. As such, the weaponization of AI has provided an entirely new means of perpetuating mass incarceration.

If the criminal justice system continues to use predictive policing technology as a crutch to avoid the inconvenience of properly allocating resources, prioritizing efficiency over justice, America will find itself repeating the tragedies of late-20th-century mass incarceration. However, a technology powerful enough to oppress must also have the capacity to mobilize. In Brennan Center for Justice v. NYPD, a Freedom of Information Law request resulted in the NYPD disclosing documents revealing the “mechanics of predictive policing.” [7] Increasing the transparency of these algorithms offers a legitimate path toward repurposing the software into tools that show where to allocate rehabilitation and community-based initiatives. Ultimately, it is not the technology itself that impedes individual liberties and threatens a perpetuation of mass incarceration; rather, it is the weaponization of algorithms that undermines the sanctity of the justice system.

NOTES:

  1. Cohen, Andrew, Kim Taylor-Thompson, and Rahsaan “New York” Thomas. “Race, Mass Incarceration, and the Disastrous War on Drugs.” Brennan Center for Justice, May 17, 2021. https://www.brennancenter.org/our-work/analysis-opinion/race-mass-incarceration-and-disastrous-war-drugs. 

  2. O’Donnell, Renata M. “Challenging Racist Predictive Policing Algorithms Under the Equal Protection Clause.” New York University Law Review 94, no. 3 (2019). https://www.nyulawreview.org/wp-content/uploads/2019/06/NYULawReview-94-3-ODonnell.pdf.

  3. Ibid.

  4. United States v. Curry, 965 F.3d 313 (4th Cir. 2020) (en banc) (8-6 decision).

  5. Angwin, Julia, and Jeff Larson. “Machine Bias.” ProPublica, May 23, 2016. https://www.propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing.

  6. Goode, Erica. “Sending the Police Before There’s a Crime.” The New York Times, August 16, 2011. https://www.nytimes.com/2011/08/16/us/16police.html.

  7. “Brennan Center for Justice v. New York Police Department.” Brennan Center for Justice, October 4, 2019. https://www.brennancenter.org/our-work/court-cases/brennan-center-justice-v-new-york-police-department. 

BIBLIOGRAPHY:

Angwin, Julia, and Jeff Larson. “Machine Bias.” ProPublica, May 23, 2016. https://www.propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing.

“Brennan Center for Justice v. New York Police Department.” Brennan Center for Justice, October 4, 2019. https://www.brennancenter.org/our-work/court-cases/brennan-center-justice-v-new-york-police-department.

Cohen, Andrew, Kim Taylor-Thompson, and Rahsaan “New York” Thomas. “Race, Mass Incarceration, and the Disastrous War on Drugs.” Brennan Center for Justice, May 17, 2021. https://www.brennancenter.org/our-work/analysis-opinion/race-mass-incarceration-and-disastrous-war-drugs.

Goode, Erica. “Sending the Police Before There’s a Crime.” The New York Times, August 16, 2011. https://www.nytimes.com/2011/08/16/us/16police.html.

O’Donnell, Renata M. “Challenging Racist Predictive Policing Algorithms Under the Equal Protection Clause.” New York University Law Review 94, no. 3 (2019). https://www.nyulawreview.org/wp-content/uploads/2019/06/NYULawReview-94-3-ODonnell.pdf.

United States v. Curry, 965 F.3d 313 (4th Cir. 2020) (en banc) (8-6 decision).