How Is the Use of Predictive Analytics in the Criminal Justice System Negatively Impacting Black Defendants?

Henry Awere · Published in The Startup · 5 min read · Jul 28, 2020


Introduction

There is a growing trend in the public sector toward using predictive analytics to anticipate and solve some of society's biggest policy challenges. Governments are using predictive analytics tools to help build more effective programs and policies before they are approved and funded. The healthcare industry uses machine learning to detect heart attacks and the onset of Alzheimer's disease. In the Criminal Justice System (CJS), courts are partnering with private and public sector organizations to develop risk assessment tools that produce new ways of thinking about and managing risk, including assessing recidivism.

The use of predictive analytics tools in the CJS has been widely promoted as a new solution at sentencing, one "meant to improve the allocation of resources through objective, consistent, and neutral assessment" that "helps to reduce or eliminate human errors" (Eaglin, 2017), and it has attracted bipartisan interest in criminal justice reform. While some may see these technological advancements in the criminal justice system as progressive, they also reflect an ominous trend toward digital monitoring and decision making without regulatory oversight of the private companies "who build and sell proprietary predictive models using confidential datasets" (Rudin and Ustun, 2015).

Problem/Issue Under Investigation

The introduction of predictive analytics technologies in the CJS requires careful analysis, both because the concept of risk is "central to our legal and criminal justice culture" and because these technologies could further exacerbate the inequities within the CJS and further criminalize communities of colour. The policy challenge for lawmakers is how these private companies will be regulated to ensure that the data being produced is reliable, because machine learning models are only as reliable as the data they are trained on. If the underlying data is biased in any form, structural inequalities and unfair biases will be replicated, and historically we know that people of colour, especially low-income minority communities, have been disproportionately targeted by law enforcement, resulting in more contact with the CJS.
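To make that mechanism concrete, here is a minimal, hypothetical sketch. The data is synthetic and every number is invented, not drawn from any real tool: two groups have the same true reoffense rate, but one group is policed more heavily, so more of its reoffenses are recorded. A model trained on those recorded labels then scores the over-policed group as higher risk.

```python
# Hypothetical sketch: identical true behaviour, biased labels in, biased scores out.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 10_000
group = rng.integers(0, 2, n)            # 0 = group A, 1 = group B (over-policed)
reoffend = rng.random(n) < 0.30          # same true reoffense rate for both groups

# A reoffense only becomes a training label if it is detected,
# and detection is far more likely for the over-policed group.
detected = rng.random(n) < np.where(group == 1, 0.90, 0.45)
label = reoffend & detected

# The "prior arrests" feature is itself a product of differential policing.
priors = rng.poisson(np.where(group == 1, 2.0, 1.0)).reshape(-1, 1)

model = LogisticRegression().fit(priors, label)
risk = model.predict_proba(priors)[:, 1]

print("true reoffense rate A/B:", reoffend[group == 0].mean(), reoffend[group == 1].mean())
print("mean predicted risk A/B:", risk[group == 0].mean(), risk[group == 1].mean())
```

Even though both groups behave identically, the model assigns the over-policed group a higher average risk score, because the only signal it ever sees is the biased arrest record.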

It is widely documented that facial recognition algorithms misidentify people with darker skin at higher rates, leading to the wrongful targeting of criminal suspects based on skin color. Algorithms used to determine credit scores disproportionately identify Black people as high risk and prevent them from buying homes, getting loans, or finding jobs, and automated risk profiling systems disproportionately flag Latino people as illegal immigrants. danah boyd, Principal Researcher at Microsoft Research and founder of Data & Society, noted this shift, saying "it's no longer about what you 'do' but about what you 'might do', and that includes what others do where it implicates you or might influence you."

Lack of Accountability and Transparency in How Data Is Collected

Research conducted by Duke University and MIT, entitled 'Optimized Scoring Systems: Towards Trust in Machine Learning for Healthcare and Criminal Justice', found that in the CJS, "proprietary predictive models can potentially lead to decisions that may violate due process or that may discriminate based on race or gender". Similarly, Jessica Eaglin's research, 'Predictive Analytics' Punishment Mismatch', noted that "successful application of the information produced to individual sentences demands critical reflection about how the actuarial tools are constructed. Otherwise, predictive analytics can create mismatches between what the tools do and the aims of the society". Hannah-Moffat (2018) found that big datasets can act as a 'black box' because "tools are rarely transparent, and their internal mechanics are not typically shared by companies, agencies or governments that own and develop the algorithms".
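By contrast, the 'optimized scoring systems' line of research argues that risk models can be made simple enough to audit by hand. Here is a minimal illustration of that idea, in the spirit of the point-based models that work produces; the items and point values below are invented for illustration, not taken from any deployed tool.

```python
# Hypothetical point-based risk score: every contribution is visible,
# so a wrong input or a miscalculation can be spotted and contested.
def risk_score(age_under_25: bool, prior_arrests: int, employed: bool) -> int:
    score = 0
    score += 2 if age_under_25 else 0
    score += min(prior_arrests, 4)   # cap how much priors can add
    score -= 1 if employed else 0
    return score

print(risk_score(age_under_25=False, prior_arrests=1, employed=True))  # -> 0
```

Unlike a proprietary model, a table of points like this fits on one page and can be recomputed by a defendant and challenged in court.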

Various local and state governments in the U.S. have continued to partner with Northpointe (now Equivant), the private company behind the COMPAS recidivism prediction model, despite public outcry about the lack of transparency and accountability in its tools. Compounding this problem is the fact that the tools may discriminate based on race. "There have been cases such as that of Glenn Rodriguez, a prisoner with a nearly perfect record, who was denied parole as a result of an incorrectly calculated COMPAS score, with little recourse to argue or even to determine how his score was computed". Similar issues have led the European Union to enact a 'right to explanation', which allows individuals to receive explanations for decisions made about them by algorithms.

Various Canadian police agencies have also started using predictive analytics tools. Research conducted by Akwasi Owusu-Bempah and Daniell Kornoff of the University of Toronto, published through the Broadbent Institute and entitled 'Big Data and Criminal Justice: What Canadians Should Know', found that "cities such as Vancouver, British Columbia and London, Ontario have adopted predictive policing software within the last five years". Ottawa's police have also adopted predictive analytics tools; their Strategic Operations Centre, which monitors protests on social media, is an example of how Canadian police are using big data and predictive policing. Similarly, police agencies in Ontario and Saskatchewan have been using a "Risk-driven Tracking Database" (RTD), "which combines information collected by the police, schools, social workers, and other community agencies to track 'negative behaviour', identify potentially at-risk people, and to deploy resources for 'proactive intervention.'"

Conclusion

In a post-COVID world, where governments and public sector agencies around the world will look to use technology more in decision-making to reduce costs, lawmakers and policymakers must understand the risks associated with using proprietary datasets in the CJS without proper government oversight of how the data is aggregated. If these policy issues are not addressed promptly, they will likely reinforce racial bias within the CJS towards people of color.


Henry Awere is the Founder of Strategic Consulting Inc. He holds a Master's degree in Public Policy and a Postgraduate Certificate in Cyber Security.