Criminal justice software algorithm used across the US is biased against black inmates, study finds
A tool used to figure out how likely a prisoner is to reoffend is skewed favourably towards white defendants
An algorithm that is used across the US to decide how likely prisoners are to reoffend is biased against black people, an investigation by ProPublica has found.
The COMPAS tool (Correctional Offender Management Profiling for Alternative Sanctions), created by Northpointe, plays a role across the US in determining when criminal defendants should be released.
Under Northpointe’s “risk assessment” score, black people were almost twice as likely as white people to be assessed as “higher risk” of committing another crime.
The investigative study compared the predictions for 10,000 inmates in Broward County, Florida, with the rate of reoffending that actually occurred over a two-year period.
Questions such as “Was one of your parents ever sent to jail or prison?” and “How often did you get in fights while at school?” avoided asking about race outright, but the answers were often correlated with it.
Other questions assessed education levels and employment.
Black defendants were found to be 77 per cent more likely to be flagged as at higher risk of committing a future violent crime, and 45 per cent more likely to be predicted to commit a future crime of any kind.
ProPublica also found that only 20 per cent of the people Northpointe predicted would commit future violent crimes actually went on to do so.
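ProPublica's published methodology is more involved, but the core arithmetic behind such a check is straightforward: compare each risk score with what the defendant actually did over the follow-up period, then break the results down by group. The sketch below, in Python, uses entirely hypothetical data and field names purely to illustrate how a "high risk" hit rate and a group-by-group false-positive rate might be computed; it is not ProPublica's code.

# Illustrative sketch only: hypothetical records of the form
# (group, scored_high_risk, reoffended_within_two_years).
records = [
    ("black", True,  False),
    ("black", True,  True),
    ("black", False, False),
    ("white", True,  True),
    ("white", False, False),
    ("white", False, True),
]

def high_risk_rate(group):
    # Share of the group's defendants who were scored as high risk.
    rows = [r for r in records if r[0] == group]
    return sum(r[1] for r in rows) / len(rows)

def false_positive_rate(group):
    # Share of the group's NON-reoffenders who were nonetheless
    # scored as high risk: the disparity at the heart of the findings.
    rows = [r for r in records if r[0] == group and not r[2]]
    return sum(r[1] for r in rows) / len(rows)

for g in ("black", "white"):
    print(g,
          f"high-risk rate = {high_risk_rate(g):.0%},",
          f"false-positive rate = {false_positive_rate(g):.0%}")

A gap between the two groups' false-positive rates, with the overall hit rate held roughly equal, is the kind of pattern the investigation reported.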
Northpointe stated that it “does not agree that the results of [their] analysis, or the claims being made based upon that analysis, are correct or that they accurately reflect the outcomes from the application of the model.”
Northpointe’s algorithm is just one of dozens of similar tools on the market that are increasingly being used by judges and by probation and parole officers, according to ProPublica.