Machine Bias

There’s software used across the country to predict future criminals. And it’s biased against blacks.

ProPublica

5 min read
5 take-aways
Audio & text

What's inside?

Racial disparities show up at all levels of criminal justice. What if your efforts to be fairer backfired?

Editorial Rating

8

Qualities

  • Controversial
  • Analytical
  • Bold

Recommendation

Imagine a computer program that can predict whether a criminal will strike again. Now imagine that the program isn’t accurate. ProPublica senior reporters and editors poke holes in one of the dozens of risk-assessment programs that determine the fate of defendants across the United States. They found that one popular tool incorrectly flagged black defendants as likely reoffenders twice as often as it did white defendants, exposing some of them to longer incarceration. getAbstract recommends this revealing analysis to anyone enthusiastic or skeptical about the use of algorithms in the American judicial system.
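The disparity behind that finding is a gap in false positive rates: the share of defendants who did not reoffend but were nonetheless scored as high risk. As a minimal sketch, assuming a hypothetical CSV of scored defendants with made-up column names (race, risk_level, reoffended), the comparison could be computed like this:

```python
import pandas as pd

# Hypothetical data: one row per defendant with race, the tool's risk label,
# and whether the person actually reoffended during the follow-up period.
df = pd.read_csv("scores.csv")  # assumed columns: race, risk_level, reoffended

# False positive rate per group: among defendants who did NOT reoffend,
# the fraction the tool nevertheless labeled high risk.
did_not_reoffend = df[df["reoffended"] == 0]
fpr_by_race = (
    did_not_reoffend.groupby("race")["risk_level"]
    .apply(lambda labels: (labels == "high").mean())
)
print(fpr_by_race)
```

A roughly twofold gap between the groups in that output is the kind of disparity the article reports.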

Summary

Police arrested Brisha Borden and Vernon Prater separately for petty theft. But a computer program said Borden, who’s black, was more likely to reoffend, although Prater, who’s white, had a worse record. While Borden stayed out of trouble, Prater ended up serving time in prison for subsequent burglary and theft. 

Dozens of US courts, and perhaps soon the federal prison system, rely on risk assessments at every stage of the criminal justice process to set bond amounts and determine sentence lengths, even though the scores have little predictive value: merely 20% of the people predicted to commit violent crimes actually went on to do so.
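The predictive value in question is essentially the tool’s positive predictive value for violent recidivism: of the defendants flagged as likely violent reoffenders, what share actually committed a violent crime later? A short sketch, using the same hypothetical file and made-up column names (violent_risk, violent_reoffense):

```python
import pandas as pd

# Hypothetical data: assumed columns violent_risk ("high"/"low") and
# violent_reoffense (1 if the defendant later committed a violent crime).
df = pd.read_csv("scores.csv")

# Positive predictive value: among defendants flagged as high risk for
# violent recidivism, the fraction who actually reoffended violently.
flagged = df[df["violent_risk"] == "high"]
ppv = (flagged["violent_reoffense"] == 1).mean()
print(f"Flagged defendants who reoffended violently: {ppv:.0%}")
```

A value of around 20% would match the figure cited in the summary.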

About the Authors

Julia Angwin, Jeff Larson and Lauren Kirchner hold senior reporting and editing positions at ProPublica. Surya Mattu is a contributing researcher.


Comment on this summary

  • N. P. 2 years ago
    The problem and dilemma are laid out very clearly, and the summary mentions some of the key factors. It could be even better with a few more key facts and a bit more length.
  • A. J. 6 years ago
    Thus, machine bias (that is, programmed bias) is simply an extension of faulty individual and societal biases, which rest on biased laws (still in existence) created deliberately by biased individuals so that the dominant system functions as it does.