Imagine a computer program that can predict whether a criminal will strike again. Now imagine that program isn't accurate. ProPublica's senior reporters and editors poke holes in one of the dozens of risk assessment programs that determine the fate of defendants across the United States. They found that one popular tool incorrectly flagged black defendants as likely reoffenders twice as often as it did white defendants, exposing some to longer incarceration times. getAbstract recommends this revealing analysis to anyone enthusiastic or skeptical about the use of algorithms in the American judicial system.
About the Authors
Julia Angwin, Jeff Larson and Lauren Kirchner hold senior reporting and editing positions at ProPublica. Surya Mattu is a contributing researcher.