How Algorithms Rule Our Working Lives

The Guardian,

5 min read
5 take-aways
Audio & text

What's inside?

In today’s job search, before you get to the interview, you have to make it past the algorithm.

Editorial Rating

8

Qualities

  • Eye Opening
  • Hot Topic

Recommendation

Companies regularly extract information from social media and personality tests before hiring new employees. Computer algorithms use that information to decide whom companies should or shouldn’t hire. Math blogger Cathy O’Neil tackles the rise of algorithmic hiring software and explores the dangerous implications of branding this new “pseudoscience” as “scientific.” She weighs the efficiency and perceived objectivity these systems offer companies against their flaws and ethical failings. getAbstract recommends this eye-opening challenge to human resources officers and job seekers.

Take-Aways

  • In many companies, algorithms decide which résumés and job applications rise to the top of the hiring list.
  • Companies give personality tests and mine social media profiles and financial data to avoid making “bad hires.”
  • Algorithms can sift through thousands of job applications, picking top-rated candidates in seconds. The software assesses possible risks and identifies potential best hires.
  • Humans design these programs, and humans are inherently flawed. Businesses tout the systems as “scientific,” so it’s rare for anyone to question or doubt the conclusions.
  • Companies use such programs on “60% to 70% of prospective workers in the US.” Automatic résumé readers ensure that “some 72% of CVs are never seen by human eyes.”

Summary

When Kyle Behm, an Ivy League-educated and capable young man, applied for an entry-level job at a supermarket, he was sure he’d get it. So when he found out his application had been turned down, it came as a shock. A friend who worked at the supermarket discovered that Kyle had been “red-lighted” because of a personality test he took as part of the application process. Today, personality quizzes for job applicants are commonplace. Programs mine candidates’ social media and financial data for red flags. Software has become the judge, jury and executioner for many applicants. But it’s not just the hiring process that algorithms affect. These systems also control insurance premiums, work schedules, loan applications and performance reviews.

“Given their scale and importance, combined with their secrecy, these algorithms have the potential to create an underclass of people who will find themselves increasingly and inexplicably shut out from normal life.”

There is no denying that hiring algorithms are efficient and can save businesses money. They can sift through thousands of job applications, picking top-rated candidates in seconds. They assess possible risks and identify potential top performers. However, people design these programs, and humans are inherently flawed. Businesses tout the systems as “scientific,” so it’s rare for anyone to question or doubt their conclusions. If only a few companies were using these personality tests, there might be less cause for concern. But the tests “are used on 60% to 70% of prospective workers in the US.” Automatic résumé readers ensure that “some 72% of CVs are never seen by human eyes.” Those with the resources to craft their résumés to make it past the automatic readers have an advantage, while those without such resources remain unemployed.

“Most of these algorithmic applications were created with good intentions. The goal was to replace subjective judgments with objective measurements in any number of fields.”

While data are necessary for companies to improve and operate, the methods of data collection and the interpretation of the data are questionable. In one case, a veteran schoolteacher received significantly different scores on consecutive algorithmic performance assessments even though her teaching methods hadn’t changed. Another program, designed to predict employee longevity, unintentionally favored job applicants who didn’t live in poorer geographic areas. These models block people from opportunities every day, yet many of those rejected never learn that an algorithm made the decision. How long can this continue before people begin to question the accuracy and fairness of these algorithms?

About the Author

Cathy O’Neil blogs at MathBabe.org and is the author of several books, including Weapons of Math Destruction.
