The New Mind Control

Aeon

5 min read
5 take-aways
Audio & text

What's inside?

The Internet influences users in a variety of subtle ways, but could tech companies fix an election?


Editorial Rating

9

Qualities

  • Innovative

Recommendation

To what degree are search engines, such as Google, able to influence Internet users’ perceptions? Senior research psychologist Robert Epstein details a series of experiments his team conducted to study the effects of biased search results on people’s opinions and political voting choices. Epstein’s well-explained findings, in turn, reveal the frightening implications of unregulated technology’s capacity to manipulate opinions on a global scale. getAbstract recommends this article to everyone interested in technology and social science trends.

Take-Aways

  • No one knows exactly how Google chooses to rank its search results, but it’s undeniable that most people choose to view the highest-ranked websites displayed for any given search query.
  • In a 2013 experiment, researchers tested a fake search engine – “Kadoodle” – on 102 people from San Diego, California. Participants viewed search results that portrayed certain political candidates in positive ways.
  • In 2014, the researchers performed a similar experiment with 2,150 registered voters in India.
  • Results revealed that ranking search results to “invisibly” portray one candidate in a positive light can shift people’s opinions significantly.
  • If tech companies are allowed to proceed unchecked, their technology makes it possible for an “unseen dictatorship” to rise, using the very rules and tools of “democratic government.”

Summary

No one knows exactly how Google chooses to rank its search results, but it’s undeniable that most people choose to view the highest-ranked websites displayed for a given search query. That being the case, could search rankings sway an undecided user toward a politician or stance on a controversial issue?

“If Google set about to fix an election, it could identify just those voters who are undecided. Then it could send customized rankings favoring one candidate to just those people.”

In a 2013 experiment designed to determine the strength of search engines’ influence, researchers divided 102 people from the San Diego, California, area into three groups. Each group used a fake search engine – “Kadoodle” – which displayed rigged results. Researchers instructed participants to perform searches on the 2010 Australian federal election. Group one viewed search results that positively portrayed one candidate, group two saw results favoring a different candidate, and group three received mixed search results. The outcomes revealed that ranking search results to “invisibly” portray one candidate in a positive light can shift user views by anywhere from 37.1% to 80%. In a 2014 follow-up experiment, the same research team brought Kadoodle to “the largest democratic election in the world” – India’s race for prime minister. The team tested 2,150 undecided, registered voters from across India, instructing them to perform searches on “randomly assigned” candidates: Rahul Gandhi, Arvind Kejriwal and Narendra Modi. The researchers expected participants to be reasonably familiar with these politicians and therefore less susceptible to having their opinions swayed. Surprisingly, the shift in user perceptions deviated only slightly from the San Diego experiment, ranging from 20% to 60%. The results of these experiments gave rise to a new term: the “Search Engine Manipulation Effect,…one of the largest behavioral effects ever discovered.”

“The technology has made possible undetectable and untraceable manipulations of entire populations that are beyond the scope of existing regulations and laws.”

A combination of a lack of competition and an abundance of trust gives Google vast amounts of power. Some evidence suggests that Google’s results favor its financial interests, but because search ranking is an unregulated field, there’s no way to demand that Google “go public” about its biases. In theory, any tech company could use what it knows about its users to rig an election – or worse. If tech companies are allowed to proceed unchecked, their technology makes it possible for an “unseen dictatorship” to rise, using the very rules and tools of “democratic government.”

About the Author

Robert Epstein is a senior research psychologist at the American Institute for Behavioral Research and Technology. He has authored 15 books and is the former editor-in-chief of Psychology Today.
