- Eye Opening
When most people think about the human capacity for reason, they imagine that facts enter the brain and valid conclusions come out. Science reveals this isn’t the case. People’s ability to reason is subject to a staggering number of biases. But what if the human capacity for reason didn’t evolve to help us solve problems? What if its purpose is to help people survive being near one another? getAbstract recommends Pulitzer Prize–winning author Elizabeth Kolbert’s thought-provoking article to readers who want to know why people stand their ground, even when they’re standing in quicksand.
- Human thinking is deeply flawed and prone to predictable biases.
- Cognitive biases may have evolved to help humans argue as they thrived and cooperated in close-knit groups.
- People often mistake the boundaries between their knowledge and the knowledge of others in their social group.
- People tend to fall for arguments that evoke strong emotions rather than those based in fact.
- When people are forced to examine the details of an issue, they tend to admit that they know less than they previously thought; this, and science, offer hope for humanity.
In the mid-1970s, Stanford University began a research project that revealed the limits of human rationality; clipboard-wielding graduate students have been eroding humanity’s faith in its own judgment ever since. Why is human thinking so flawed, particularly if it’s an adaptive behavior that evolved over millennia? Cognitive scientists Hugo Mercier and Dan Sperber have written a book in answer to that question. In The Enigma of Reason, they advance the following idea: Reason is an evolved trait, but its purpose isn’t to extrapolate sensible conclusions from the available data. For people living in close-knit groups, there is often more to be gained from winning disputes than from actually solving problems. “My-side bias” helps people spot the flaws in another person’s argument while remaining blind to the weaknesses in their own.
“The vaunted human capacity for reason may have more to do with winning arguments than with thinking straight.”
Another book, The Knowledge Illusion: Why We Never Think Alone, by professors Steven Sloman and Philip Fernbach, adds to the theory that flawed thinking has roots in human sociability. They argue that people are often unaware of the boundaries between their own knowledge and that of the people on whom they rely. This ignorance has worked to society’s advantage, however, because “if everyone had insisted on, say, mastering the principles of metalworking before picking up a knife, the Bronze Age wouldn’t have amounted to much.”
“They cite research suggesting that people experience genuine pleasure – a rush of dopamine – when processing information that supports their beliefs.”
Despite the historical benefits of the socially shaped human brain, it’s clear that my-side bias and other flawed modes of thinking cause problems in modern times. Nowhere is this clearer than in politics. Jack Gorman and Sara Gorman, authors of Denying to the Grave: Why We Ignore the Facts That Will Save Us, pinpoint the problem: People disregard accurate information, tending instead to fall for arguments that appeal to emotion.
Hope springs eternal, however, particularly for researchers who’ve discovered that people will amend inflated assessments of their own knowledge when asked to explain the details of a policy or process. If people are confronted with their lack of knowledge, they may seek out facts accordingly. Science, too, offers hope, because when done right, reproducible experiments help eliminate bias.
About the Author
Elizabeth Kolbert is the Pulitzer Prize–winning author of The Sixth Extinction: An Unnatural History. She has written for The New Yorker since 1999.