Understanding the concept of cognitive biases
With the readings by David McRaney and Drew Calvert in mind, write an essay in which you discuss the challenges that cognitive biases can pose in our attempts to learn, make decisions, and solve problems. In what ways can an awareness of such biases be of use to us in important areas of life?
In your essay, you must convey a clear understanding of the texts, either by integrating them into your argument or, at least, by summarizing them. Your essay should include no fewer than two citations, in MLA style, from each of the readings. Citations should include page numbers, not paragraph numbers.
Take some time at the beginning of the exam period to plan and organize your thoughts. Leave time at the end to proofread and make corrections. You may use a dictionary. Any time you use the exact words from the readings, you must use quotation marks. Any time you quote, paraphrase, or summarize, you must refer to the author by his or her last name.
It’s hard to venture online these days—or switch on any cable network—without coming across a heated discussion over “fake news.” Basic facts and figures now appear to be under negotiation and, for many media consumers, it can feel as if we are living through an entirely new dystopian era, each news cycle or press conference sending us further down the rabbit hole. But although the term “fake news” reflects our troubled political moment, the phenomenon is nothing new, and neither is the psychology that explains its persistence. “There’s a tendency for people to say, ‘Well, given the social media channels we have now, these things can spread more quickly and have a greater effect than ever before,’” says Adam Waytz, Associate Professor of Management and Organizations at the Kellogg School. “There’s actually more to it than that. Many of us remember when the most prominent news outlets in the world were reporting Iraq might have weapons of mass destruction. That was before Facebook and Twitter.”
To understand how people in the same country (or same family) can have such vastly differing takes on reality, Waytz recommends we focus not on the role of social media but on that of social psychology—in particular, the cognitive bias that stems from our tribal mentalities. To help explain our enduring susceptibility to fake news, he points to two well-known psychological concepts. The first is “motivated reasoning,” our drive to believe whatever confirms our opinions. “If you’re motivated to believe negative things about a politician, you’re more likely to trust outrageous stories about her,” says Waytz. “Over time, this can lead to a false social consensus.” The second concept is “naïve realism,” our tendency to believe our view of reality is the only accurate one, that people who disagree with us are necessarily uninformed, irrational, or biased. This explains the chasm in political discourse (instead of disagreeing with our opponents, we discredit them) and why some are quick to label any report that challenges their worldview as fake.
The impact of cognitive biases on decision-making and problem-solving
Much of our susceptibility to fake news has to do with how our brains are wired. We like to think our political convictions correspond to a higher truth, but in fact they might be less robust and more malleable than we realize. To some extent, says Waytz, our political beliefs are not so different from our preferences about music or food. “There’s an assumption that fake news exacerbates polarization. But it might be the case that polarization exacerbates fake news.” In one study, Waytz and his fellow researchers presented participants with a number of statements. These included factual statements that could be proven or disproven (such as “the very first waffle cone was invented in Chicago, Illinois”), preference statements that people could assess subjectively (such as “any ice cream flavor tastes better when served in a crunchy waffle cone”), and moral-political belief statements that people could assess in terms of right or wrong (such as “it is unethical for businesses to promote sugary products to children”). In the study, some participants were asked to read and rate statements as resembling a fact, a preference, or a moral belief, while a second group of participants had their brains scanned using fMRI while reading and evaluating each statement. Waytz found that, in both groups of participants, people processed the moral-political beliefs more like preferences than like facts. Not only did participants directly rate moral-political beliefs as “preference-like,” but, says Waytz, “when they read moral-political statements while having their brains scanned, the scans showed a pattern of activity that’s comparable to preferences.”
Though it may be disconcerting that our brains treat political beliefs like ice cream flavors, it also suggests that certain beliefs—like certain preferences—are susceptible to change. “We’ve all had the experience of disliking a band but later becoming a fan,” Waytz says, “and our taste for certain foods certainly evolves throughout the course of our lives.” This is particularly true for beliefs with mixed public consensus. A belief like “child labor is acceptable,” against which consensus is high, is processed much like a fact. But more controversial beliefs, such as “dog racing is unacceptable,” are more susceptible to persuasion and attitude change and are overwhelmingly a product of the social consensus within a specific community. This is why “fake news” is not just about social media or our tendency to skim the news—though sites like Twitter and Facebook do give misinformation the channels to spread at an unprecedented pace, and six in ten Americans only read headlines. Whatever the news source, the combined effects of motivated reasoning, naïve realism, and tribalism prevent us from reaching objective conclusions. According to Waytz, this is also why challenging falsehoods online might be a fool’s errand. Satisfying though it may be to correct the inaccuracies or lies posted by political opponents, deferring to the official record does not change the underlying social dynamic. “One thing we’re learning,” Waytz says, “is that fact-based arguments don’t always work.” (See Dartmouth political science professor Brendan Nyhan’s 2014 study, which found that even presenting parents with hard scientific evidence that vaccines do not cause autism did nothing to persuade those who already held that belief.)
So, how do we overcome biases and counteract the polarization that fuels “fake news”? Waytz says social psychology also points to a way forward. Encouragingly, there is evidence that when you alert people to their biases, they tend to succumb to them less. “When people are told, ‘Hey, this bias exists,’ even the most hawkish among them are more conciliatory,” Waytz says. Some studies show that people can overcome naïve realism by acknowledging one of their opponent’s legitimate (or semi-legitimate) points. “If a Democrat and a Republican get together, and you have each of them offer a single argument from their opponent’s side, it makes them more open to the idea that their reality is not the only one,” Waytz says. (Interestingly, when people are given a financial incentive to reflect on views opposed to their own, they are even less biased in the judgments they make about the other side.) Waytz also suggests the current level of political discord may not last forever, at least when it comes to issues impacting people’s daily lives, because we are most susceptible to many cognitive biases when we process information only shallowly. “We know that people process information more deeply when there’s potential for them to lose something,” he says. But bridging the gap between two opposing views of reality might require deeper engagement with a more diverse set of data and news sources beyond what Twitter and Facebook can offer. “The biggest danger isn’t actually fake news—it’s tribalism,” Waytz says. “Depolarization only occurs when someone has the courage to speak out against their tribe.”