In my October installment of “Brain Science,” I described what cognitive psychology has shown about why misinformation is “sticky.” I delved into why many people still believe something like the erroneous claim that weapons of mass destruction were found in Iraq after the 2003 invasion, even after hearing that claim disproved repeatedly. Once we think something is a fact, it’s hard to replace it with a new fact.

The reasons I cited then were numerous. We’re wired to assume, initially, that an assertion is true, because questioning it is cognitively more demanding than accepting it. We’re also inclined toward “motivated reasoning,” a form of cognitive bias whereby we’re more likely to accept claims that conform to our existing beliefs. In the case of Iraq’s supposed WMDs, that belief might be that no evil is too great for Saddam Hussein or that the US invasion (sold to the public through the WMD claim) was justified. To someone who holds either or both beliefs, acknowledging the absence of WMDs produces a feeling of cognitive uneasiness and even threat, because the foundation of the belief (Saddam: evil; invasion: justified) is now shaky. As I wrote in the earlier column, “Misinformation is sticky because evicting it from our belief system requires cognitive effort,” and if it “fits with our worldview, then…the debunking clashes with that view” and makes us uncomfortable.

All of these factors are still in play. But so is a growing phenomenon: call it cognitive tribalism.

We have many ways to identify ourselves as part of a community that gives us a sense of belonging and purpose. Nationality. Religion. Gender. Even sports-team allegiance. We all want, and even need, to feel part of something larger than ourselves. The many reasons for our need to belong are worthy of a future column, but one of the key drivers is our desire to lessen the sting of mortality: I will die, but the communities of which I am a part and to which I have contributed will live on.
“We are relational beings,” said Drew Pate, a psychiatrist at Maryland-based Sheppard Pratt Health System. “Finding ways to be connected is something we seek out.” In a time of hyperpartisanship in the United States, people are using “ideology or politics as a means to connection and a way to self-identify” just as much as they use ethnicity, nationality, or other traditional affiliations, according to Pate.

Such affiliations have enormous power to shape our thinking and beliefs. Consider a sports-bar debate over who is the greatest quarterback ever: Peyton Manning or Tom Brady? New Englanders who consider their Patriots allegiance a core part of their identity will go to their graves insisting it’s Brady—even though Manning won three of the five playoff games in which the two faced off. “You may not totally believe something,” said Pate, “but you feel you have to agree with it to affirm your sense of identity.”

Some of us are more prone to this than others. In a 2017 study in the journal Applied Cognitive Psychology, Jan-Willem van Prooijen of Vrije Universiteit Amsterdam queried some 5,000 people about whether they believed conspiracy theories, such as “there has been a free energy source for a long time, but the oil industry tries to keep this a secret,” or that astronauts “never really landed on the moon; everything was recorded in TV studios.” Those who had received more education were less likely to believe these claims than those who had received less. But the discrepancy he uncovered is not about intelligence. One of the strongest predictors of belief was feeling powerless, and that is associated with education: People who drop out of school are less likely to get jobs and live lives that make them feel empowered. When you feel that forces beyond your control are buffeting you, you can be driven to believe even conclusively debunked conspiracies, van Prooijen concluded.
But feeling powerless does something else, too: It strengthens one’s affiliation with a group. Such chosen affiliations are independently correlated with believing debunked claims. In a 2010 study in the Journal of Experimental Psychology, for instance, researchers led by David Weise, then at the University of Arizona, found that the more people focused on their group identity—or “social category”—the more likely they were to accept the validity of an untruth that painted those outside the group in a bad light.

Which brings us to my unicorn poll responder. For years she gleefully told pollsters that yes, she believes Obama is Muslim and that he wasn’t born in the United States. On the first, according to a 2010 Newsweek poll, she was in the company of about one-quarter of US adults. On the second, 72% of Republicans doubted or flat-out disagreed with the statement that Obama was born in the US, according to a 2016 NBC News poll. But when I asked if she really and truly believed either, she hedged. She “wanted to show them” (the pollsters) that there were many who opposed Obama’s policies (she disliked the Affordable Care Act), and answering as she did to questions about his faith and birthplace seemed like a good way to do that. It was also, she said, a way for her to feel part of the bloc that opposed him, cementing an identity that had become important to her, just as Patriots fans will profess their eternal belief in Brady’s greatness even if they harbor doubts. The drive to belong to and align with a community can be stronger than the drive to report one’s beliefs truthfully.

Such cognitive tribalism is manifesting itself ever more strongly. Political divisions in the US and elsewhere are growing sharper, becoming what psychologists call salient, or top-of-mind, in a way that they weren’t a few generations ago. Salient categories of self-identification (Patriots fan) are exactly the kind that skew our perceptions and judgments.
Affiliations create filters that govern how people’s actions and assertions are perceived, and they make it hard for people to see eye to eye. Many liberals hold up Bill Clinton as a hero, while conservatives see him as a disgraced president who narrowly avoided being convicted by the Senate after his 1998 impeachment. And liberals are of course not immune to believing myths shared by members of the cognitive tribe they identify with: According to a 2007 poll, 40% of self-described liberals said the US government perpetrated the 9/11 attacks or allowed them to happen.

Likewise, cognitive tribalism is at work when supporters and non-supporters of Donald Trump see two very different people. If it seems incredible to some that supporters accept his assertion that 5 million people voted illegally or that his inauguration crowd dwarfed all others, it shouldn’t. “They see him as their voice,” Frank Luntz, a Republican consultant and pollster, told reporters last May. “And when their voice is shouted down, disrespected, or simply ignored, that is an attack on them, not just an attack on Trump.”

When one is attacked, one doubles down on a belief if that belief forms an important basis for self-identification. “Affect and cognition evolved together, and we don’t make any political decisions strictly cognitively,” said Emory University psychologist Drew Westen, who applies psychological science to politics. The more a politician’s supporters feel that their support defines them, and then perceive that he is being attacked, the more likely they are to succumb to the cognitive tribalism that leads people to agree with false, debunked statements, as long as doing so strengthens their sense of belonging and identity.
Brainware: Us vs. Them
The cognitive bias that leads us to view our own group more favorably than others’ has been discussed for over a century. Sociologist William Sumner, in 1906, wrote about the opposition in our minds between the “we-group” and the “others-groups.” In the “struggle for existence,” he postulated, a person tends to show favoritism toward their own group and, ultimately, “looks with contempt on outsiders.” His “ethnocentrism” has since broadened to “in-group bias,” the tendency to exaggerate the positive qualities of those we identify with and to ascribe negative traits to outsiders.

https://www.mindful.org/the-stickiness-of-misinformation/
https://www.mindful.org/beware-biased-brain/