According to the most recent CNN/ORC poll, billionaire Donald J. Trump is leading the field of Republican candidates for president with 41 percent support. Texas Senator Ted Cruz is a little more than 20 points behind. This, despite the fact that when it comes to the facts, Mr. Trump has the most appalling record of all candidates, Republican and Democratic.
Of 83 statements checked by Politifact (Jan. 24), 7 percent were rated True or Mostly True, 16 percent Half-True, 57 percent False or Mostly False, and a whopping 20 percent were so false that Politifact rated them Pants-on-Fire.
If 77 percent of what comes out of Trump’s mouth on the stump is patently false, why do supporters continue to believe he’s trustworthy enough for the highest office in the land? While that question clearly has many facets, the purpose of this series is to attempt to answer the key question raised in the title: Why do people ignore facts?
In my 2011 book, Shameless, Jamie O’Boyle, senior analyst for The Center for Cultural Studies and Analysis writes, “The human brain is a meaning-seeking device. Although Western culture has a strong bias toward the importance of conscious ‘rational’ thought, over 90% of our decisions are made at an unconscious level. Thanks to brain imaging, we now know that, when the brain inputs data, the emotional centers light up first (what does this mean to me?), followed by the logic centers (what do I do with it?). To a very large degree, this means that ‘facts’ are what people use to validate decisions already made at an unconscious level.
“One of the outcomes of this process is a confirmation bias – the tendency of our brain to easily accept information compatible with what we already know and – more importantly – minimize information that contradicts what we already know, even if what we ‘know’ isn’t true!
“The unconscious weighing of information is one of the reasons it is so difficult to change people’s minds using logic. The information goes in, but the importance the brain allots to each bit minimizes the effect of negative data while weighting more heavily the bits that already fit people’s preconceptions and worldview.
“This is a principal reason,” O’Boyle concludes, “why people don’t recall that commentators have given them information that was proven to be false. Their unconscious brain simply diminished its importance in favor of some other bit of information, and even the little that did get through faded rapidly from memory.”
Newsweek magazine’s science editor and author, Sharon Begley, writes, “Women are bad drivers, Saddam plotted 9/11, Obama was not born in America, and Iraq had weapons of mass destruction: to believe any of these requires suspending some of our critical-thinking faculties and succumbing instead to the kind of irrationality that drives the logically minded crazy.
“It helps, for instance, to use confirmation bias (seeing and recalling only evidence that supports your beliefs, so you can recount examples of women driving 40mph in the fast lane). It also helps not to test your beliefs against empirical data (where, exactly, are the WMD, after seven years of U.S. forces crawling all over Iraq?); not to subject beliefs to the plausibility test (faking Obama’s birth certificate would require how widespread a conspiracy?); and to be guided by emotion (the loss of thousands of American lives in Iraq feels more justified if we are avenging 9/11).”
In a New York Times op-ed about the myths surrounding the newly passed health care bill, political scientist and health policy researcher Brendan Nyhan at the University of Michigan writes, “Jason Reifler, a political scientist at Georgia State, and I conducted a series of experiments in which participants read mock news articles with misleading statements by a politician. Some were randomly assigned a version of the article that also contained information correcting the misleading statement.
“Our results indicate that this sort of journalistic fact-checking often fails to reduce misperceptions among ideological or partisan voters. In some cases, we found that corrections can even make misperceptions worse. For example, in one experiment we found that the proportion of conservatives who believed that President George W. Bush’s tax cuts actually increased federal revenue grew from 36 percent to 67 percent when they were provided with evidence against this claim. People seem to argue so vehemently against the corrective information that they end up strengthening the misperception in their own minds.
“We’ve seen this happen already with Sarah Palin’s claim that her parents and baby would ‘have to stand in front of Obama’s death panel.’ After this claim was widely discredited in the press, some conservative pundits retreated to claims that future rationing of health care would amount to ‘de facto death panels.’ ”
Delivering a commencement address at Yale University, President Kennedy said, “The great enemy of the truth is very often not the lie – deliberate, contrived and dishonest – but the myth – persistent, persuasive and unrealistic. Belief in myths allows the comfort of opinion without the discomfort of thought.”
“Confirmation bias can be modified,” O’Boyle says, “if you slow down the process.”
Coming Friday: What does that mean?