"Why Debunking Myths Can Backfire"

An excellent post, courtesy of Mark Thoma at Economist’s View, from FactCheck.org, on why its efforts to correct the record are too often counterproductive. In essence, if ideas are falsely linked (say “Saddam Hussein” and “Al Qaeda”), further discussion preserves the false association.

The entire post is very much worth reading; it’s a primer on how the Big Lie works. And it offers at least one prescription: it is important to strike out against an inaccuracy as fast as possible before the pattern gels in the public psyche.

But otherwise, the article gives an image of people who for the most part are hopelessly suggestible. And while that is no doubt true, the article fails to note that the degree of gullibility varies by culture. In my limited frame of reference, it correlates with respect for authority. Australians, who are exceedingly egalitarian, are highly critical readers of the press and often question what they are told. Americans, despite our individualistic self-image, are deferential towards people in power and too often simply consume media information. Japanese (except the highly placed insiders who know better) are even more trusting of their information sources.

From FactCheck.org:

Have you heard about how Al Gore claimed to have invented the Internet? What about how Iraq was responsible for the attacks on the World Trade Center? Or maybe the one about how George W. Bush has the lowest IQ of any U.S. president ever? Chances are pretty good that you might even believe one (or more) of these claims. And yet all three are false. At FactCheck.org our stock in trade is debunking these sorts of false or misleading political claims, so when the Washington Post told us that we might just be making things worse, it really made us stop and think.

A Sept. 4 article in the Post discussed several recent studies that all seemed to point to the same conclusion: Debunking myths can backfire because people tend to remember the myth but forget what the debunker said about it. As Hebrew University psychologist Ruth Mayo explained to the Post, “If you think 9/11 and Iraq, this is your association, this is what comes in your mind. Even if you say it is not true, you will eventually have this connection with Saddam Hussein and 9/11.” That leaves myth busters like us with a quandary: Could we, by exposing political malarkey, just be cementing it in voters’ minds? Are we contributing to the problem we hope to solve?

Possibly. Yet we think that what we do is still necessary. And we think the facts back us up.

The Post story wasn’t all that surprising to those who follow the findings of cognitive science research, which tells us much of our thinking happens just below the level of consciousness. The more times we hear two particular bits of information associated, for example, the more likely it is that we’ll recall those bits of information. This is how we learn multiplication tables – and why we still know the Big Mac jingle.

Our brains also take some surprising shortcuts. In a study published in the Journal of Personality and Social Psychology, Virginia Tech psychologist Kimberlee Weaver shows that the more easily we recall something the more likely we are to think of it as being true. It’s a useful shortcut since, typically, easily recalled information really is true. But combine this rule with the brain’s tendency to better remember bits of information that are repeated frequently, and we can run into trouble: We’re likely to believe anything we hear repeated frequently enough. At FactCheck.org we’ve noted how political spin-masters exploit this tendency ruthlessly, repeating dubious or false claims endlessly until, in the minds of many voters, they become true. Making matters worse, a study by Hebrew University’s Mayo shows that people often forget “denial tags.” Thus many people who hear the phrase “Iraq does not possess WMDs” will remember “Iraq” and “possess WMDs” while forgetting the “does not” part.

The counter to this requires an understanding of how it is that the brain forms beliefs.

In 1641, French philosopher René Descartes suggested that the act of understanding an idea comes first; we accept the idea only after evaluating whether or not it rings true. Thirty-six years later, the Dutch philosopher Baruch de Spinoza offered a very different account of belief formation. Spinoza proposed that understanding and believing happen simultaneously. We might come to reject something we held to be true after considering it more carefully, but belief happens prior to the examination. On Spinoza’s model, the brain forms beliefs automatically. Rejecting a belief requires a conscious act.

Unfortunately, not everyone bothers to examine the ideas they encounter. On the Cartesian model, that failure results in neither belief nor disbelief. But on the Spinozan model we end up with a lot of unexamined (and often false) convictions.

One might rightly wonder how a 17th-century philosophical dispute could possibly be relevant to modern myth-busting. Interestingly, though, Harvard psychologist Daniel T. Gilbert designed a series of experiments aimed specifically at determining whether Descartes or Spinoza got it right. Gilbert’s verdict: Spinoza is the winner. People who fail to carry through the evaluation process are likely to believe whatever statements they read. Gilbert concludes that “[p]eople do have the power to assent, to reject, and to suspend their judgment, but only after they have believed the information to which they have been exposed.”

Gilbert’s studies show that, initially at least, we do believe everything we hear. But it’s equally obvious that we reject many of those beliefs, sometimes very quickly and other times only after considerable work. We may not be skeptical by nature, but we can nonetheless learn to be skeptical. Iowa State’s Gary Wells has shown that social interaction with those who have correct information is often sufficient to counter false views. Indeed, a study published in the Journal of Applied Psychology by the University of Southern California’s Peter Kim shows that meeting a charge (regardless of its truth or falsity) with silence increases the chances that others will believe the claim. Giving false claims a free pass, in other words, is more likely to result in false beliefs (a notion with which 2004 presidential candidate John Kerry, who didn’t immediately respond to accusations by a group called Swift Boat Veterans for Truth about his Vietnam record, is all too familiar).

So, yes, a big ad budget often trumps the truth, but that doesn’t mean we should go slumping off in existential despair. You see, the Spinozan model shows that we will believe whatever we hear only if the process of evaluating those beliefs is somehow short-circuited. Humans are not helpless automatons in the face of massive propaganda. We may initially believe whatever we hear, but we are fully capable of evaluating and rejecting beliefs that turn out not to be accurate. Our brains don’t do this naturally; maintaining a healthy skeptical attitude requires some conscious effort on our part. It also requires a basic understanding of logic – and it requires accurate information. That’s where this Web site comes in.

If busting myths has some bad consequences, allowing false information to flow unchecked is far worse. Facts are essential if we are to overcome our brain’s tendency to believe everything it hears. As a species, we’re still pretty new to that whole process. Aristotle invented logic just 2,500 years ago – a mere blink of the eye when compared with the 200,000 years we Homo sapiens relied on our brain’s reflex responses to avoid being eaten by lions. We still have a long way to go. Throw in a tsunami of ads and Internet bluster and the path gets even harder, which is why we’re delighted to find new allies at PolitiFact.com and the Washington Post’s FactChecker. We’ll continue to bring you the facts. And you can continue to use them wisely.

Print Friendly, PDF & Email


  1. minka

    I think it’s not enough to debunk false claims. I think it’s important to create a story from the debunking: a story of being lied to.

    That is the real story, after all.

    Also: repeated lying should be used to characterize the liar as a liar.

    The right wing so often operates through character assassination. They need to be held accountable and undercut by having (1) their lies identified and (2) themselves identified as liars.

    The latter point is important, because if you can plant that characterization in people’s minds, their subsequent lies will be less effective.

    Liberals are too polite. Our national welfare is on the line here.

  2. Anonymous

    “I think the people who respond don’t do so anywhere near forcefully enough.” I think that one should also consider that maybe people who want to respond forcefully do not have a medium to convey their views.

    Ven

  3. Lune

    Fascinating post, Yves.

    To add to the analysis, I think looking at the writings of Thomas Kuhn (who coined the term “paradigm shift”) can help make sense of how two people presented with the same set of facts can fervently believe diametrically opposite truths (and indeed, believe that the facts help confirm both truths).

    Kuhn’s hypothesis in a nutshell (apologies if you already know this) was that everyone has a set of beliefs about the way the world works, and this constitutes their paradigm. Within that paradigm, no single observation will (in the mind of the person holding the paradigm) invalidate it, no matter how anomalous or contradictory it is. Rather, those troublesome facts are either accounted for by some tortured logic or simply cast off as observation errors or some other minor level of noise.

    But gradually anomalies accrue, and suddenly the old paradigm is completely discarded and, eventually, a new paradigm is created that accounts for the anomalies.

    The key point is that scientific progress isn’t a series of gradual improvements as new observations are used to refine and confirm or invalidate theories by the rigorous, impartial, and rational application of the scientific method. Rather, people can rationalize any number of illogical or contradictory facts until finally there’s a sudden jump to a completely new worldview (hence the paradigm shift).

    But to take a simplified example: say the stock market goes down today. That’s a fact. Analysts who were urging people to buy yesterday are probably still doing the same, and likely even more strenuously (based on their theory that the assets are now even more undervalued after the decline), while the bears, looking at the same market data, say it’s an even better time to sell (since the market decline has confirmed their view that assets are overpriced).

    In other words, the same fact has led to further confirmation of two completely contradictory worldviews (assets are underpriced vs assets are overpriced) in the minds of the people who believe in each respective worldview.

    Similarly, which paradigm you believe in leads you to attribute more importance to certain facts and less importance to others (hence bears tend to dismiss the fact that the stock market continues to reach new highs, corporate profits are still up, etc., while bulls tend to dismiss the fact that house prices continue to decline, the dollar is still falling, etc.).

    This is perhaps the central debate in the philosophy of science these days (between the paradigm shifts of Thomas Kuhn and the critical rationalism of Karl Popper). Anyway, as an old philosophy major, I’m always amazed at how useful Kuhn’s theory of paradigm shifts is to understanding what (to me) seems irrational behavior (although in fairness, I’d have to accept that my behavior likely appears “irrational” in someone else’s paradigm as well).

  4. Yves Smith

    Ven,

    Point well taken, particularly given the attacks on MoveOn.org.

    Lune,

    I am familiar with Kuhn, but that was a nice synopsis.

    If you can dig up a copy, you might enjoy Jim Grant’s The Trouble With Prosperity. It’s a good read, and in passing, he goes through some older investment paradigms that he shows were deeply held yet have been replaced with newer world views.
