If you were wrong about something important, how quickly would you want to know and how quickly would you want to do something about it? Unfortunately, the answer isn’t nearly as obvious as we’d like to think.
Mark Twain sagely noted that a lie can travel halfway around the world while the truth is putting on its shoes. That’s because the truth is so much messier. Lies are created to be believable. They cater to our prejudices, whims, desires and hopes even when the truth cannot. Lies offer a good story when the truth does not. They are plausible when the truth is not. We often resist and even deny the truth. It is inherently unwieldy. It requires a careful sifting and analysis of facts to be discerned; we want deduction but are limited to induction most of the time. The truth is simply very hard to handle.
Of course, if we’re talking about relatively trivial matters (perhaps the distance from the earth to the moon) or about something we’re predisposed to believe anyway, we adjust our beliefs quite readily. But when truth doesn’t fit with what is important to us — when it matters — our perception of it gets caught up in who we perceive ourselves to be and in our vested interests. In those instances, attacking false information with data and related evidence often backfires, having the opposite of the desired effect. We like to think that straightforward education overcomes falsehoods, but things aren’t nearly that simple. This horrifying phenomenon — the backfire effect — was demonstrated once again recently in a study of the responses of parents to various forms of reporting that vaccines are not dangerous.
In separate trials, parents were provided with materials from the Centers for Disease Control and Prevention (a) debunking misinformation regarding a link between autism and vaccines for measles, mumps and rubella; (b) describing the symptoms and adverse events associated with measles, mumps and rubella; (c) providing a mother’s narrative of her child’s hospitalization caused by measles; or (d) showing powerful images of children suffering from measles, mumps and rubella. The idea was to see whether facts, science, stories or emotions are better at helping people to change their minds. However, none of these approaches increased the willingness of parents to vaccinate their children.
In fact, the material that refuted the vaccine-autism link successfully reduced the false belief that vaccines cause autism, yet it actually reduced the intent to vaccinate among parents with the strongest anti-vaccine views. Even worse, the images of children with measles, mumps and rubella and the mother’s narrative about her hospitalized child both had the unintended effect of increasing false beliefs about vaccine side effects.
In a related case, an internet rumor about flesh-eating bananas that was debunked by the CDC grew stronger after the debunking and even began to be attributed to the CDC. The backfire effect is even more pronounced in older people. In one study, for example, the more often older adults were told that a claim was false, the more likely they were to remember it as true later.
Simple education is not the answer to false belief. If you want to have any hope of refuting false beliefs, about the markets or anything else, consider the following (and check out The Debunking Handbook). For starters, don’t repeat the false belief in your debunking if you can help it. The refutation must focus on core facts rather than the myth so that the misinformation doesn’t become more familiar and thus more lasting. Moreover, simpler is better. Use easy-to-read materials and limit the content to your two or three best arguments. A simple and pleasing lie is more cognitively attractive than a complicated correction. Badmouthing the other side (and, implicitly, the reader) doesn’t help. On the other hand, providing a strong, simple and thus memorable soundbite helps a lot.
When people read a refutation that conflicts with their beliefs, they readily seize on any ambiguity to construct their own alternative interpretations. Therefore, clarity counts. Accordingly, be as straightforward as you can be and use graphics whenever possible.
Any mention of a false belief should be preceded by explicit warnings to notify the reader that the forthcoming information is false. In addition, the refutation should include an alternative explanation that accounts for important qualities in the original misinformation. That’s because we inherently prefer a false model of reality to an incomplete or uncertain but more accurate model. Thus a defense attorney in a murder trial should seek to provide an alternative suspect, for example. Doing so greatly reduces the likelihood of a guilty verdict.
When “hot button” issues are involved, it’s important to frame the issue so as not to be seen as attacking the self-worth of the people being corrected. Otherwise, the facts won’t be heard. In one study, for example, Republicans were far more likely to accept an otherwise identical charge when it was labeled a “carbon offset” rather than a “tax,” whereas the wording made little difference to Democrats, whose values were not challenged by the word “tax” the way Republicans’ were.
The investment world has a number of internecine battles that can rage hot; the active versus passive debate is simply the most obvious. Moreover, we live in a culture and work in an industry that doesn’t hold truth in the highest regard and dispenses precious little of it (spin is another matter). We (erroneously) like to think that the information deficit model of communication is true: explain people’s errors and they will self-correct.
As I so often say, we like to think that we see the world the way it really is. Instead, we see the world the way we really are. If we are to overcome our inherent biases and difficulties, it will take ongoing hard work, with no guarantee of success. Accordingly, our communication plans (and not just our beliefs) themselves need to be tentative and responsive to evidence that another approach or method works better. We need to experiment, to collect data about our communication efforts and to measure their results. As Yale’s Dan Kahan says, our approach needs to be evidence-based “all the way down.”
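To make “evidence-based all the way down” a bit more concrete, here is a minimal sketch of what measuring a communication experiment might look like. Everything in it is hypothetical: the two message formats, the reader counts and the response rates are invented for illustration, and the comparison uses nothing fancier than a two-proportion z-test.

```python
# Illustrative sketch only: the message formats and counts below are invented.
# Compares two hypothetical debunking formats by the share of recipients who
# act on the correction, using a standard two-proportion z-test.
from statistics import NormalDist

def two_proportion_z_test(successes_a, n_a, successes_b, n_b):
    """Two-sided z-test for the difference between two proportions."""
    p_a = successes_a / n_a
    p_b = successes_b / n_b
    pooled = (successes_a + successes_b) / (n_a + n_b)
    se = (pooled * (1 - pooled) * (1 / n_a + 1 / n_b)) ** 0.5
    z = (p_a - p_b) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return p_a, p_b, z, p_value

# Hypothetical results: a "facts only" refutation sent to 500 readers, 60 acted;
# a "myth then correction" version sent to 500 readers, 45 acted.
p_a, p_b, z, p_value = two_proportion_z_test(60, 500, 45, 500)
print(f"A: {p_a:.1%}  B: {p_b:.1%}  z = {z:.2f}  p = {p_value:.3f}")
```

The point is not the particular test but the habit: treat each communication approach as a hypothesis, gather the outcome data and let the results, rather than our intuitions, decide what we keep.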
We remain people who are broken and deeply flawed. Fixing what is broken and correcting our many errors is itself a highly tentative and flawed process that often backfires. Caveat emptor, therefore, all the way down.