Homo economicus is a myth. This alleged “rational man” is as non-existent as the Loch Ness Monster, Bigfoot and (perhaps) moderate Republicans. Yet the idea that we’re essentially rational creatures is a very seductive myth, especially when we apply the concept to ourselves (few have ever lost money preying on another’s ego). In fact, we tend to think that we’re almost superhuman in our ability to invoke reason to our advantage.
Think of Hamlet (Act II, scene 2), for example, with Sir Kenneth Branagh as the melancholy Dane.
“What a piece of work is a man! How noble in reason! how infinite in faculty! in form, in moving, how express and admirable! in action how like an angel! in apprehension how like a god! the beauty of the world! the paragon of animals!”
Of course, there are some very good reasons to stand awestruck at what human reason can accomplish, particularly in the areas of science, technology and engineering. But you might also recall what became of Hamlet. We are much less rational than we assume.
Noah Smith (@Noahpinion on Twitter) made an interesting assertion yesterday about the purpose of argument. Smith began by noting Boston University economist Laurence Kotlikoff’s op-ed in Forbes, in which Kotlikoff acts as a concern troll toward New York Times columnist (and noted economist himself) Paul Krugman because Krugman allegedly called Congressman Paul Ryan stupid. To be clear, Krugman’s primary point was not that Ryan is stupid, but that he is crooked, especially as it pertains to his budget proposals. Smith uses this context to examine arguments in general, and he makes an excellent point.
[A]s a society, we use arguments the wrong way. We tend to treat arguments like debate competitions — two people argue in front of a crowd, and whoever wins gets the love and adoration of the crowd, and whoever loses goes home defeated and shamed. I guess that’s better than seeing arguments as threats of physical violence, but I still prefer the idea of arguing as a way to learn, to bounce ideas off of other people. Proving you’re smart is a pointless endeavor (unless you’re looking for a job), and is an example of what Stanford University psychologist Carol Dweck calls a “fixed mindset.” As the band Sparks once sang, “Everybody’s stupid — that’s for sure” [even though nobody wants to be called stupid]. What matters is going in the right direction — becoming less stupid, little by little.
But I think Smith’s ideal isn’t all that practical. To begin with, as Megan McArdle emphasizes, by calling one who disagrees with you stupid (even implicitly) “you have guaranteed that no one who disagrees with you will hear a word that you are saying.” Thus “calling people stupid is simply a performance for the fellow travelers in your audience” as well as a means of asserting superiority.
My sense is that the key element to this discussion is that most partisans see “their side” as not just true, but obviously true. It’s a by-product of bias blindness, or selective perception. We tend to see bias in others but not in ourselves. Therefore, our strongly held positions aren’t really debatable — they’re objectively and obviously true. After all, if we didn’t think our positions were true, we wouldn’t hold them. And (our thinking goes) since they are objectively true, anyone who makes an honest effort should be able to ascertain that truth. Our opponents are thus without excuse.
If you were wrong about something important, how quickly would you want to know and how quickly would you want to do something about it? Unfortunately, the answer isn’t nearly as obvious as we’d like to think.
Mark Twain sagely noted that a lie can travel halfway around the world while the truth is putting on its shoes. That’s because the truth is so much messier. Lies are created to be believable. They cater to our prejudices, whims, desires and hopes even when the truth cannot. Lies offer a good story when the truth does not. They are plausible when the truth is not. We often resist and even deny the truth. It is inherently unwieldy. It requires a careful sifting and analysis of facts in order to be discerned — we want deduction but are limited to induction most of the time. The truth is simply very hard to handle.
Of course, if we’re talking about relatively trivial matters (perhaps the distance from the earth to the moon) or about something we’re predisposed to believe anyway, we adjust our beliefs quite readily. But when truth doesn’t fit with what is important to us — when it matters — our perception of it gets caught up in who we perceive ourselves to be and in our vested interests. In those instances, attacking false information with data and related evidence often backfires, having the opposite of the desired effect. We like to think that straightforward education overcomes falsehoods, but things aren’t nearly that simple. This horrifying phenomenon — the backfire effect — was demonstrated once again recently in a study of the responses of parents to various forms of reporting that vaccines are not dangerous.
Our biases make it really hard to see things clearly.
Ezra Klein (formerly of The Washington Post) has a new venture (Vox) dedicated to what he calls “explanatory journalism” and which offers consistently progressive “explanations” for various policies by a talented but ideologically pure staff. Klein’s big introductory think piece cites research (already familiar to regular readers here) showing that people understand the world in ways that suit their preexisting beliefs and ideological commitments. Thus in controlled experiments both conservatives and liberals systematically misread the facts in a way that confirms their biases.
Interestingly, if unsurprisingly, while Klein concedes the universality of the problem in theory, all of his examples point out the biased stupidity of his political opponents. Paul Krugman – a terrific economist but an often insufferable progressive shill – sees Klein’s bid and raises it, exhibiting classic bias blindness: “the lived experience is that this effect is not, in fact, symmetric between liberals and conservatives.” In other words, his “lived experience” trumps the research evidence (science at work!). In Krugman’s view, conservatives are simply much stupider than liberals because reality skews liberal. He even goes so far as to deny that there are examples where liberals engage in the “overwhelming rejection of something that shouldn’t even be in dispute.” If what is being expressed is perceived to be the unvarnished truth, bias can’t be part of the equation.
Yale’s Dan Kahan, who was Klein’s primary interviewee in the referenced piece and an author of much of the relevant research, found Krugman’s view “amazingly funny,” in part because the research is so clear. Biased reasoning is in fact ideologically symmetrical.
As regular readers are all too well aware, I am committed to data-driven analysis and investing. We’re suckers for stories, of course, and ideological through and through, but the goal is to make sure that our investment decisions are based on real, quantitative evidence (at least to the extent possible).
That’s easier said than done, of course. We are prone to all sorts of cognitive and behavioral biases — perhaps most prominently confirmation bias — all of which threaten our analysis. We are also highly susceptible to bias blindness, the inability to see our own biases even when others’ are crystal clear. And now comes further evidence that our reasoning abilities are even worse than we thought.