When making his defense of some British soldiers during the Boston Massacre trials in December of 1770, John Adams (later the second President of the United States) offered a famous insight. “Facts are stubborn things; and whatever may be our wishes, our inclinations, or the dictates of our passion, they cannot alter the state of facts and evidence.” Legal Papers of John Adams, 3:269. In a similar vein, Sen. Daniel Patrick Moynihan once said that “[e]veryone is entitled to his own opinion, but not to his own facts.”
I have often warned about our proclivity for stories and our preference for them to the exclusion of data (for example, here, here and here). Because stories are so powerful, we want the facts to be neatly packaged into a compelling narrative. Take a look at John Boswell's delightful send-up of this technique in the TED context below.
When I was a first-year law student at Duke many years ago, my Civil Procedure professor was the delightfully named J. Francis Paschal. Professor Paschal seemed to like to portray himself as a bit of a good ol’ boy, with a protruding gut, truly dreadful sports jackets, hair slicked and parted just off-center, and a drawl as thick as molasses on a cold day (if not nearly so sweet). That image could not mask a keen mind and a sharp wit. Nor did it hide his erudition — in addition to his credentials in the law, Professor Paschal had a Princeton Ph.D. too.
The good professor led his classes using the Socratic conventions of the day. A student was called upon to answer a series of penetrating and perplexing questions supposedly designed to ferret out the nuances of some legal principle or another but which, in reality, served to demonstrate to a class full of bright and full-of-themselves college graduates that they were out of the minors and into the intellectual big leagues. If we were going to compete at that level, we needed to up our collective game considerably.
One day fairly early in the first semester Professor Paschal called on a woman in the row ahead of me (whom I shall kindly refer to — using a pseudonym since she is now a Deputy Attorney General — as “Frieda Clancy”) and asked a typically impossible question. Since Frieda was a friend, I happened to know that her extremely difficult predicament was actually utterly impossible because she was not prepared for class. In fact, it wasn’t just that she wasn’t fully prepared (meaning that she had read the required case, all the cases cited therein, the case comments, casebook notes and citations, relevant hornbook and law review materials and anything else we could think of that might be relevant). She wasn’t prepared at all. She hadn’t even read the case at issue.
Instead, the legacy of that brisk late autumn afternoon contest rests upon two seemingly unrelated matters: allegations of intentionally dirty play by Dartmouth and our inability to perceive reality with any degree of objective accuracy, especially where we have a major emotional investment. Based upon various sources, the primary narrative from the game is that Dartmouth set out to injure Princeton players – particularly Kazmaier – and that after the Princeton star was injured and forced from the game in the second quarter, matters turned increasingly fractious. But that wasn’t the only proffered narrative.
As I have often argued, we like to think that we see the world as it truly is. Instead, we tend to see the world as we really are. Sadly, that reality is both literal and figurative.
As Nature has reported, if you take a look at the gif below, you’ll discover some really weird stuff about yourself and your brain.
Source: Prof. Michael Bach, University of Freiburg
Look at any of the yellow dots as the figure moves; it remains present and stationary. If you concentrate on all three yellow dots, they remain in place too. But if you concentrate on the central green dot instead, one or more of the yellow dots will seem to disappear and then reappear intermittently even though they are really there the whole time. Your brain simply doesn’t register their presence sometimes. This optical illusion, called motion-induced blindness, applies to nearly everyone.
Over a career that has spanned four decades so far, concertgoers have routinely paid a lot of money to hear Phil Smith play the trumpet. The long-time principal trumpet of the New York Philharmonic retired this summer after 36 years in the orchestra. In his first professional audition, while still a student, he won a place in the Chicago Symphony. While still in his 20s, Phil came to New York following just his second professional audition. According to The New Yorker, “For the past thirty-six years, Smith has presided over orchestral trumpet playing, with a resonant, clarion sound and a reputation for never missing a note.” He has been, inarguably, one of the world’s great performers.
Noah Smith (@Noahpinion on Twitter) made an interesting assertion yesterday about the purpose of argument. Smith began by noting Boston University economist Laurence Kotlikoff’s op-ed in Forbes in which he acts as a concern troll toward New York Times columnist (and noted economist himself) Paul Krugman because Krugman allegedly called Congressman Paul Ryan stupid. To be clear, Krugman’s primary point was not that Ryan is stupid, but that he is crooked, especially as it pertains to his budget proposals. Smith uses this context for looking at arguments in general, and he makes an excellent point.
[A]s a society, we use arguments the wrong way. We tend to treat arguments like debate competitions — two people argue in front of a crowd, and whoever wins gets the love and adoration of the crowd, and whoever loses goes home defeated and shamed. I guess that’s better than seeing arguments as threats of physical violence, but I still prefer the idea of arguing as a way to learn, to bounce ideas off of other people. Proving you’re smart is a pointless endeavor (unless you’re looking for a job), and is an example of what Stanford University psychologist Carol Dweck calls a “fixed mindset.” As the band Sparks once sang, “Everybody’s stupid — that’s for sure” [even though nobody wants to be called stupid]. What matters is going in the right direction — becoming less stupid, little by little.
But I think Smith’s ideal isn’t all that practical. To begin with, as Megan McArdle emphasizes, by calling one who disagrees with you stupid (even implicitly) “you have guaranteed that no one who disagrees with you will hear a word that you are saying.” Thus “calling people stupid is simply a performance for the fellow travelers in your audience” as well as a means of asserting superiority.
My sense is that the key element to this discussion is that most partisans see “their side” as not just true, but obviously true. It’s a by-product of bias blindness, or selective perception. We tend to see bias in others but not in ourselves. Therefore, our strongly held positions aren’t really debatable — they’re objectively and obviously true. After all, if we didn’t think our positions were true, we wouldn’t hold them. And (our thinking goes) since they are objectively true, anyone who makes the effort to try should be able to ascertain that truth. Our opponents are thus without excuse.
If you were wrong about something important, how quickly would you want to know and how quickly would you want to do something about it? Unfortunately, the answer isn’t nearly as obvious as we’d like to think.
Mark Twain sagely noted that a lie can travel halfway around the world while the truth is putting on its shoes. That’s because the truth is so much messier. Lies are created to be believable. They cater to our prejudices, whims, desires and hopes even when the truth cannot. Lies offer a good story when the truth does not. They are plausible when the truth is not. We often resist and even deny the truth. It is inherently unwieldy. It requires a careful sifting and analysis of facts in order to be discerned — we want deduction but are limited to induction most of the time. The truth is simply very hard to handle.
Of course, if we’re talking about relatively trivial matters (perhaps the distance from the earth to the moon) or about something we’re predisposed to believe anyway, we adjust our beliefs quite readily. But when truth doesn’t fit with what is important to us — when it matters — our perception of it gets caught up in who we perceive ourselves to be and in our vested interests. In those instances, attacking false information with data and related evidence often backfires, having the opposite of the desired effect. We like to think that straightforward education overcomes falsehoods, but things aren’t nearly that simple. This horrifying phenomenon — the backfire effect — was demonstrated once again recently in a study of the responses of parents to various forms of reporting that vaccines are not dangerous.
It seems to me, after a good deal of thought, reflection and research, that we have so much difficulty dealing with behavioral and cognitive bias in large measure because we build our belief structures precisely backwards. There’s nothing revelatory in that conclusion, obviously, because it is exactly what confirmation bias is all about. We like to think that we make (at least relatively) objective decisions based upon the best available evidence. But the truth is that we are ideological through-and-through and thus tend to make our “decisions” first — based upon our preconceived notions — and then backfill to add some supportive reasoning (which need not be very good to be seen as convincing).
I have been working on an infographic to try to illustrate the issue* and have come up with the following.
The goal should be to build from the ground up — beginning with facts, working to conclusions and so on. Beliefs are interpretations of one’s conclusions about the facts. If more fervently held, they rise to the level of conviction and perhaps to the highest pyramid level, whereby one makes a major commitment to a particular cause, approach or ideology. These commitments are the things by which we tend to be defined.
Our biases make it really hard to see things clearly.
Ezra Klein (formerly of The Washington Post) has a new venture (Vox) dedicated to what he calls “explanatory journalism” and which offers consistently progressive “explanations” for various policies by a talented but ideologically pure staff. Klein’s big introductory think piece cites research (already familiar to regular readers here) showing that people understand the world in ways that suit their preexisting beliefs and ideological commitments. Thus in controlled experiments both conservatives and liberals systematically misread the facts in a way that confirms their biases.
Interestingly, if unsurprisingly, while Klein concedes the universality of the problem in theory, all of his examples point out the biased stupidity of his political opponents. Paul Krugman – a terrific economist but an often insufferable progressive shill – sees Klein’s bid and ups the ante, exhibiting classic bias blindness: “the lived experience is that this effect is not, in fact, symmetric between liberals and conservatives.” In other words, his “lived experience” trumps the research evidence (science at work!). In Krugman’s view, conservatives are simply much stupider than liberals because reality skews liberal. He even goes so far as to deny that there are examples where liberals engage in the “overwhelming rejection of something that shouldn’t even be in dispute.” If what is being expressed is perceived to be the unvarnished truth, bias can’t be part of the equation.
Terrance Odean is the Rudd Family Foundation Professor of Finance at the Haas School of Business at the University of California, Berkeley. He is a member of the Journal of Investment Consulting editorial advisory board, of the Russell Sage Behavioral Economics Roundtable, and of the WU Gutmann Center Academic Advisory Board at the Vienna University of Economics and Business. He has been an editor and an associate editor of the Review of Financial Studies, an associate editor of the Journal of Finance, a co-editor of a special issue of Management Science, an associate editor at the Journal of Behavioral Finance, a director of UC Berkeley’s Experimental Social Science Laboratory, a visiting professor at the University of Stavanger, Norway, and the Willis H. Booth Professor of Finance and Banking and Chair of the Finance Group at the Haas School of Business. As an undergraduate at Berkeley, Odean studied Judgment and Decision Making with the 2002 Nobel Laureate in Economics, Daniel Kahneman. This led to his current research focus on how psychologically motivated decisions affect investor welfare and securities prices.
Today I ask (in bold) and Terry answers what I hope are Five Good Questions as part of my longstanding series by that name (see links below).