Over a career that has so far spanned four decades, Phil Smith has given concertgoers ample reason to pay a lot of money to hear him play the trumpet. The longtime principal trumpet of the New York Philharmonic retired this summer after 36 years in the orchestra. In his first professional audition, while still a student, he won a place in the Chicago Symphony. While still in his 20s, he came to New York following just his second professional audition. According to The New Yorker, “For the past thirty-six years, Smith has presided over orchestral trumpet playing, with a resonant, clarion sound and a reputation for never missing a note.” He has been, inarguably, one of the world’s great performers. Continue reading
Noah Smith (@Noahpinion on Twitter) made an interesting assertion yesterday about the purpose of argument. Smith began by noting Boston University economist Laurence Kotlikoff’s op-ed in Forbes, in which Kotlikoff plays concern troll toward New York Times columnist (and noted economist himself) Paul Krugman because Krugman allegedly called Congressman Paul Ryan stupid. To be clear, Krugman’s primary point was not that Ryan is stupid but that he is crooked, especially as it pertains to his budget proposals. Smith uses this context to look at arguments in general, and he makes an excellent point.
[A]s a society, we use arguments the wrong way. We tend to treat arguments like debate competitions — two people argue in front of a crowd, and whoever wins gets the love and adoration of the crowd, and whoever loses goes home defeated and shamed. I guess that’s better than seeing arguments as threats of physical violence, but I still prefer the idea of arguing as a way to learn, to bounce ideas off of other people. Proving you’re smart is a pointless endeavor (unless you’re looking for a job), and is an example of what Stanford University psychologist Carol Dweck calls a “fixed mindset.” As the band Sparks once sang, “Everybody’s stupid — that’s for sure” [even though nobody wants to be called stupid]. What matters is going in the right direction — becoming less stupid, little by little.
But I think Smith’s ideal isn’t all that practical. To begin with, as Megan McArdle emphasizes, by calling one who disagrees with you stupid (even implicitly) “you have guaranteed that no one who disagrees with you will hear a word that you are saying.” Thus “calling people stupid is simply a performance for the fellow travelers in your audience” as well as a means of asserting superiority.
My sense is that the key element to this discussion is that most partisans see “their side” as not just true, but obviously true. It’s a by-product of bias blindness, or selective perception. We tend to see bias in others but not in ourselves. Therefore, our strongly held positions aren’t really debatable — they’re objectively and obviously true. After all, if we didn’t think our positions were true, we wouldn’t hold them. And (our thinking goes) since they are objectively true, anyone who makes an honest effort should be able to ascertain that truth. Our opponents are thus without excuse. Continue reading
If you were wrong about something important, how quickly would you want to know and how quickly would you want to do something about it? Unfortunately, the answer isn’t nearly as obvious as we’d like to think.
Mark Twain sagely noted that a lie can travel halfway around the world while the truth is putting on its shoes. That’s because the truth is so much messier. Lies are created to be believable. They cater to our prejudices, whims, desires and hopes even when the truth cannot. Lies offer a good story when the truth does not. They are plausible when the truth is not. We often resist and even deny the truth. It is inherently unwieldy. It requires a careful sifting and analysis of facts in order to be discerned — we want deduction but are limited to induction most of the time. The truth is simply very hard to handle.
Of course, if we’re talking about relatively trivial matters (perhaps the distance from the earth to the moon) or about something we’re predisposed to believe anyway, we adjust our beliefs quite readily. But when truth doesn’t fit with what is important to us — when it matters — our perception of it gets caught up in who we perceive ourselves to be and in our vested interests. In those instances, attacking false information with data and related evidence often backfires, having the opposite of the desired effect. We like to think that straightforward education overcomes falsehoods, but things aren’t nearly that simple. This horrifying phenomenon — the backfire effect — was demonstrated once again recently in a study of the responses of parents to various forms of reporting that vaccines are not dangerous. Continue reading
It seems to me, after a good deal of thought, reflection and research, that we have so much difficulty dealing with behavioral and cognitive bias in large measure because we build our belief structures precisely backwards. There’s nothing revelatory in that conclusion, obviously, because it is exactly what confirmation bias is all about. We like to think that we make (at least relatively) objective decisions based upon the best available evidence. But the truth is that we are ideological through-and-through and thus tend to make our “decisions” first — based upon our preconceived notions — and then backfill to add some supportive reasoning (which need not be very good to be seen as convincing).
I have been working on an infographic to try to illustrate the issue* and have come up with the following.
The goal should be to build from the ground up — beginning with facts, working to conclusions and so on. Beliefs are interpretations of one’s conclusions about the facts. If more fervently held, they rise to the level of conviction and perhaps to the highest pyramid level, at which one makes a major commitment to a particular cause, approach or ideology. These commitments are the things by which we tend to be defined. Continue reading
Ezra Klein (formerly of The Washington Post) has a new venture (Vox) dedicated to what he calls “explanatory journalism” and which offers consistently progressive “explanations” for various policies by a talented but ideologically pure staff. Klein’s big introductory think piece cites research (already familiar to regular readers here) showing that people understand the world in ways that suit their preexisting beliefs and ideological commitments. Thus in controlled experiments both conservatives and liberals systematically misread the facts in a way that confirms their biases.
Interestingly, if unsurprisingly, while Klein concedes the universality of the problem in theory, all of his examples point out the biased stupidity of his political opponents. Paul Krugman – a terrific economist but an often insufferable progressive shill – sees Klein’s bid and ups the ante, exhibiting classic bias blindness: “the lived experience is that this effect is not, in fact, symmetric between liberals and conservatives.” In other words, his “lived experience” trumps the research evidence (science at work!). In Krugman’s view, conservatives are simply much stupider than liberals because reality skews liberal. He even goes so far as to deny that there are examples where liberals engage in the “overwhelming rejection of something that shouldn’t even be in dispute.” If what is being expressed is perceived to be the unvarnished truth, bias can’t be part of the equation.
Yale’s Dan Kahan, who was Klein’s primary interviewee in the referenced piece and an author of much of the relevant research, found Krugman’s view “amazingly funny,” in part because the research is so clear. Biased reasoning is in fact ideologically symmetrical. Continue reading
Terrance Odean is the Rudd Family Foundation Professor of Finance at the Haas School of Business at the University of California, Berkeley. He is a member of the Journal of Investment Consulting editorial advisory board, of the Russell Sage Behavioral Economics Roundtable, and of the WU Gutmann Center Academic Advisory Board at the Vienna University of Economics and Business. He has been an editor and an associate editor of the Review of Financial Studies, an associate editor of the Journal of Finance, a co-editor of a special issue of Management Science, an associate editor at the Journal of Behavioral Finance, a director of UC Berkeley’s Experimental Social Science Laboratory, a visiting professor at the University of Stavanger, Norway, and the Willis H. Booth Professor of Finance and Banking and Chair of the Finance Group at the Haas School of Business. As an undergraduate at Berkeley, Odean studied Judgment and Decision Making with the 2002 Nobel Laureate in Economics, Daniel Kahneman. This led to his current research focus on how psychologically motivated decisions affect investor welfare and securities prices.
Today I ask (in bold) and Terry answers what I hope are Five Good Questions as part of my longstanding series by that name (see links below). Continue reading
On June 21, 1932, after Max Schmeling lost his heavyweight boxing title to Jack Sharkey on a controversial split decision, his manager Joe Jacobs famously intoned, “We was robbed.” It’s a conviction that hits home with every fan of a losing team and thus every sports fan a lot of the time. It’s also a point of view that has received a surprising amount of academic interest and study (note, for example, this famous 1954 paper arising out of a Dartmouth v. Princeton football game).
Traditional economic theory insists that we humans are rational actors making rational decisions amidst uncertainty in order to maximize our marginal utility. As if. We are remarkably crazy a lot of the time.
Investment Belief #3: We aren’t nearly as rational as we assume
Traditional economic theory insists that we humans are rational actors making rational decisions amidst uncertainty in order to maximize our marginal utility. Sometimes we even try to believe it. But we aren’t nearly as rational as we tend to assume. We frequently delude ourselves and are readily manipulated – a fact that the advertising industry is eager to exploit.
Watch Mad Men’s Don Draper (Jon Hamm) use the emotional power of words to sell a couple of Kodak executives on himself and his firm while turning what they perceive to be a technological achievement (the “wheel”) into something much richer and more compelling – the “carousel.”
Those Kodak guys will hire Draper, of course, but their decision-making will hardly be rational. Homo economicus is thus a myth. But, of course, we already knew that. Even young and inexperienced investors can recognize that after just a brief exposure to real-world markets. The “rational man” is as non-existent as the Loch Ness Monster, Bigfoot and (perhaps) moderate Republicans. Yet the idea that we’re essentially rational creatures is a very seductive myth, especially when we apply the concept to ourselves (few lose money preying on another’s ego). We love to think that we’re rational actors carefully examining and weighing the available evidence in order to reach the best possible conclusions.
Oh that it were so. If we aren’t really careful, we will remain deluded that we see things as they really are. The truth is that we see things the way we really are. I frequently note that investing successfully is very difficult. And so it is. But the reasons why that is so go well beyond the technical aspects of investing. Sometimes what’s so hard is retaining honesty, lucidity and simplicity – seeing what is really there. Continue reading
Nearly every high school choral organization routinely performs anthems based upon some version of a familiar trope. The piece is designed to be musically trendy, even if inevitably more than a bit behind the times (when I was in school, each had an obligatory “hard rock” section). Meanwhile, the lyrics are an earnest and perhaps cloying ode to the ability of the young to create a better and brighter tomorrow. One such title from my school days was in fact “Hope for the Future.”
Unfortunately, the promise always seems better than the execution.
Despite the enormous (and most often negative) impact that our behavioral and cognitive biases have on our thinking and decision-making, the prevailing view is that we can’t do very much about them. In his famous 1974 Caltech commencement address, the great physicist Richard Feynman emphasized the importance of getting the real scoop about things, but lamented how hard it can be to accomplish. “The first principle is that you must not fool yourself – and you are the easiest person to fool.” Even Daniel Kahneman, Nobel laureate, the world’s leading authority on this subject and probably the world’s greatest psychologist, has concluded that we can’t do much to help ourselves in this regard.
But today — maybe — there might just be a tiny glimmer of hope (for the future). Continue reading
My December Research magazine column is now available online. Here’s a taste.
Innovation in financial planning typically starts with an idea. If enough people (or the right people) think it might be a good idea, it then moves to evidence-gathering for confirmation. But the entire endeavor—designed to confirm whether the idea makes sense—is inherently prone to confirmation bias. We should instead be systematically and consistently looking to disprove the idea. Without a devil’s advocate given the specific mission of trying to show why the idea is a bad one, without recrimination or criticism for doing so, many bad ideas will seem to be confirmed.
I hope you will read the whole thing.