The Backfire Effect

If you were wrong about something important, how quickly would you want to know and how quickly would you want to do something about it? Unfortunately, the answer isn’t nearly as obvious as we’d like to think.

Mark Twain is often credited with the observation that a lie can travel halfway around the world while the truth is putting on its shoes. That’s because the truth is so much messier. Lies are created to be believable. They cater to our prejudices, whims, desires and hopes even when the truth cannot. Lies offer a good story when the truth does not. They are plausible when the truth is not. We often resist and even deny the truth. It is inherently unwieldy. It requires a careful sifting and analysis of facts in order to be discerned — we want deduction but are limited to induction most of the time. The truth is simply very hard to handle.

Of course, if we’re talking about relatively trivial matters (perhaps the distance from the earth to the moon) or about something we’re predisposed to believe anyway, we adjust our beliefs quite readily. But when truth doesn’t fit with what is important to us — when it matters — our perception of it gets caught up in who we perceive ourselves to be and in our vested interests. In those instances, attacking false information with data and related evidence often backfires, having the opposite of the desired effect. We like to think that straightforward education overcomes falsehoods, but things aren’t nearly that simple. This horrifying phenomenon — the backfire effect — was demonstrated once again recently in a study of parents’ responses to various forms of reporting that vaccines are not dangerous.

A Commitment to Truth

It seems to me, after a good deal of thought, reflection and research, that we have so much difficulty dealing with behavioral and cognitive bias in large measure because we build our belief structures precisely backwards. There’s nothing revelatory in that conclusion, obviously, because it is exactly what confirmation bias is all about. We like to think that we make (at least relatively) objective decisions based upon the best available evidence. But the truth is that we are ideological through-and-through and thus tend to make our “decisions” first — based upon our preconceived notions — and then backfill to add some supportive reasoning (which need not be very good to be seen as convincing).

I have been working on an infographic to try to illustrate the issue* and have come up with the following.

Commitment Hierarchy

The goal should be to build from the ground up — beginning with facts, working to conclusions and so on. Beliefs are interpretations of one’s conclusions about the facts. If more fervently held, they rise to the level of conviction and perhaps to the highest pyramid level, whereby one makes a major commitment to a particular cause, approach or ideology. These commitments are the things by which we tend to be defined.
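To make that bottom-up ordering concrete, here is a minimal sketch in Python. The level names mirror the pyramid; the grounding rule and the function name are my own illustrative assumptions, not part of the infographic.

```python
from enum import IntEnum

class Level(IntEnum):
    """The commitment hierarchy, bottom up; higher levels demand more support."""
    FACT = 1        # verifiable observations
    CONCLUSION = 2  # inferences drawn from the facts
    BELIEF = 3      # interpretations of one's conclusions
    CONVICTION = 4  # fervently held beliefs
    COMMITMENT = 5  # the causes, approaches and ideologies that define us

def is_well_grounded(claim: Level, support: Level) -> bool:
    """A claim is well grounded only when supported from the level beneath it.
    Confirmation bias runs the pyramid in reverse: pick the COMMITMENT first,
    then backfill the "support" afterwards."""
    return support == claim - 1

# Building upward is well grounded; backfilling downward is not.
assert is_well_grounded(Level.CONCLUSION, Level.FACT)
assert not is_well_grounded(Level.BELIEF, Level.COMMITMENT)
```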

Laugh or Cry?

Our biases make it really hard to see things clearly.

Ezra Klein (formerly of The Washington Post) has a new venture (Vox) dedicated to what he calls “explanatory journalism,” offering consistently progressive “explanations” of various policies from a talented but ideologically pure staff. Klein’s big introductory think piece cites research (already familiar to regular readers here) showing that people understand the world in ways that suit their preexisting beliefs and ideological commitments. Thus, in controlled experiments, both conservatives and liberals systematically misread the facts in ways that confirm their biases.

Interestingly, if unsurprisingly, while Klein concedes the universality of the problem in theory, all of his examples point out the biased stupidity of his political opponents. Paul Krugman – a terrific economist but an often insufferable progressive shill – sees Klein’s bid and ups the ante, exhibiting classic bias blindness: “the lived experience is that this effect is not, in fact, symmetric between liberals and conservatives.” In other words, his “lived experience” trumps the research evidence (science at work!). In Krugman’s view, conservatives are simply much stupider than liberals because reality skews liberal. He even goes so far as to deny that there are examples where liberals engage in the “overwhelming rejection of something that shouldn’t even be in dispute.” If what is being expressed is perceived to be the unvarnished truth, bias can’t be part of the equation.

Yale’s Dan Kahan, who was Klein’s primary interviewee in the referenced piece and an author of much of the relevant research, found Krugman’s view “amazingly funny,” in part because the research is so clear. Biased reasoning is in fact ideologically symmetrical.

Five Good Questions with Terry Odean

Terrance Odean is the Rudd Family Foundation Professor of Finance at the Haas School of Business at the University of California, Berkeley. He is a member of the Journal of Investment Consulting editorial advisory board, of the Russell Sage Behavioral Economics Roundtable, and of the WU Gutmann Center Academic Advisory Board at the Vienna University of Economics and Business. He has been an editor and an associate editor of the Review of Financial Studies, an associate editor of the Journal of Finance, a co-editor of a special issue of Management Science, an associate editor at the Journal of Behavioral Finance, a director of UC Berkeley’s Experimental Social Science Laboratory, a visiting professor at the University of Stavanger, Norway, and the Willis H. Booth Professor of Finance and Banking and Chair of the Finance Group at the Haas School of Business. As an undergraduate at Berkeley, Odean studied Judgment and Decision Making with the 2002 Nobel Laureate in Economics, Daniel Kahneman. This led to his current research focus on how psychologically motivated decisions affect investor welfare and securities prices.

Today I ask (in bold) and Terry answers what I hope are Five Good Questions as part of my longstanding series by that name (see links below).

We Was Robbed

On June 21, 1932, after Max Schmeling lost his heavyweight boxing title to Jack Sharkey on a controversial split decision, his manager Joe Jacobs famously intoned, “We was robbed.” It’s a conviction that hits home with every fan of a losing team and thus with every sports fan a lot of the time. It’s also a point of view that has received a surprising amount of academic interest and study (note, for example, this famous 1954 paper arising out of a Dartmouth v. Princeton football game).

Traditional economic theory insists that we humans are rational actors making rational decisions amidst uncertainty in order to maximize our marginal utility. As if. We are remarkably crazy a lot of the time.


We Are Less Than Rational

Investment Belief #3: We aren’t nearly as rational as we assume

Traditional economic theory insists that we humans are rational actors making rational decisions amidst uncertainty in order to maximize our marginal utility. Sometimes we even try to believe it. But we aren’t nearly as rational as we tend to assume. We frequently delude ourselves and are readily manipulated – a fact that the advertising industry is eager to exploit.

Watch Mad Men‘s Don Draper (Jon Hamm) use the emotional power of words to sell a couple of Kodak executives on himself and his firm while turning what they perceive to be a technological achievement (the “wheel”) into something much richer and more compelling – the “carousel.”

Those Kodak guys will hire Draper, of course, but their decision-making will hardly be rational. Homo economicus is thus a myth. But, of course, we already knew that. Even young and inexperienced investors can recognize as much after just a brief exposure to real-world markets. The “rational man” is as non-existent as the Loch Ness Monster, Bigfoot and (perhaps) moderate Republicans. Yet the idea that we’re essentially rational creatures is a very seductive myth, especially as and when we relate the concept to ourselves (few lose money preying on another’s ego). We love to think that we’re rational actors carefully examining and weighing the available evidence in order to reach the best possible conclusions.

Oh that it were so. If we aren’t really careful, we will remain deluded that we see things as they really are. The truth is that we see things the way we really are. I frequently note that investing successfully is very difficult. And so it is. But the reasons why that is so go well beyond the technical aspects of investing. Sometimes what’s hardest is retaining honesty, lucidity and simplicity – seeing what is really there.
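To put a rough number on how far actual choices stray from the textbook model, here is a minimal sketch comparing a rational expected-value calculation with a prospect-theory valuation of the same gamble. It uses the standard Kahneman-Tversky (1992) parameter estimates; the function names and the example gamble are my own illustrative assumptions.

```python
# Kahneman-Tversky (1992) estimates: losses loom about 2.25x larger than
# gains, and sensitivity to both diminishes as the stakes grow.
LAMBDA = 2.25  # loss-aversion coefficient
ALPHA = 0.88   # curvature for gains
BETA = 0.88    # curvature for losses

def expected_value(outcomes):
    """Textbook rationality: the probability-weighted average outcome."""
    return sum(p * x for p, x in outcomes)

def prospect_value(outcomes):
    """How people actually seem to weigh the same gamble: gains and losses
    are valued asymmetrically. (Probability weighting is omitted here.)"""
    def v(x):
        return x ** ALPHA if x >= 0 else -LAMBDA * ((-x) ** BETA)
    return sum(p * v(x) for p, x in outcomes)

# A fair coin flip: win $100 or lose $100.
gamble = [(0.5, 100.0), (0.5, -100.0)]
print(expected_value(gamble))  # 0.0 -- the rational actor is indifferent
print(prospect_value(gamble))  # about -36 -- the loss looms larger, so we refuse
```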

Hope for the Future

Nearly every high school choral organization routinely performs anthems based upon some version of a familiar trope. The piece is designed to be musically trendy, even though it always arrives more than a bit late (when I was in school, each had an obligatory “hard rock” section). Meanwhile, the lyrics are an earnest and perhaps cloying ode to the ability of the young to create a better and brighter tomorrow. One such title from my school days was in fact “Hope for the Future.”

Unfortunately, the promise always seems better than the execution.

Despite the enormous (and most often negative) impact that our behavioral and cognitive biases have on our thinking and decision-making, the prevailing view is that we can’t do very much about them. In his famous 1974 Caltech commencement address, the great physicist Richard Feynman emphasized the importance of getting the real scoop about things, but lamented how hard it can be to accomplish: “The first principle is that you must not fool yourself – and you are the easiest person to fool.” Even Daniel Kahneman, Nobel laureate, the world’s leading authority on this subject and probably the world’s greatest psychologist, has concluded that we can’t do much to help ourselves in this regard.

But today — maybe — there might just be a tiny glimmer of hope (for the future).

Who’s the Easiest Person to Fool?

My December Research magazine column is now available online. Here’s a taste.

Innovation in financial planning typically starts with an idea. If enough people (or the right people) think it might be a good idea, it then moves to evidence-gathering for confirmation. But the entire endeavor—designed to try to confirm if the idea makes sense—is inherently prone to confirmation bias. We should be systematically and consistently looking to disprove the idea. Without a devil’s advocate with the specific mission to try to show why the idea is a bad one, without recrimination or criticism for doing so, many bad ideas will seem to be confirmed.
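As a concrete illustration of that devil’s-advocate posture, here is a minimal sketch (the data, function name and metric are my own hypothetical assumptions, not from the column) that tries to disprove a timing idea rather than confirm it, by asking how often randomly placed signals would have done at least as well:

```python
import random

def try_to_disprove(returns, signals, n_trials=10_000, seed=42):
    """Devil's-advocate test for a timing idea.

    Rather than asking whether the backtest looks good (confirmation),
    ask how often RANDOMLY placed signals do at least as well on the
    same returns (disconfirmation). A large fraction means luck alone
    "confirms" the idea, i.e., the evidence proves nothing."""
    rng = random.Random(seed)
    observed = sum(r for r, s in zip(returns, signals) if s)
    n_on = sum(signals)
    beats = 0
    for _ in range(n_trials):
        picks = rng.sample(range(len(returns)), n_on)
        if sum(returns[i] for i in picks) >= observed:
            beats += 1
    return beats / n_trials  # estimated p-value under "it's just luck"

# Hypothetical daily returns and the idea's buy signals (1 = invested).
rets = [0.010, -0.020, 0.015, 0.003, -0.007, 0.012, -0.001, 0.008]
sigs = [1, 0, 1, 1, 0, 1, 0, 1]
print(try_to_disprove(rets, sigs))  # only a value near 0 survives the devil's advocate
```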

I hope you will read the whole thing.


Too Sure By Half

All nine of this year’s American Nobel Prize laureates shared a stage this week. Not surprisingly, economists Robert Shiller and Eugene Fama sniped at each other, if good-naturedly. But it was the winners from the harder sciences who got in the best digs at the economists. Martin Karplus, a chemist from Harvard, queried, “What understanding of the stock market do you really have?” He even went so far as to note that economics – “if one wants to call it a science” – seemed completely unable to explain market movement. There were others, but you get the point.

The obvious takeaway is a common theme – that economics is conflicted, flawed, divided, wildly inconsistent, even worthless – a point that has been made by Nassim Taleb, Jeremy Grantham and many others. The idea is that economics desperately wants to be a science, with “physics envy,” but simply isn’t up to the task. And it’s hard to contest the point.

My disagreement is not so much with the main point (even though I think academic economics has a good deal to offer, with Shiller’s work being Exhibit A in my evidence cache), but with its corollary and the casual certainty with which that corollary is expressed. “Hard” scientists increasingly express a belief that their domain is the only bastion of reality available. They see a great divide between things we can know (science) and everything else, which is opinion at best and largely worthless. That the expression of it is so often humorless and irony-challenged makes it all the worse.

The Wyatt Earp Effect

My first post for The Big Picture, the wonderful blog from Barry Ritholtz, also of The Washington Post and Bloomberg View, is now up. You may read it here. I hope you will.
