Bias Blindness Explained

Earlier today, I posted the following on Twitter, offering a 140-character explanation of bias blindness.

Jason Zweig, the wonderful columnist for The Wall Street Journal, added the following (and I agreed).

As Jason explained via email, “conscious control of unconscious bias is impossible…on an ad-hoc basis.” Our best hope, we agree, is to institute careful, blanket policies and procedures to combat them.

As Daniel Kahneman also argues, organizations are more likely to succeed at overcoming bias than individuals. That’s partly on account of resources, and partly because self-criticism is so difficult. As I have argued repeatedly, perhaps the best check on bad decision-making we have is when someone (or, when possible, an empowered team) we respect sets out to show us where and how we are wrong. Within an organization that means making sure that everyone can be challenged without fear of reprisal and that everyone (and especially anyone in charge) can be and is held accountable.

But that doesn’t happen very often. Kahneman routinely asks groups how committed they are to better decision-making and if they are willing to spend even one percent of their budgets on doing so. Sadly, as far as I know, he hasn’t had any takers yet. Smart companies and individuals will take him up on that challenge. Those that are smarter will do even more simply because there’s no substitute for good judgment.


What you’re vacillatin’ between

Francis Paschal

When I was a first-year law student at Duke many years ago, my Civil Procedure professor was the delightfully named J. Francis Paschal. Professor Paschal seemed to like to portray himself as a bit of a good ol’ boy, with a protruding gut, truly dreadful sports jackets, hair slicked and parted just off-center, and a drawl as thick as molasses on a cold day (if not nearly so sweet). That image could not mask a keen mind and a sharp wit. Nor did it hide his erudition — in addition to his credentials in the law, Professor Paschal had a Princeton Ph.D. too.

The good professor led his classes using the Socratic conventions of the day. A student was called upon to answer a series of penetrating and perplexing questions supposedly designed to ferret out the nuances of some legal principle or another but which, in reality, served to demonstrate to a class full of bright and full-of-themselves college graduates that they were out of the minors and into the intellectual big leagues. If we were going to compete at that level, we needed to up our collective game considerably.

One day fairly early in the first semester Professor Paschal called on a woman in the row ahead of me (whom I shall kindly refer to — using a pseudonym since she is now a Deputy Attorney General — as “Frieda Clancy”) and asked a typically impossible question. Since Frieda was a friend, I happened to know that her extremely difficult predicament was actually utterly impossible because she was not prepared for class. In fact, it wasn’t just that she wasn’t fully prepared (meaning that she had read the required case, all the cases cited therein, the case comments, casebook notes and citations, relevant hornbook and law review materials and anything else we could think of that might be relevant). She wasn’t prepared at all. She hadn’t even read the case at issue.

This was not likely to turn out well. Continue reading

There’s No Substitute for Good Judgment

Source: The Economist

“P.T. Barnum was right.”

So says Commander Lyle Tiberius Rourke in the Disney film Atlantis: The Lost Empire, referring to the famous expression attributed to the great American showman: “There’s a sucker born every minute.” Even though Barnum didn’t say it, we get it. In talking about the scientific method in his famous 1974 Cal Tech commencement address, Nobel laureate Richard Feynman emphasized the point: “The first principle is that you must not fool yourself – and you are the easiest person to fool.”

Accordingly, we’re right to be skeptical about our decision-making abilities in general because our beliefs, judgments and choices are so frequently wrong. That is to say that they are mathematically in error, logically flawed, inconsistent with objective reality, or some combination thereof, largely on account of our behavioral and cognitive biases. Our intuition is simply not to be trusted.

Part of the problem is (as it so often is) explained by Nobel laureate Daniel Kahneman: “A remarkable aspect of your mental life is that you are rarely stumped. … you often have [supposed] answers to questions that you do not completely understand, relying on evidence that you can neither explain nor defend.” We thus jump to conclusions quickly – far too quickly – and without a proper basis.

We aren’t stupid, of course (or at least not entirely stupid). Yet even the smartest, most sophisticated and most perceptive among us make such mistakes and make them repeatedly and predictably. That predictability, together with our innate intelligence, offers at least some hope that we can do something meaningful to counteract the problems.

One appropriate response to our difficulties in this area is to create a carefully designed and data-driven investment process with fewer embedded decisions. When decision-making is risky business, it makes sense to limit the number of decisions that need to be made. For example, it makes sense to use a variety of screens for sorting prospective investments and to make sure that such investments meet certain criteria before we put our money to work. An Investment Policy Statement that sets limits on investments, and thereby limits the number of decisions to be made (and, especially, the number that must be made quickly), is imperative.
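By way of illustration only, here is a minimal sketch in Python of what a few such pre-set screens might look like. The tickers, metrics and thresholds are entirely hypothetical and are not anyone’s actual process; the point is simply that the rules are fixed in advance, so fewer ad hoc judgments have to be made in the moment.

```python
# A minimal, purely illustrative sketch. The tickers, metrics and
# thresholds below are invented; a real process would draw its screens
# from the Investment Policy Statement.

candidates = [
    {"ticker": "AAA", "pe": 12.0, "debt_to_equity": 0.4, "dividend_yield": 0.03},
    {"ticker": "BBB", "pe": 45.0, "debt_to_equity": 2.1, "dividend_yield": 0.00},
    {"ticker": "CCC", "pe": 18.0, "debt_to_equity": 0.9, "dividend_yield": 0.02},
]

# Screens fixed in advance, before looking at any particular security.
MAX_PE = 25.0
MAX_DEBT_TO_EQUITY = 1.5
MIN_DIVIDEND_YIELD = 0.01

def passes_screens(stock):
    """Return True only if the stock clears every pre-set screen."""
    return (
        stock["pe"] <= MAX_PE
        and stock["debt_to_equity"] <= MAX_DEBT_TO_EQUITY
        and stock["dividend_yield"] >= MIN_DIVIDEND_YIELD
    )

shortlist = [s["ticker"] for s in candidates if passes_screens(s)]
print(shortlist)  # ['AAA', 'CCC']
```

Only the names that survive every screen ever reach the point where discretionary judgment comes into play, which is exactly the idea: the number of on-the-spot decisions shrinks.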

It’s even tempting to try to create a fully “automated” system. However, the idea that we can (or should) weed out human judgment entirely is silly. Choices about how to create one’s investment process must be made and somebody (or, better yet, a group of somebodies*) will have to make them. Moreover, a process built to be devoid of human judgment runs grave risks of its own.

Take the case of Adrionna Harris, a sixth grader in Virginia Beach, Virginia, for example. Continue reading

We Are Less Than Rational

Investment Belief #3: We aren’t nearly as rational as we assume

Traditional economic theory insists that we humans are rational actors making rational decisions amidst uncertainty in order to maximize our marginal utility. Sometimes we even try to believe it. But we aren’t nearly as rational as we tend to assume. We frequently delude ourselves and are readily manipulated – a fact that the advertising industry is eager to exploit.

Watch Mad Men‘s Don Draper (Jon Hamm) use the emotional power of words to sell a couple of Kodak executives on himself and his firm while turning what they perceive to be a technological achievement (the “wheel”) into something much richer and more compelling – the “carousel.”

Those Kodak guys will hire Draper, of course, but their decision-making will hardly be rational. Homo economicus is thus a myth. But, of course, we already knew that. Even young and inexperienced investors can recognize that after just a brief exposure to real-world markets. The “rational man” is as non-existent as the Loch Ness Monster, Bigfoot and (perhaps) moderate Republicans. Yet the idea that we’re essentially rational creatures is a very seductive myth, especially when we apply the concept to ourselves (few lose money preying on another’s ego). We love to think that we’re rational actors carefully examining and weighing the available evidence in order to reach the best possible conclusions.

Oh that it were so. If we aren’t really careful, we will remain deluded that we see things as they really are. The truth is that we see things the way we really are. I frequently note that investing successfully is very difficult. And so it is. But the reasons why that is so go well beyond the technical aspects of investing. Sometimes it is retaining honesty, lucidity and simplicity – seeing what is really there – that is so hard. Continue reading

Hope for the Future

Nearly every high school choral organization routinely performs anthems based upon some version of a familiar trope. The piece is designed to be musically trendy, even if it is always more than a bit behind the times (when I was in school, each had an obligatory “hard rock” section). Meanwhile, the lyrics are an earnest and perhaps cloying ode to the ability of the young to create a better and brighter tomorrow. One such title from my school days was in fact “Hope for the Future.”

Unfortunately, the promise always seems better than the execution.

Despite the enormous (and most often negative) impact that our behavioral and cognitive biases have on our thinking and decision-making, the prevailing view is that we can’t do very much about them. In his famous 1974 Cal Tech commencement address, the great physicist Richard Feynman emphasized the importance of getting the real scoop about things, but lamented how hard that can be to accomplish: “The first principle is that you must not fool yourself – and you are the easiest person to fool.” Even Daniel Kahneman, the Nobel laureate who is the world’s leading authority on this subject and probably the world’s greatest psychologist, has concluded that we can’t do much to help ourselves in this regard.

But today — maybe — there might just be a tiny glimmer of hope (for the future). Continue reading

A Critical Omission

Barry Ritholtz linked to a fine video on critical thinking today.

Of course, precious few of us exercise truly independent thought very often and none of us does so nearly often enough. As no less an expert than Daniel Kahneman acknowledges, making consistently good choices based upon good reasoning is really hard. Training ourselves to do so more intuitively is even harder. The academic research is crystal clear on that point.  That’s partly because it takes lots of practice and we’re lazy and partly because it isn’t just a skill to be learned. And that’s the critical point that the video leaves out.

Productive critical thinking requires adequate domain knowledge to go along with lots of practice.  Most personal and far too many professional investors have neither. Critical thinking can’t be effectively undertaken in investing or anywhere else unless and until there is sufficient knowledge of the subject matter.  One’s knowledge base provides the foundation of and context for engaging in critical thinking. Critical thinking alone is never enough.

Paradigm Shifting

I grew up in the investment business in the early 1990s on the ginormous fixed income trading floor of what was then Merrill Lynch in the World Financial Center in New York City. It was a culture of trading and instant gratification. “What have you done for me lately?” covered no more than that day, often less. I frequently argued that we should consider the interests of our customers – among the biggest investment managers, pension funds and insurance companies in the world – in our dealings so as to build long-term relationships and enhance longer-term profitability, even if it meant making a little less in the near-term.

I got nowhere.

We called our accounts “clients” to their faces, and I thought of them that way, but they were customers at best and marks at worst to virtually everyone on the floor, and especially to our managers. We didn’t call them “muppets,” as the guys at Goldman Sachs (and it was almost all guys then) later did, but we may as well have. Our mission was clear. We were to make as much money for the firm as we could as quickly as we could. Lunch was a long-range plan. In his first and funniest book, Liar’s Poker, about our counterparts at Salomon Brothers, Michael Lewis got the tone, the approach and the atmosphere precisely right. Continue reading

The Tragedy of Errors

Larry Walters had always wanted to fly. When he was old enough, he joined the Air Force, but his poor eyesight wouldn’t allow him to become a pilot. After he was discharged from the military, he would often sit in his backyard watching jets fly overhead, dreaming about flying and scheming about how to get into the sky. On July 2, 1982, the San Pedro, California trucker finally set out to accomplish his dream. But things didn’t turn out exactly as he planned.

Larry conceived his project while sitting outside in his “extremely comfortable” Sears lawn chair. He purchased weather balloons from an Army-Navy surplus store, tied them to his tethered Sears chair and filled the four-foot diameter balloons with helium. Then, after packing sandwiches, Miller Lite, a CB radio, a camera and a pellet gun, he strapped himself into his lawn chair (see above). His plan, such as it was, called for his floating lazily above the rooftops at about 30 feet for a while and then using the pellet gun to explode the balloons one-by-one so he could float to the ground.

But when his friends cut the cords that tethered the lawn chair to his Jeep, Walters and his lawn chair didn’t rise lazily. Larry shot up to a height of over 15,000 feet, yanked by the lift of 45 helium balloons holding 33 cubic feet of helium each.  He did not dare shoot any balloons, fearing that he might unbalance the load and cause a fall.  So he slowly drifted along, cold and frightened, with his beer and sandwiches, for more than 14 hours. He eventually crossed the primary approach corridor of LAX.  A flustered TWA pilot spotted Larry and radioed the tower that he was passing a guy in a lawn chair at 16,000 feet.

Eventually Larry conjured up the nerve to shoot several balloons before accidentally dropping his pellet gun overboard. The shooting did the trick and Larry descended toward Long Beach, until the dangling tethers got caught in a power line, causing an electrical blackout in the neighborhood below. Fortunately, Walters was able to climb to the ground safely from there.

The Long Beach Police Department and federal authorities were waiting. Regional safety inspector Neal Savoy said, “We know he broke some part of the Federal Aviation Act, and as soon as we decide which part it is, some type of charge will be filed. If he had a pilot’s license, we’d suspend that. But he doesn’t.” As he was led away in handcuffs, a reporter asked Larry why he had undertaken his mission. The answer was simple and poignant. “A man can’t just sit around.” Continue reading

Framing

According to the Major League Baseball Rulebook, Rule 2.00:

“The strike zone is that area over home plate, the upper limit of which is a horizontal line at the midpoint between the top of the shoulders and the top of the uniform pants, and the lower level is a line at the hollow beneath the kneecap. The strike zone shall be determined from the batter’s stance as the batter is prepared to swing at a pitched ball.”

But what looks at least reasonably clear on paper is anything but in practice.  Indeed, far from every strike called meets the above criteria (not that this is news to any baseball fan). This reality is based — in no small measure — upon how each pitch is “framed” by the catcher. Continue reading

Explaining the Value Premium

Value has persistently outperformed over the long term. Why is that?

In the most general terms, growth stocks are those with growing positive attributes – like price, sales, earnings, profits, and return on equity. Value stocks, on the other hand, are stocks that are underpriced when compared to some measure of their relative value – like price to earnings, price to book, and dividend yield. Thus growth stocks trade at higher prices relative to various fundamental measures of their value because (at least in theory) the market is pricing in the potential for future earnings growth. Over relatively long periods of time, each of these investing classes can and does outperform the other. For example, growth investing dominated the 1990s while value investing has outperformed since. But value wins over the long haul.
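As a rough, purely hypothetical illustration of how those labels get attached, here is a short Python sketch that buckets a few invented stocks into “value,” “growth” or “blend” using price to earnings, price to book and dividend yield. The tickers, figures and cutoffs are made up; this is not how any index provider actually draws the line.

```python
# Illustrative only: tickers, figures and cutoffs are invented.
# Value stocks look cheap on fundamentals; growth stocks trade at richer
# multiples because the market is pricing in future earnings growth.

stocks = [
    {"ticker": "VAL", "pe": 9.0,  "pb": 0.8, "dividend_yield": 0.04},
    {"ticker": "GRO", "pe": 38.0, "pb": 9.5, "dividend_yield": 0.00},
    {"ticker": "MID", "pe": 14.0, "pb": 2.5, "dividend_yield": 0.02},
]

def classify(stock, pe_cutoff=15.0, pb_cutoff=1.5, yield_cutoff=0.03):
    """Crudely label a stock 'value' if it looks cheap on at least two of
    the three measures, 'growth' if it looks cheap on none, else 'blend'."""
    cheap_signals = sum([
        stock["pe"] <= pe_cutoff,
        stock["pb"] <= pb_cutoff,
        stock["dividend_yield"] >= yield_cutoff,
    ])
    if cheap_signals >= 2:
        return "value"
    if cheap_signals == 0:
        return "growth"
    return "blend"

for s in stocks:
    print(s["ticker"], classify(s))  # VAL value, GRO growth, MID blend
```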

Continue reading