Education and Intelligence Can Make Things Worse

In his 1974 Caltech commencement address, the great physicist Richard Feynman talked about the scientific method as the best means to achieve progress. Even so, notice what he emphasizes: “The first principle is that you must not fool yourself–and you are the easiest person to fool.”

Feynman is right, but why? We all like to think we are careful and rational. Sadly, we’re not, largely because of the cognitive biases that beset us.

We like to think that we carefully gather and evaluate facts and data before rationally coming to any conclusion. Instead, we tend to suffer from confirmation bias and thus reach a conclusion first. Only then do we gather facts, and we interpret those facts in ways that support our preconceived conclusion. When the conclusion fits our desired narrative, so much the better, because narratives are crucial to how we make sense of reality. They help us explain, understand, and interpret the world around us. They also give us a frame of reference we can use to remember the concepts we take them to represent. Perhaps most significantly, we inherently prefer narrative to data, often to the detriment of our understanding.

We also suffer from general overconfidence. Evidence in experimental psychology suggests that most people overestimate their own ability to complete objective tasks accurately. If you doubt the research, talk to fans of your favorite local sports team before the season starts and count how many wildly optimistic scenarios they paint.

Our self-serving bias may be even worse. That bias pushes us to see the world such that the good stuff that happens is our doing (“we had a great week of practice, worked hard and executed on Sunday”) while the bad stuff is always someone else’s fault (“it just wasn’t our night,” “we simply couldn’t catch a break,” or “we would have won if the refereeing hadn’t been so awful”).

In his terrific book, Thinking, Fast and Slow, Nobel laureate Daniel Kahneman outlines what he calls the “planning fallacy”: our tendency to underestimate the time, costs, and risks of future actions while overestimating their benefits. It’s at least partly why we underestimate bad results. It’s why we think it won’t take us as long to accomplish something as it does. It’s why projects tend to cost more than we expect. It’s why the results we achieve aren’t as good as we expect. Most fundamentally, it causes us to overestimate our ability to influence the future and future outcomes.

As I have pointed out before, one reason this problem is so acute in the investment business is the so-called “authorization imperative.” Our plans and proposals must be approved by our clients, and we have a stake in getting that approval. This dynamic leads us to understate risk and overstate potential. Perhaps we see it as easier to get forgiveness than permission, perhaps it’s just a sales pitch, or perhaps we have convinced ourselves that we’ve got everything covered (confirmation bias!). Whatever the reason, and despite any strategic benefit, we run the risk of serious misrepresentation.

Our overarching problem is called the bias blind spot: our inability to recognize that we suffer from the same cognitive distortions that plague other people. In other words, we can see that these biases are a big problem generally, just not for us. We’re especially good at spotting and pointing out the flaws of people we know. This common failing is well illustrated by some recent research.

Texting and driving is very dangerous. An experiment by Car and Driver showed that drivers who texted while driving were much more impaired than when they were driving drunk. Per the magazine: “In our test, neither subject had any idea that using his phone would slow down his reaction time so much. Like most folks, they think they’re pretty good drivers. Our results prove otherwise, at both city and highway speeds.” Numerous studies of various types confirm this danger, which should be obvious. Indeed, new data released recently by the National Highway Traffic Safety Administration show 3,092 deaths from distraction-affected crashes in 2010 alone and many more injuries.

Yet despite the obvious risks, all the public criticism, and new legal bans now in effect in 35 states, texting by drivers just keeps increasing, especially among younger motorists, according to a new NHTSA survey. More than half of all drivers believe that using a cell phone or sending a text message/e-mail makes no difference to their driving performance, yet as passengers, 90% said they would feel very unsafe if their driver were talking on a handheld cell phone or texting/e-mailing while traveling with them.

That is the bias blind spot at work.

Sadly, recent evidence suggests that being smarter, more aware, or more educated doesn’t help us deal with these cognitive difficulties more effectively. Indeed, those advantages may actually make things worse. A new study suggests that, in many instances, smarter people are more vulnerable to thinking errors, even basic ones. Moreover, “people who were aware of their own biases were not better able to overcome them.”

As Jonah Lehrer points out in The New Yorker, the driving forces behind our cognitive biases, the root causes of our irrationality, are largely unconscious, which means they remain invisible to self-analysis and impermeable to intelligence or education. And as Tauriq Moosa argues, we “cannot recognise the biases and blunders, due to a deep, complex layer of justification [we]’ve narrated to [our]selves.” The problem is even more acute when the “answer” is counterintuitive, and good investing is often wildly counterintuitive (it’s really hard to sell when we’re euphoric or buy when we’re terrified).

These biases make it extremely difficult to see how badly we screw up. Unless we are extremely lucky, we need (relative) objectivity and humility if we are going to succeed in investing and in life. Having an accountability partner or (better yet) a competent and empowered team is particularly important, precisely because we are so good at spotting what’s wrong with everybody else. So is a clear (and clearly defined) process that demands constant re-evaluation.

It’s a lead-pipe lock that we’re going to err, and err often, in the investment world. Our inherent biases ensure it. If we are to succeed, we need carefully crafted plans with screw-up contingencies built in, together with a commitment to regular re-evaluation and a rescue plan in the event of major catastrophe. We need to start by assuming that we have made errors and set out actively to find them by testing and confirming everything possible. Otherwise, we are not likely to succeed over the long term. Planning to be lucky and believing that these psychological realities don’t apply to us is a lovely (if arrogant) thought. But it’s not remotely realistic.