My series on risk has sought to outline and categorize many of the risks faced by investors. As I have tried carefully to point out, risk is difficult if not impossible to define fully, much less quantify in any comprehensive way. It is surely not the same thing as volatility. Indeed, taking no risk and taking too little risk are huge risks, albeit of different sorts. But one thing we know for sure is that risk is risky.
Yet to this point in the series I have only alluded to the greatest risk of all. In the immortal words of Pogo, we have met the enemy and he is us.
Fortunately, behavioral economics has made a terrific start at outlining what these behavioral and cognitive risks look like. My post entitled Investors’ 10 Most Common Behavioral Biases from back in July describes 10 key problem areas (and has consistently been the most popular post overall on this site).
Unfortunately for us, #10 on that list — the bias blind spot – is not just true, it’s really true, reeking of truthiness, true in spades. It is our overarching problem.
We all tend to share this foible — the inability to recognize that we suffer from the same cognitive distortions and behavioral biases that plague other people. As one prominent piece of research puts it:
We cannot attribute [our adversaries'] responses to the nature of the events or issues that elicited them because we deem our own different responses to be the ones dictated by the objective nature of those events or issues. Instead …. we infer that the source of their responses must be something about them.
In other words, if we believe something to be true, we quite naturally assume those who disagree have some sort of problem. Our beliefs are deemed merely to reflect the objective facts because we think they are true. Duh. Our thought process goes something like this:
I’ve thought long and hard about it [biases leave no cognitive trace, after all] and I’m convinced I’m not a bigot. Some of my best friends are __________.
Of course, that line of thinking doesn’t convince anybody else. The research again:
We are not particularly comforted when others assure us that they have looked into their own hearts and minds and concluded that they have been fair and objective.
Of course not — they’re biased (but I’m not). It’s the same kind of thinking that allows us to smile knowingly when friends tell us about how smart, talented and attractive their children are while remaining utterly convinced as to the objective truth of the amazing attributes of our own kids.
We can only hope to deal with the bias blind spot by constantly remaining on the lookout for it. I suggest routinely consulting people with whom you disagree. Spouses can be particularly helpful here. They will, almost surely, be able to point out your biases and other faults with perfect clarity.
The existence of behavioral minefields is difficult enough. That we don’t think they apply to us is often fatal to our judgment.
So what are these biases? A set of reminders follows.
Confirmation bias means that instead of the impartial judges of information that we like to think we are, we’re much more like attorneys looking for any argument we think we can exploit without much regard for whether it’s true. It’s why Fox News and MSNBC viewers tend to see each other as some version of stupid, delusional and evil, no matter the situation or circumstances.
Optimism bias means that one’s subjective confidence in his judgment is reliably greater than his objective accuracy – we think we’re right far more often than we are. It’s why (together with confirmation bias) Little League bleachers all over America are full of parents just sure that Johnny-boy is a future Major Leaguer (or at least a prospective college scholarship winner — all evidence to the contrary) and why venture capitalists are wildly overconfident in their estimations of how likely their potential ventures are to succeed.
Our self-serving bias pushes us to see the world such that the good stuff that happens is my doing (like the coach who says “We had a great week of practice, worked hard and executed today”) while the bad stuff is always someone else’s fault (“It just wasn’t our night” or “We would have won if the refereeing hadn’t been so awful” or “We couldn’t have foreseen that 100-year flood market crisis”).
Our loss aversion means that we feel losses between two and two-and-a-half times as strongly as gains. It favors inaction over action and the status quo over any alternative. It’s one reason why football coaches are so frustratingly cautious and “go for it” far less often than the data says they should.
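That “two to two-and-a-half times” figure can be made concrete with the prospect-theory value function of Kahneman and Tversky. The sketch below uses their 1992 parameter estimates (a curvature of 0.88 and a loss-aversion coefficient of 2.25), which come from that research and are assumptions here rather than anything established in this series:

```python
# A minimal sketch of loss aversion via the prospect-theory value function.
# The parameters alpha = 0.88 and lam = 2.25 are Tversky and Kahneman's
# (1992) estimates -- assumptions for illustration, not from this article.

def felt_value(x, alpha=0.88, lam=2.25):
    """Subjective 'felt' value of a gain or loss of x dollars."""
    if x >= 0:
        return x ** alpha           # gains are valued with diminishing sensitivity
    return -lam * (-x) ** alpha     # losses get the same curve, amplified by lam

gain = felt_value(100)    # the pleasure of winning $100
loss = felt_value(-100)   # the pain of losing $100
print(abs(loss) / gain)   # roughly 2.25: the loss stings more than twice as much
```

With these parameters, a $100 loss is felt about 2.25 times as strongly as a $100 gain — which is why a coin flip offering +$100 or −$100 feels like a bad bet even though its expected value is zero.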
The planning fallacy is our tendency to underestimate the time, costs, and risks of future actions and at the same time to overestimate the benefits thereof. It’s why we overrate our own capacities and exaggerate our abilities to shape the future. It’s one reason why every building project tends to have cost overruns and why my weekend chores take at least twice as long as I expect and require three trips to Home Depot.
Intuitively, we think the more choices we have the better. However, the sad truth is that too many choices can lead to decision paralysis due to information overload. It’s why participation in 401(k) plans among employees decreases as the number of investable funds offered increases and why New Jersey diner menus (this one, for example) can be so frustrating. It’s also why we so readily rely upon heuristics (rules of thumb) rather than getting down and dirty with the data.
We routinely run in herds — large or small, bullish or bearish. Investment institutions herd even more than individuals in that investments chosen by one institution predict the investment choices of other institutions by a remarkable degree. It’s why market bubbles occur – in tulips, baseball cards, internet stocks and real estate. It’s why the NFL is a copycat league.
We inherently prefer narrative to data — often to the detriment of our understanding – even though stories are crucial to how we make sense of reality. That’s why ridiculous conspiracy theories abound, even among otherwise intelligent people, without a bit of good evidence. It’s also why we tend to distrust data unless and until a good story is attached to it.
We are all prone to recency bias, meaning that we tend to extrapolate recent events into the future indefinitely. That’s why most NFL pre-season play-off predictions look like the previous year’s play-off pool even though there is typically a 50 percent change-over in play-off teams year-to-year. And as reported by Bespoke, Bloomberg surveys market strategists on a weekly basis and asks for their recommended portfolio weightings of stocks, bonds and cash. Even though they are the supposed “experts,” their collective views are a great contra-indicator. The peak recommended stock weighting came just after the peak of the internet bubble in early 2001 while the lowest recommended weighting came just after the lows of the financial crisis. That’s recency bias.
Again, the existence of these behavioral and cognitive deficiencies is difficult enough. That we tend to think they’re other people’s problems and not our own – the bias blind spot – is the icing on the cake. As I have tried to point out repeatedly, risk is risky. Because of our behavioral biases and our tendency to think they don’t apply to us, the odds are overwhelming that we’re going to miss or ignore many of the risks that plague us. The greatest risk of all is staring us in the face when we look into the mirror. At least, the greatest risk of all is staring you in the face when you look into the mirror.
My series on risk is available at these links: