Barry Ritholtz has another fine “Philosophy Phriday” piece up today on the value of not knowing. His emphasis is on recognizing what you don’t (and perhaps can’t) know so as to minimize error. He’s right, of course. This idea reminds me of the following classic commercial.
I have used this delightful video before to make the point that if we are going to get anything worthwhile out of each other’s thinking, we need to be speaking the same language — both literally and figuratively. Of course, that happens all too infrequently in our business. Indeed, we can’t often agree on even the foundations of our would-be discussions.
We are driven by our ideologies and our behavioral biases rather than facts and data. Even worse, we refuse to acknowledge (much less accept) facts and data that militate against our preconceived notions. We even suck at math and at analyzing probabilities, often causing our analytical efforts to be suspect or just plain wrong. Our investment processes — the key to investment success — are generally poor.
But that’s not the only message to glean from the video and from Barry’s insight. The connected but broader message is that we always know less — typically a lot less — than we think we do. The German coast guard officer thinks he understands what’s going down (figuratively) even though he is dangerously clueless about what is (literally) going down. Barry uses the example of not knowing where the market will be in a year even though, by his estimate (and I agree), roughly four-fifths of our industry will claim to be able to make such a forecast. As Barry appropriately points out, that hubris misleads consumers into thinking that the alleged experts know something consumers don’t, and that consumers should therefore buy what the experts are selling.
We know a lot less than we think in part because of our cognitive and behavioral biases. These biases skew our judgment and thus conspire to prevent us from knowing what we think we know. But we also know a lot less about the markets than we tend to think because investing is so hard and is influenced by so many factors. It’s impossible to keep up.
In what is now a ubiquitous concept, a “black swan” is an extreme event that lies beyond the realm of our normal expectations and carries enormous consequences (i.e., Donald Rumsfeld’s “unknown unknowns”). It is by definition an outlier. Examples include the rise of Hitler, winning the lottery, the fall of the Berlin Wall and the ultimate demise of the Soviet bloc, the development of Viagra (which was originally designed to treat hypertension before a surprising side effect was discovered) and of course the 9/11 atrocities.
Unfortunately, while today’s complex systems are generally quite good at dealing with anticipated forms of uncertainty and disruption (Rumsfeld’s “known unknowns”), which makes them highly efficient, it is the unanticipated “unknown unknowns” that are so vexing and problematic. The worst crises happen when and where we don’t expect them, strike at the heart of a system, and are exacerbated by our preexisting conviction that they can’t or won’t happen. Thus the Lehman Brothers collapse wasn’t a problem of being too big to fail, but rather a function of being too central to fail without enormous cascading impacts. Moreover, its risk models were wildly inadequate yet considered utterly reliable — a classic unknown unknown made exponentially worse by a numbing failure to consider that much less was known than was thought.
We joke in our family that my motto is “Often wrong; never in doubt.” Here’s hoping that the expression only relates to my presentation style. In reality, we should all be in doubt far more than we are. If we doubted more, we’d surely be wrong less.