We crave certainty.
As reported by Harvard’s Daniel Gilbert on the Happy Days blog at nytimes.com, Maastricht University researchers gave (volunteer) subjects a series of 20 electric shocks. Some subjects were told that they would receive an intense shock every time, while others were told that they would receive 17 mild shocks and 3 intense ones, but that they wouldn’t know on which of the 20 the intense shocks would come. The study showed that subjects who thought there was a small chance of receiving an intense shock were more afraid — they sweated more profusely, their hearts beat faster — than those who knew for sure that they’d receive an intense shock. Apparently, people feel worse when something bad might occur than when something bad will occur; they find uncertainty more painful than the things they’re uncertain about.
Why do people seem to prefer to know the worst rather than merely to suspect it? According to Gilbert, that’s probably because when most of us get bad news we cry for a bit and then get busy making the best of things. We change our behaviors and we change our attitudes. We raise our attentiveness and lower our expectations. We find our bootstraps and pull (pretty hard if necessary). But we can’t come to terms with circumstances whose terms we don’t yet know. An uncertain future leaves us stranded in an unhappy present with nothing to do but wait.
We all respond positively to increased certainty in our lives (including in financial outcomes) — even after a major shock, and even when that certainty limits our prospective gain. In these highly uncertain times, increased certainty can be a highly valuable commodity. Unfortunately, our level of certainty, desired though it is, is not well correlated with the facts.
The day after the space shuttle Challenger exploded in 1986, Cornell psychology professor Ulric Neisser (who died last month at 83) had his students write down precisely where they’d been when they heard about the disaster. Nearly three years later, he asked them to recount it again. A quarter of the accounts were strikingly different, half were somewhat different, and less than a tenth had all the details correct. Yet all were confident that their most recent accounts were completely accurate. Indeed, many couldn’t be dissuaded even after seeing their original notes. One of them even asserted, “That’s my handwriting, but that’s not what happened.”
For neurologist Robert Burton, the Neisser study is emblematic of an essential quality of who we are. In his brilliant book, On Being Certain, Burton systematically and convincingly shows that certainty is a mental state, a feeling like anger or pride that can help guide us, but that doesn’t dependably reflect anything like objective truth. One disconcerting finding he describes is that, from a neurocognitive point of view, our feelings of certainty about things we’re right about are largely indistinguishable from our feelings of certainty about things we’re wrong about.
Such unwarranted certainty is consistent with our tendency (discussed earlier this week here and here) to build our ideologies first and then to construct narratives to support those ideologies, with facts and data sought out only after the fact to undergird our preconceived notions and subjectively “analyzed” only in that light. It also suggests why we can be so uncomfortable with the necessarily inductive process of scientific inquiry. We’d much prefer the certainty of deductive logic. Sadly, much that claims to be “research” in the financial world is nothing of the sort; it is ideology (or sales literature) in disguise (and not very well disguised at that). Even so, perceived certainty gives us the confidence we need to make decisions and to establish trust and credibility with others. It’s an ironic feedback loop of sorts.
Good science requires the careful and objective collection of data; any interpretations and conclusions drawn from that data should be tentative and provisional, subject to revision in light of subsequent findings. But that’s not what often happens, especially in the financial world. As Columbia’s Rama Cont points out, “[w]hen I first became interested in economics, I was surprised by the deductive, rather than inductive, approach of many economists.” In the hard sciences, researchers tend to observe empirical data and then build a theory to explain their observations, while “many economic studies typically start with a theory and eventually attempt to fit the data to their model.” As noted by Emanuel Derman:
In physics it’s fairly easy to tell the crackpots from the experts by the content of their writings, without having to know their academic pedigrees. In finance it’s not easy at all. Sometimes it looks as though anything goes.
I suspect that these leaps of ideological fancy are a natural result of our constant search for meaning in an environment where noise is everywhere and signal vanishingly difficult to detect. We are meaning-makers at every level and in nearly every situation. Yet, as I have noted before, information is cheap and meaning is expensive. Therefore, we tend to short-circuit good process to get to the end result – typically and not so coincidentally the result we wanted all along.
Science progresses not via verification (which can only be inferred) but via falsification (which, if established and itself verified, provides certainty as to what is not true). Thank you, Karl Popper. In our business, as in science generally, we need to build our investment processes from the ground up, with hypotheses offered only after a careful analysis of all relevant facts and held only as tentatively as the facts and data allow. Yet the markets demand action. There is nothing tentative about them. That’s the conundrum we face.
The scientific process cannot offer meaning and can only suggest interpretation. Near the end of her wonderful novel, Housekeeping, Pulitzer Prize winner (for the equally wonderful Gilead) Marilynne Robinson notes that “[f]act explains nothing. On the contrary, it is fact that requires explanation.” This is a telling observation, and one that those who are overly enamored of the scientific process are prone to ignore or forget. Science is a fabulous tool — the best we have — but also merely a tool. It is neither a be-all nor an end-all. Derman again: “[d]ata alone doesn’t tell you anything, it carries no message.” Brute fact requires both meaning and context in order to approach anything like truth or understanding. But meaning is increasingly difficult to find in a world, and with respect to markets, that demand definitive answers (or at least definitive decisions) immediately.
I’m certain of it.