Rock You Like a Superstorm

Hurricane/Superstorm Sandy rocked the eastern seaboard last week to devastating effect.  In a significant instance of good planning, markets and schools were closed, states of emergency declared and mandatory evacuations begun well before the storm made landfall.  Yet nearly until the storm reached land in New Jersey last Monday, I heard lots of grousing about alleged hysteria and overreaction with respect to the precautions and preparations being undertaken to mitigate potential damage (see below for a prominent example).   

Some went so far as to defy evacuation orders, and some people paid for doing so with their lives.  Once the storm actually hit and caused serious damage – albeit no longer officially as a hurricane, but as a “superstorm” – the complaining stopped.  Fortunately, the governmental disaster-preparedness apparatus seems to have performed well overall.  You can read about these events in many venues, including here, here and here.

The pre-crisis grousing and the refusal of so many to evacuate are worth thinking about because of what is thereby revealed about us as humans and the cognitive biases that beset us.  I offer three “take-away” thoughts that are broadly applicable as well as specifically applicable to the investment world.

1. We don’t deal well with probabilities.  When a weather forecast says that there is a 70 percent chance of sun, we tend to think that the forecaster screwed up if it rains.  But that’s not how we should evaluate probabilities.  Instead, we should consider how often it rains when the forecast calls for a 70 percent chance of sun.  When the forecast is spot-on perfect, it will rain 30 percent of the time when it calls for a 70 percent chance of sun.  The odds favor sun, but because complex systems like the weather (and financial markets) encompass so many variables, nothing approaching certainty is possible.  We don’t handle that kind of thinking very well (a very current and interesting example in a political context is examined here).
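
To see what a “correct” probabilistic forecast looks like, here is a minimal simulation (a sketch in Python, with made-up numbers): a perfectly calibrated forecaster who always calls a 70 percent chance of sun will still see rain about 30 percent of the time.

```python
import random

random.seed(42)

TRIALS = 100_000
P_SUN = 0.70  # the forecaster always calls a 70 percent chance of sun

# Count the days it rains when the true chance of sun really is 70 percent.
rainy = sum(1 for _ in range(TRIALS) if random.random() >= P_SUN)

print(f"Forecast: 70% sun.  It rained on {rainy / TRIALS:.1%} of days.")
# Prints roughly 30% -- the forecast "fails" nearly a third of the time
# even though it is as good as a forecast of this system can possibly be.
```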

To illustrate the level of complexity I’m talking about, consider that we can construct a linear, one-dimensional chain with 10 different links in 3,628,800 different ways.  For 100 different links, the possibilities total 10¹⁵⁸. If those are the possibilities for making a simple chain, imagine the possibilities when we’re talking about complex systems where wild randomness rules. 
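
The arithmetic is easy to verify; a two-line check (in Python) confirms both figures:

```python
import math

print(f"{math.factorial(10):,}")                     # 3,628,800 orderings of 10 distinct links
print(f"10^{math.log10(math.factorial(100)):.0f}")   # roughly 10^158 for 100 links
```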

Perhaps the key argument of Nobel laureate Daniel Kahneman’s brilliant book, Thinking, Fast and Slow, is that without careful and intentional deliberation (and often even then), we suffer from probabilistic irrationality. Remember back in 2009 when New England Patriots coach (and my former New Jersey neighbor) Bill Belichick famously decided to go for a first down on fourth-and-two in Patriots territory rather than punt while up six points late against Peyton Manning and the Indianapolis Colts?  When Wes Welker was stopped just short of the first down and the Colts went on to score the winning touchdown, the criticism was overwhelming even though Belichick’s decision gave the Pats a better chance of winning. Those withering attacks simply demonstrate our difficulties with probabilities. Doing what offers the best chance of success in no way guarantees success. As analyst Bill Barnwell, who was agnostic on whether Belichick was right or wrong, wrote: “you can’t judge Belichick’s decision by the fact that it didn’t work” (bold and italics in the original). We can (and should) hope for the best while preparing for the worst.
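
A back-of-the-envelope expected-value comparison shows why. The numbers below are round, illustrative assumptions of mine, not figures from any particular analyst’s model, but they capture the structure of the argument:

```python
# Round, illustrative assumptions (not any analyst's actual model inputs).
p_convert = 0.60        # assumed chance of gaining two yards on fourth down
win_if_convert = 1.00   # a conversion effectively ends the game
win_if_fail = 0.45      # assumed chance the Pats still win after a turnover on downs
win_if_punt = 0.70      # assumed chance the Pats win after punting the ball away

win_go = p_convert * win_if_convert + (1 - p_convert) * win_if_fail
print(f"Go for it: {win_go:.0%} vs. punt: {win_if_punt:.0%}")
# Under these assumptions going for it wins 78% of the time against 70% for
# punting.  The better bet lost that night, but it was still the better bet.
```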

The world is wildly random.  With so many variables, even the best process (when we are able to overcome our probabilistic irrationality) can be undermined at many points, a significant number of which are utterly out of anyone’s control.   As Nate Silver reports in his fine new book, The Signal and the Noise, the National Weather Service is extremely good at weather forecasting in a probabilistic sense. When the NWS says there is a 70 percent chance of sun, it’s sunny just about 70 percent of the time.  Because we don’t think probabilistically (and crave certainty too), we tend to assume that the forecasts on the days it rains – 30 percent of the time – are wrong.  Accordingly, when a probabilistic forecast of a dangerous hurricane is generally inconsistent with our experience (“I didn’t have a problem last time”) and isn’t what we want to hear (think confirmation bias), we can readily focus on the times we remember weather forecasts being “wrong” and discount the threat.  As mathematician John Allen Paulos tweeted regarding the trouble that so many seem to have with election probabilities:

Many people’s notion of probability is so impoverished that it admits of only two values: 50-50 and 99%, tossup or essentially certain.

In a fascinating research study, economists Emre Soyer and Robin Hogarth showed the results of a regression analysis to a test population of economics professors. When they presented the results in the way most commonly done in economics journals (as a single number accompanied by some error measures), the economists — whose careers are largely predicated upon doing just this sort of analysis! — did an embarrassingly poor job of answering a set of questions about the probabilities of various outcomes. When they presented the results as a scatter graph, the economists got most of the questions right. Yet when they presented the results both ways, the economists got most of the questions wrong again. As Justin Fox emphasizes, there seems to be something about a single-number probability assessment that lures our primitive brains in and leads them astray.
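
Here is a hedged sketch of the underlying problem (the data and model are invented for illustration): a regression’s headline coefficient can be estimated precisely even when individual outcomes vary enormously, which is exactly what a scatter plot makes obvious and a single number hides.

```python
import random

random.seed(1)

# Invented data: a true slope of 2 buried in heavy noise.
xs = [random.uniform(0, 10) for _ in range(1_000)]
ys = [2 * x + random.gauss(0, 15) for x in xs]

# Ordinary least squares slope, computed by hand.
n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
residuals = [y - (my + slope * (x - mx)) for x, y in zip(xs, ys)]
resid_sd = (sum(r * r for r in residuals) / n) ** 0.5

print(f"Estimated slope: {slope:.2f}")                  # the single reported number
print(f"Typical prediction error: +/- {resid_sd:.0f}")  # the spread a scatter graph shows
# The slope comes out near 2.00, but any individual prediction misses by
# about 15 -- information a lone coefficient hides and a scatter plot cannot.
```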

Due to complexity and the wild randomness it entails, the investment world — like weather forecasting — offers nothing like certainty.  As every blackjack player recognizes, making the “right” play (probabilistically) does not ensure success.  The very best we can hope for is favorable odds and that, over a long enough period, those odds will play out (and even then only after careful research to establish the odds).  That we don’t deal well with probabilities makes a difficult situation far, far worse.
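
A toy simulation makes the point (the 51 percent edge below is an illustrative assumption, not a real blackjack figure):

```python
import random

random.seed(7)

# A stylized "right play": a repeated even-money bet that wins 51% of the time.
def session(hands=100):
    return sum(1 if random.random() < 0.51 else -1 for _ in range(hands))

results = [session() for _ in range(10_000)]
losing = sum(1 for r in results if r < 0) / len(results)

print(f"Average profit per 100 hands: {sum(results) / len(results):+.1f} units")
print(f"Losing sessions despite a positive edge: {losing:.0%}")
# The edge shows up on average, yet roughly 4 in 10 sessions still lose --
# the right play pays off only over a long enough run.
```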

2. We’re prone to recency bias too.  Recency bias means that we tend to extrapolate recent events into the future indefinitely.  Since the recent experience of residents of the eastern seaboard (Hurricane Irene) wasn’t nearly as bad as expected (despite doing significant damage), that experience was extrapolated to the present by many.  When confirmation bias (we tend to see what we want to see) and optimism bias are added to the mix, it’s no wonder so many didn’t evaluate storm risk (and don’t evaluate investment risk) very well.

3. We don’t deal well with low-probability, high-impact events.  In the aggregate, hurricanes are low-frequency but high-impact events.  As I have explained before, when people calculate the risk of hurricane damage and make decisions about hurricane insurance, they consistently misread their prior experience. This conclusion comes from a paper by Wharton Professor Robert Meyer that describes and reports on a research simulation in which participants were instructed that they were owners of properties in a hurricane-prone coastal area and were given monetary incentives to make smart choices about (a) when and whether to buy insurance against hurricane losses and (b) how much insurance to buy.

Over the course of the study (three simulated hurricane “seasons”), participants would periodically watch a map that showed whether a hurricane was building as well as its strength and course. Until virtually the last second before the storm was shown to reach landfall, the participants could purchase partial insurance ($100 per 10 percent of protection, up to 50 percent) or full coverage ($2,500) on the $50,000 home they were said to own. Participants were advised how much damage each storm was likely to cause and, afterward, the financial consequences of their choices. They had an unlimited budget to buy insurance.  Those who made the soundest financial decisions were eligible for a prize.

The focus of the research was to determine whether there are “inherent limits to our ability to learn from experience about the value of protection against low-probability, high-consequence events.”  In other words — whether experience can help us deal with tail risk. Sadly, we don’t deal with this type of risk management very well. Moreover, as Nassim Taleb has shown, such risks — while still not anything like frequent — happen much more often than we tend to think (which explains why the 2008-09 financial crisis was deemed so highly unlikely by the vast majority of experts and their models). 

The bottom line here is that participants seriously under-protected their homes. The first year, they sustained losses almost three times higher than if they had bought protection rationally. The key problem was a consistent failure to buy protection or enough protection even when serious and imminent risk was obvious (sounds like people refusing to evacuate, doesn’t it?).  Moreover, most people reduced the amount of protection they bought whenever they endured no damage in the previous round, even if that lack of damage was specifically the result of having bought insurance.
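
To see what buying “rationally” involves, here is a sketch using the study’s stated prices; the per-season storm probability and damage figure are assumptions of mine, not numbers from the paper:

```python
FULL_PREMIUM = 2_500      # full coverage, per the study's setup
PARTIAL_PER_10PCT = 100   # $100 buys 10% protection, up to 50%

# Assumed for illustration only (NOT from the paper): a 10% chance per
# season of a storm doing $20,000 of damage to the $50,000 home.
p_storm = 0.10
storm_damage = 20_000

def expected_cost(coverage):
    """Premium paid plus expected uninsured loss for one season."""
    premium = FULL_PREMIUM if coverage >= 1.0 else PARTIAL_PER_10PCT * coverage * 10
    return premium + (1 - coverage) * p_storm * storm_damage

for coverage in (0.0, 0.3, 0.5, 1.0):
    print(f"{coverage:.0%} coverage -> expected seasonal cost ${expected_cost(coverage):,.0f}")
# 0% -> $2,000, 30% -> $1,700, 50% -> $1,500, 100% -> $2,500: under these
# assumptions the 50% maximum partial coverage minimizes expected cost, yet
# participants routinely bought little or none at all.
```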

Experience helped a little.  Participants got better at the game as season one progressed, but they slipped back into old habits when season two began. By season three, these simulated homeowners were still suffering about twice as much damage as they should have.  As Meyer’s paper reports, these research results are consistent with patterns seen in actual practice. For example, the year after Hurricane Katrina there was a 53% increase in new flood-insurance policies issued nationally.  But within two years, cancellations had brought the coverage level down to pre-Katrina levels.

We simply don’t do a very good job dealing with low-probability, high-impact events, even when we have experience with them.  Since those in the northeast have so little experience with hurricanes, their discounting of hurricane risk is (again) even more understandable.  Given what happened to the vast majority of investment portfolios in 2008-09, the alleged market “professionals” often don’t manage tail risk very well either. That said, when a low-frequency event is treated as a certainty or near-certainty as a matter of policy, that overreaction can be disastrous and the costs too high to bear, as a Navy SEAL Commander here in San Diego once took great pains to explain to me in the context of fighting terrorism.

Taleb goes so far as to assert that we should “ban the use of probability.”  I disagree, but we ought to use probabilities with care and be particularly careful about how we convey probability assessments.  For example, a potential range of outcomes is better than a single number (as with the scatter graphs noted above).  Similarly, an analysis that weighs probabilities together with costs and potential outcomes will also help (this discussion makes a start in that direction).  Despite the risk of being perceived as “crying wolf,” we intuitively understand that as the potential negative outcomes grow, lower-likelihood events should be treated more seriously, and that the progression is typically non-linear.
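
One way to make that weighing explicit is to score each scenario by probability times impact, with a penalty that grows faster than the impact itself. The scenarios and the 1.5 exponent below are illustrative assumptions, not calibrated values:

```python
# Expected loss weighs probability against impact; the exponent adds a
# risk-aversion penalty so catastrophic outcomes count more than
# proportionally.  The exponent 1.5 is an illustrative assumption.
def weighted_risk(probability, impact, aversion=1.5):
    return probability * impact ** aversion

scenarios = [
    ("Routine storm",      0.30,   1_000),
    ("Severe storm",       0.05,  50_000),
    ("Catastrophic storm", 0.01, 500_000),
]
for name, p, loss in scenarios:
    print(f"{name}: expected loss ${p * loss:>8,.0f}, "
          f"risk-weighted score {weighted_risk(p, loss):>12,.0f}")
# Plain expected losses: $300, $2,500, $5,000.  Risk-weighted, the rare
# catastrophe dominates even more -- the non-linear progression in action.
```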

In virtually every endeavor, our cognitive biases are a consistent problem and provide a constant challenge.  In terms of investing, they can and often do rock us like a hurricane — or at least a superstorm.  As Cullen Roche points out, consistent with the research noted above, we can and should learn from our investment errors, cognitive or otherwise.  Sadly, we do so far less often than we ought, as last week’s events amply demonstrate.
