Reckoning with Risk (6): 9.11 Edition

Now a ubiquitous concept, a “black swan” is an extreme event that lies beyond the realm of our normal expectations and has enormous consequences (e.g., Donald Rumsfeld’s “unknown unknowns”). It is by definition an outlier. Examples include the rise of Hitler, winning the lottery, the fall of the Berlin Wall and the ultimate demise of the Soviet bloc, the development of Viagra (which was originally designed to treat hypertension before a surprising side effect was discovered) and, of course, the 9.11 atrocities.

As Nassim Taleb famously pointed out in his terrific book outlining the idea, most people (at least in the northern hemisphere) expect all swans to be white because that is consistent with their personal experience. Thus a black swan (native to Australia) is necessarily a surprise. Yet, once a black swan is discovered, we tend to concoct explanations for it that make it appear more predictable and less random than it actually was. This tendency is the “narrative fallacy.” Our minds are designed to retain, for efficient storage, past information that fits into a compressed narrative. The related distortion, “hindsight bias,” prevents us from adequately learning from what has gone before.

Black swans also have extreme effects, both positive and negative. Even though I think that Taleb somewhat overstates their overall significance, just a few explain a surprising amount of our history, from the success of some ideas and discoveries to events in our personal lives. Moreover, their influence seems to have grown beginning in the 20th century (on account of globalization and growing interconnectedness), while ordinary events (the ones we typically study, discuss and learn about in history books or from the news) seem increasingly inconsequential. A fascinating discussion of these ideas between Taleb and Nobel laureate Daniel Kahneman, in the context of the 2008-09 financial crisis, is available on video here.

Higher levels of complexity lead to systems that are increasingly fragile and susceptible to sudden, spectacular collapse. Indeed, John Casti’s X-Events argues that today’s highly advanced, overly complex societies have grown dangerously vulnerable to extreme events that may ultimately result in the collapse of our civilization. Examples could include a global internet or technological collapse, transnational economic meltdown or even robot uprisings.

Per Andrew Zolli in the Harvard Business Review, CalTech system scientist John Doyle calls such systems Robust-Yet-Fragile. While they are good at dealing with anticipated threats, they are quite poor at dealing with unanticipated ones. Accordingly, as the complexity of these systems grows, both the sources and the severity of possible disruptions increase, while the size of the ‘triggering event’ needed to set one off decreases. Thus it may take only a tiny event, at the wrong place or the wrong time, to spark a calamity. While the chance of any one of these possibilities actually happening is remote, our general susceptibility to that type of catastrophe is surprisingly real.

Taleb applies these concepts to the investment world by being extremely risk-averse where the risks are high and the potential gains are small, and extremely aggressive where the costs are low and the potential gains are high. He thus worries less about small failures and more about large, potentially terminal ones; more about conventional investments and less about the truly speculative ones. In essence, Taleb wants exposure to positive Black Swans, where any failure would be of small moment, while avoiding situations where he is under threat from a negative Black Swan. Taleb’s forthcoming book will seek to examine these ideas in more detail.
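Taleb’s actual positions aren’t public, but the asymmetry he describes (what he elsewhere calls a “barbell”) can be sketched numerically. Here is a toy Python model with wholly illustrative numbers — the allocation, safe rate, crash probability and payoff multiple are my assumptions, not anything Taleb recommends: most capital sits in a safe asset, while a small slice buys exposure that pays off enormously in a rare crash and expires worthless otherwise.

```python
import random

random.seed(42)

def barbell_return(safe_frac=0.90, safe_rate=0.02,
                   crash_prob=0.05, crash_payoff=20.0):
    """One-period return of a toy 'barbell': most capital in a safe
    asset, a small slice in a bet that multiplies in a rare crash
    (think far out-of-the-money puts) and expires worthless otherwise.
    All parameter values are illustrative assumptions."""
    spec_frac = 1.0 - safe_frac
    crashed = random.random() < crash_prob
    spec = crash_payoff if crashed else 0.0  # speculative slice: all or nothing
    return safe_frac * (1 + safe_rate) + spec_frac * spec

# The downside is capped by construction: even if the bet expires
# worthless every single year, the portfolio keeps 0.90 * 1.02,
# i.e. roughly 91.8% of its value.
print(round(0.90 * 1.02, 3))  # 0.918
```

The point of the structure is visible in the arithmetic: the worst case is losing only the small speculative slice, while a single tail event (the `crash_payoff` leg) returns a multiple that dwarfs years of that cost. That is what “exposure to positive Black Swans” means in practice.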

To be sure, it can be hard to distinguish between black swan “positioning” and long-range forecasting. Current long-term investment approaches focusing on food, farmland and timber probably fall into this category. Moreover, any number of potential extreme events are simply too extreme to deal with in more than a rudimentary way. For example, if the entire economic system melts down, it won’t likely matter how short of the market you are at the time. Even so, there are lessons to be learned and actions that can be considered and taken to mitigate these (seemingly increasing) risks.

Some tentative conclusions follow.

  1. Minimize downside risk exposure efficiently to the extent possible.  That may mean buying puts and/or other types of insurance.  It may mean keeping a cash cushion.  It may also mean reducing one’s reliance on a key supplier despite significant added cost.  The “efficiently” qualifier simply means that we should be careful how much we pay for protection.  Taleb argues persuasively that long-term protection from tail risk is often underpriced.  But our general risk aversion can readily push us to pay too much to avoid certain or general risks.
  2. Plan, test, evaluate, adjust and plan some more (and frequently), across the near, intermediate and long terms.  The inherent difficulty in planning (our dreadful track record in trying to make predictions about the future) has always been with us, but good planning remains good business (investment or otherwise).  In a highly uncertain environment, that means we should be using scenario planning.  Since no one base case, or even one set of cases, can be regarded as highly probable or comprehensive much of the time, it is necessary to develop robust plans based upon the assumption that multiple futures are possible and therefore to focus attention on the underlying drivers of uncertainty.
  3. Develop a clear and multi-sourced pipeline of pertinent information flow.  This approach should include obtaining and evaluating information and ideas from sources with different objectives and outlooks.  Within organizations that means trying to foster what Kahneman calls “adversarial collaboration” and making sure that everyone can be challenged without fear of reprisal and that everyone (and especially anyone in charge) is accountable.
  4. Stay flexible in terms of outlook, approach and action.  It’s easy to get caught up in ideological thinking rather than data-driven analysis, and confirmation bias makes this problem much worse.  As I have noted repeatedly, we like to think that we’re like judges, carefully evaluating the facts before coming to a rational and impartial decision.  Instead, we’re much more like attorneys, searching for alleged facts and arguments that support our preconceived positions and ideas.  It is crucial to remain willing, based upon sufficient data, to change course, perhaps quickly (both in terms of emergency response and in terms of fixing the problem).  Flexibility also means encouraging innovation at every level of an organization, including innovative and (pardon the cliché) outside-the-box thinking (because we all tend toward tunnel vision).  Finally, it means dealing with the psychological impact of being shocked by what may seem an inconceivable event of staggering proportions.  People who are that wrong often have trouble adjusting to the new reality.  It’s one reason we’re lousy at dealing with new and different situations generally but really good at gearing up to fight the last war.
  5. Don’t get lost in the details, suffering from what Taleb calls the error of excessive and naïve specificity. Future black swans are necessarily abstract and elusive.  Therefore, detailed knowledge of previous black swans isn’t likely to help much except in broad generalities.
  6. Maintain adequate alert systems and contingency plans.  Even great planning will often be wrong.  But how those failings are dealt with is crucial to future success.
  7. Being surprised by a true black swan is not the same thing as being surprised in general.  The 2008-09 financial crisis was not a black swan – it was readily foreseeable, even though the timing of it was not.  Whenever we screw up we want to claim “I couldn’t have known.  It wasn’t my fault.”  But if we’re going to get better, we’re going to have to “read the signs” more carefully and more effectively and we’re also going to have to do more and better planning for real black swans.
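The “efficiently” qualifier in the first item above can be made concrete with a toy expected-value check (all numbers below are illustrative assumptions, not market prices): an actuarially fair premium for tail protection equals the event’s annual probability times the payout, and what you pay above or below that fair price is the expected annual drag, or edge, of holding the hedge.

```python
def fair_premium(tail_prob, payout):
    """Actuarially fair annual premium: probability of the tail event
    times the payout received if it occurs."""
    return tail_prob * payout

def expected_drag(premium, tail_prob, payout):
    """Expected annual cost of the hedge: premium paid minus expected
    recovery.  Positive means the protection costs you on average;
    negative means it is underpriced (an edge for the buyer)."""
    return premium - tail_prob * payout

# Illustrative assumptions: a 1% annual tail event paying $100 of protection.
print(fair_premium(0.01, 100.0))         # 1.0  (fair price: $1 per year)
print(expected_drag(0.50, 0.01, 100.0))  # -0.5 (underpriced: buyer's edge)
print(expected_drag(3.00, 0.01, 100.0))  # 2.0  (risk aversion overpaying)
```

None of this says what the right price is, only that there is a break-even to compare against. Taleb’s claim is that long-dated tail protection often trades below that break-even, while our risk aversion can push us to pay well above it.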

The full series on risk is available here.

