As the saying goes (often falsely attributed to Lenin), there are decades where nothing happens, and there are weeks where decades happen. We just lived through one of those decade-weeks. It “was a week of escalation. Of the coronavirus and the countermeasures. Of the economic hit and the policy response. Of investor fear and market stress. In the space of six days, warning lights flashed in the deepest recesses of global markets, a decade’s worth of monetary action, and a scramble for liquid assets surpassing any that went before.”
We all suffer from the cognitive and behavioral biases I have been highlighting in this series. Lots more, too. We’re often “dumb, panicky, dangerous animals.” These biases threaten any hope we might have of objective analysis, especially about the things that are closest to us and in which we have the most invested.
We don’t perceive the world nearly as well as we think we do.
Most especially, we aren’t as self-aware as we think we are.
If we are aware at all, we will frequently recognize these behavioral and cognitive weaknesses in others – especially the most egregious examples. But we will almost never recognize them in ourselves. That’s because everybody else is expressing opinions while we are stating facts. Or so it seems.
That reality – that failing – is bias blindness, our inability or unwillingness, even if and when we see it in others, to see the biases that beset us. Bias is everywhere. So is bias blindness, no matter how willing – and even eager – we are to deny it. As Jesus said: “It’s easy to see a smudge on your neighbor’s face and be oblivious to the ugly sneer on your own.”
Bias blindness is the most significant bias of all.
As the Joker says, “Sometimes I remember it one way, sometimes another. If I’m going to have a past, I prefer it to be multiple choice.”
In the “Author’s Message” to his thriller, State of Fear, in which the hero scientist questions the global scientific consensus on climate change, the late Michael Crichton made the point that “politicized science is dangerous,” and then added, “Everybody has an agenda. Except me.”
In one large and representative sample, more than 85 percent of respondents believed they were less biased than the average American. Another study of those who were sure of their better-than-average status found that they “insisted that their self-assessments were accurate and objective even after reading a description of how they could have been affected by the relevant bias.” Meanwhile, participants judged their peers’ self-serving attributions regarding test performance to be biased while deeming their own similarly self-serving attributions free of that bias.
We are emotional more than rational. Our beliefs, preferences, and choices can and do change, often for poor reasons, and those choices often foreclose or limit later ones. Finally and crucially, these weaknesses are mostly opaque to us. They leave no cognitive trace.
We think that reality only applies to somebody else. We’re often wrong but never in doubt.
When Jane Curtin was asked if the person she was mimicking for a screen role knew that she was the source material, she replied, “I used to do my aunt when I was doing improv, and she always thought I was doing my other aunt.”
George Washington was well aware of his bias blindness, as reflected by his famous Farewell Address, yet another reason for his greatness.
“Though, in reviewing the incidents of my administration, I am unconscious of intentional error, I am nevertheless too sensible of my defects not to think it probable that I may have committed many errors. Whatever they may be, I fervently beseech the Almighty to avert or mitigate the evils to which they may tend. I shall also carry with me the hope that my country will never cease to view them with indulgence; and that, after forty-five years of my life dedicated to its service with an upright zeal, the faults of incompetent abilities will be consigned to oblivion, as myself must soon be to the mansions of rest.”
Check out Lin-Manuel Miranda’s brilliant musical version from Hamilton. It’s magic.
Warren Buffett put it really well. “What the human being is best at doing is interpreting all new information so that their prior conclusions remain intact.” And who better to illustrate it than Dr. Sheldon Cooper?
“Howard, you know me to be a very smart man.
Don’t you think if I were wrong, I’d know it?”
On our better days, we might grudgingly concede that we hold views that are wrong. The problem is in providing current examples.
A key theme in Shakespeare, for example, shows everyone thinking that they are smart enough to fool others, all the while being fools themselves.
That may explain why people on the freeway driving slower than I are dangerous idiots while people driving faster are…dangerous maniacs.
And why “everyone is stupid except me.”
Roughly to paraphrase the Swiss theologian Karl Barth, Hell is being apart from God. As C.S. Lewis wrote in The Great Divorce, “There are only two kinds of people in the end: those who say to God, ‘Thy will be done,’ and those to whom God says, in the end, ‘Thy will be done.’” In that way, Hell is having your own way and being stuck with it. If we don’t find ways at least to mitigate our mental weaknesses and shortcomings, we will be stuck in Hell. We will have our own way, sure, but we’ll continue to be stuck with it.
Our own way is inevitably human. Unfailingly and frustratingly human.
Naturally the dying man wonders to himself
Has commentary been more lucid than anybody else?
And had he successfully beaten back the rising tide
Of idiots, dilettantes, and fools
On his watch while he was alive
And it occurs to him a little late in the game
We leave as clueless as we came
For the rented heavens to the shadows in the cave
We’ll all be wrong someday
Bias, like wisdom and wealth, compounds, making “our own way” particularly excruciating. We each have 525,600 minutes per year to get things right…
…or at least righter; or even better, less wrong. Overall, things are bad enough that, usually, not stupid wins.
Fixing a problem begins with understanding that there is a problem. We humans can be remarkably yet wrongly sure of our own rightness and righteousness, no matter what others might think or what is going on around us. Note the following terrifying example.
If nothing else, I hope this series of illustrations has caused you to consider that you might not be as aware, as great, or as unbiased as you tend to think. I trust it has provided at least a bit of illumination of the bias problems that so routinely beset all of us.
We’re often wrong, but never in doubt.
By every objective measure, Joe Flacco is a decent, but not great, NFL quarterback. However, he parlayed one great playoff run into a Super Bowl championship and a huge contract. Based on that one great run – a very small sample that happened to include a Super Bowl – ESPN’s Merrill Hoge decided that Joe Flacco had become the best quarterback in the NFL.
That’s recency bias – our tendency to overweight and overemphasize what just happened over the broader body of evidence. Oh, and by the way, after that great run, Flacco went back to being okay but not great.
Its close cousin is the availability heuristic, whereby we tend to think that whatever is most readily recalled provides the best context and basis for future decisions. It’s why we’re generally far more afraid of sharks, responsible for six deaths per year worldwide, than of mosquitoes, which are responsible for 750,000 deaths per year.
In fact, as explained by Freeman Dyson, sharks actually save the lives of swimmers. On average, each swimmer killed by a shark saves the lives of ten others by keeping people out of the water: “Every time a swimmer is killed, the number of deaths by drowning goes down for a few years and then returns to the normal level. The effect occurs because reports of death by shark attack are remembered more vividly than reports of drownings.”
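The mismatch between felt risk and actual base rates is easy to quantify. Here is a toy comparison using nothing but the annual death counts quoted above:

```python
# Annual worldwide deaths, as quoted above.
shark_deaths = 6
mosquito_deaths = 750_000

# Mosquitoes are roughly five orders of magnitude deadlier,
# yet vividly reported shark attacks dominate what we recall.
ratio = mosquito_deaths / shark_deaths
print(ratio)  # 125000.0
```

Availability, not arithmetic, drives the fear.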
Per William James, “The attention which we lend to an experience is proportional to its vivid or interesting character; and it is a notorious fact that what interests us most vividly at the time is, other things equal, what we remember best.” Think name recognition in politics.
In the investment world, we’re especially susceptible to these biases.
It’s why we purchase defensive investment products and strategies right after the market crashes. We lock the barn door after the horse is gone.
As the famous proverb goes, that’s why we are so prone to “fight the last war.” An historical example is when France built the Maginot Line in the 1930s. It was a series of concrete fortifications constructed along the border with Germany; the French didn’t bother to fortify their border with Belgium. The Maginot Line would have been a great move in WWI (then the last war), where armies moved slowly and concrete fortifications were significant hindrances. However, it didn’t help in WWII, when Germany flanked the Maginot Line and invaded from the north, through Belgium.
We all tend to place too much emphasis on the recent and latch onto what’s immediate…like the girl next door.
We often lose sight of the longer-term because of our obsession with the vividly immediate. Lunch tomorrow is a long-range plan. We routinely do without “lasting treasure…”
…to fixate on “a moment’s pleasure” even though we know – unequivocally – that focusing on the long-term is the right thing to do, in our lives and in our businesses.
Hindsight bias occurs when people feel that they “knew it all along,” that is, when they believe that an event is more predictable after it becomes known than it was before it became known.
It is closely related to outcome bias, whereby we evaluate a decision by the outcome it generates.
In 2009, with his team leading 34-28 but backed up at its own 28-yard line with two minutes left and needing two yards for a first down, New England Patriots coach Bill Belichick chose to go for it on fourth down to try to keep the ball out of opposing quarterback Peyton Manning’s hands. Although it was, statistically, the correct call, it didn’t work out.
Because it didn’t work out, Belichick received withering criticism. For example, Dan Shaughnessy of The Boston Globe called the decision “ghastly” and a “gaffe unrivaled.” NBC analyst Rodney Harrison, who had played for Belichick, called it “the worst coaching decision I’ve ever seen Bill Belichick make.” Jay Mariotti said the call was “inexplicably arrogant” and “football suicide.” On SportsCenter, Trent Dilfer called the decision “ludicrous” and “absolutely ridiculous.” Jim Litke of the AP referred to it as “a reckless gamble.”
On the other hand, had the decision turned up roses, it would surely have been hailed as brilliant and courageous. That’s outcome bias.
In one representative study, 77 percent of entrepreneurs in charge of failed start-ups believed – before the failure – that their company would grow into a successful business. After they failed, only 58 percent said they had originally believed their company would be a success.
Salena Zito made her name with sympathetic (perhaps too sympathetic) profiles of Trump voters that gained the president’s approval. In September of 2016, she famously observed that, when Trump says something obviously false, “the press takes him literally, but not seriously; his supporters take him seriously, but not literally.”
Zito also had a carefully constructed narrative to set herself apart: “When I cover politics, I don’t fly. I only take back roads. I stay in a bed and breakfast. That’s why I had a different take.”
After the election, Politico called Zito “a reporter who saw Trump’s victory coming from miles away.” Zito didn’t disagree. “Everybody, everybody thought I had lost my mind,” she said (emphasis in original). As she carefully explained: “I essentially said, ‘This race is over, nobody knows it yet.’”
She “knew it all along.” Except she didn’t.
Despite Zito’s subsequent claims (and higher-profile jobs and a book deal), she didn’t really predict a Trump victory beforehand. The alleged prediction was a July 2016 column that said the election “might be different” and that Trump “can win” Pennsylvania and thus the election. She had been similarly optimistic about the McCain and Romney candidacies. Indeed, Zito devoted one of her last columns before the 2016 election to explaining why “[a] Trump defeat will be incredibly difficult for his supporters to accept.”
It was only afterward that she said she saw it coming “from right before the convention.”
She “got the feeling that this would go for Trump.”
A few people did get it right. They knew it all along.
Even when they didn’t.
It’s a weekend tradition of sorts. On Friday evenings when I’m home, I outline what chores I plan to get done around the house on Saturday and what I need to buy to complete them. Inevitably, however, it takes a lot longer to do my chores than I expect and I routinely need to make multiple visits to Home Depot for supplies because I’m missing something (or multiple somethings). And the results aren’t often great.
Nearly all of us overrate our own capacities and exaggerate our ability to shape the future [insert discussion of Friedrich Hayek’s warnings about the folly of planning here].
British Prime Minister Lord North, on dealing with the rebellious American colonies, in 1774 exclaimed that “Four or five frigates will do the business without any military force.” As he watched German troops heading off to fight in World War I in August of 1914, Germany’s Kaiser Wilhelm II was convinced that, “You will be home before the leaves have fallen from the trees.” Three days before Pearl Harbor, U.S. Secretary of the Navy Frank Knox offered his assurance that “No matter what happens, the U.S. Navy is not going to be caught napping.” In 1956, Soviet leader Nikita Khrushchev claimed that the U.S.S.R. would destroy western capitalism economically: “We will bury you.”
A wide swath of research shows that we tend to have unreasonably optimistic expectations about the future. That flaw, per Kahneman, is the planning fallacy, which is our tendency to underestimate the time, costs, and risks of future actions and at the same time to overestimate the benefits thereof. It’s at least partly why we underestimate bad results. It’s why we think it won’t take us as long to accomplish something as it does. It’s why projects tend to cost more than we expect. It’s why the results we achieve aren’t as good as we expect. It’s why we’re dead certain even when the actual results are guaranteed to be, as Ben Carlson explains, “more than never, less than always.”
The planning fallacy projects our fanciful and self-serving renderings forward with the idea that the future can somehow be managed — and perhaps controlled — despite the lack of any actual historical support for the notion. As Adam Gopnik sagely points out, “[w]hat history generally ‘teaches’ is how hard it is for anyone to control it, including the people who think they’re making it.” Indeed, “the best argument for reading history is not that it will show us the right thing to do in one case or the other, but rather that it will show us why even doing the right thing rarely works out.”
Kahneman’s conclusion is a depressing one indeed, especially when we carefully consider its implications. “Most of us view the world as more benign than it really is, our own attributes as more favorable than they truly are, and the goals we adopt as more achievable than they are likely to be.”
Study of the planning fallacy led Oxford’s Bent Flyvbjerg to formulate what he calls “the iron law of megaprojects” – they will be over budget, over time, under benefits, over and over again. But the fallacy does have some practical utility. As Kahneman notes, “If you realistically present to people what can be achieved in solving a problem, they will find that completely uninteresting. You can’t get anywhere without some degree of over-promising.” It’s much easier to get forgiveness than permission.
This overriding problem is why I take three trips to Home Depot on Saturdays and why it takes me all day to finish a household chore I expected to take maybe an hour (which then doesn’t look right or work right). As John Lennon explained, “Life is what happens to you while you’re busy making other plans.”
Things rarely turn out the way we expect. We never have everything covered. Life happens. The future remains uncontrolled. None of us knows about tomorrow.
Planning is guessing.
“We may be through with the past, but the past ain’t through with us.”
Why do men find younger women more desirable than older women? Evolutionary psychologists tell us that younger women are more likely to be fertile and healthy, and their progeny are thus more likely to arrive and survive. Accordingly, per this thinking, male preferences are driven by genetics, through the mechanism of evolution by natural selection during the Pleistocene era, to optimize the chances of their genes being preserved.
This claim has a certain plausible elegance to it. It may even be true, despite its consistency with common stereotypes. However, despite the scientific imprimatur with which it is offered, it is — nonetheless — a just-so story, without empirical evidence to support it. The interior lives of our progenitors left no fossil record. As Stephen Jay Gould recognized, linking the behavior of humans to their evolutionary past is fraught with peril, “not least because of the difficulty of disentangling culture and biology.”
On account of the narrative fallacy, we insist on plunging ahead anyway. Even scientists.
The narrative fallacy is our general inability to look at sequences of facts or events without weaving an explanation into and around them. Explanations bind facts together in our minds and memories. The fallacy part, as the semanticist Alfred Korzybski reminded us, is that “The map is not the territory.” Korzybski emphasized that symbols (“maps”) are not the things symbolized (the “territory”). “Those who rule the symbols, rule us.”
Ursula K. Le Guin’s opening line of her science fiction classic, The Left Hand of Darkness, is brilliant: “I’ll make my report as if I told a story, for I was taught as a child on my homeworld that Truth is a matter of the imagination.” Her “homeworld” is more like ours than we’d care to admit.
The story may be apocryphal, but Maria Konnikova describes the French poet Jacques Prévert improving a blind beggar’s signage and fortunes. In place of “Blind man without a pension,” Prévert flipped the sign over and wrote, “Spring is coming, but I won’t see it.”
Don Draper doesn’t close the deal with facts — he “plays a memory” (and the irony is all the richer because the story he tells is utterly fake).
With stories, we’re “easy to convince.”
We want to be entertained.
“Here we are now, entertain us.”
As his handler Fuches reminds contract killer Barry in the HBO series of the same name, the real-life William Wallace didn’t give that iconic speech we watched and remember in Braveheart. People, he said, “don’t want honest. They want entertainment.” As Fuches repeatedly insists, “I guess everyone’s a hero of their own story, right?” Series star Bill Hader notes, “They want to see the thing that they’re not.”
In this sense, we are all novelists, who always put the best “faces” on our behavior. Stories get in the way. We’re all pretenders.
Even Erasmus recognized that most Christians will learn nothing from a sermon but remember a story told by the preacher.
The artistic impulse (in a bit of delicious irony, this idea is often falsely attributed to Mark Twain) is never to let the facts get in the way of a good story. Significantly, Lin-Manuel Miranda’s Hamilton is consistently wise to this problem in that the characters (George Washington especially) are aware that who tells the story is crucial to how one is perceived and remembered.
Stories are a “land of make believe,” both literally and figuratively.
Released 20 years ago, Peter Weir’s brilliant film, The Truman Show, may be the work of art that best predicted the 21st century. It stars Jim Carrey as Truman Burbank, adopted and raised by a corporation inside and enclosed by a reality television show of his life, until he discovers the charade and decides to escape. In this taut, dark, yet funny masterpiece, whose prescience only deepens with time, Truman’s entire existence since the womb has been televised, and all the people in his life have been paid actors.
The show’s creator and director, Christof (Ed Harris), was a prophet when he proclaimed that, “We’re tired of actors giving us phony emotions … [The Truman Show] is not always Shakespeare, but it is real,” or as real as most Instagram accounts, anyway. When the film began showing in theatres, people doubted that anyone would watch Truman’s banal life. Today, few doubt that many would choose to be Truman.
As he discovers the truth about his existence, Truman fights to find an escape – through a door that looks like the sky – from those who have always controlled him and his simplified and utterly phony reality. Like Truman, we all struggle with the narratives we seem to have been given. We are all, as Devorah Goldman argues, balancing our roles as “creator and created, limited by circumstances and nature but bent on inventing ourselves and managing how we are perceived.” As Christof explained, “We accept the reality of the world with which we are presented. It’s as simple as that.”
The world to which Truman “escaped” 20 years ago was significantly different from today’s world. It isn’t altogether clear that if Truman escaped today he would see much difference between his prison and his new life.
We like to think that we are like judges, that we carefully gather and evaluate facts and data before coming to an objective and well-founded conclusion. Instead, we are much more like lawyers, grasping for any scrap of purported evidence we can exploit to support our preconceived notions and allegiances. Doing so is a common cognitive shortcut such that “truthiness” – “truth that comes from the gut,” per Stephen Colbert – seems more useful than actual, verifiable fact. What really matters is that which “seems like truth – the truth we want to exist.” That’s because, as Colbert explained, “the facts can change, but my opinion will never change, no matter what the facts are.”
Reality is messy. Stories? Not so much. Stories can never be truly ordinary or they wouldn’t resonate. There needs to be narrative arc or we won’t stick around. Our lives are ordinary and we want to experience the extraordinary, to imagine being extraordinary. If characters in a story are too real, it wouldn’t be a very good story.
And we all love a good story.
We run with herds…
…jump on bandwagons…
…and latch on to fads, willingly – eagerly! – dispensing with good sense on the way.
We gather in packs, get in with the in-crowd…
We eagerly denounce others who pick their teams and tribes over principle and even over objective truth, but are also comforted to know that our own side “has our back.” Nuance and ambiguity are not welcome in a tribal world – dogma only. No arguments – just (confrontational) assertions. Only superheroes and villains exist in a vortex of grievance. We “are innately tribal, psychologically primed to recognize in-group and out-group before the frontal cortex gets a look-in.”
We all want to stand against the tide and for the truth — to be contrarian individualists (like everyone at Warren Buffett’s annual love-in). Alas, that doesn’t happen all that often.
Research suggests that while few Americans have a consistent ideology, our partisan identity remains consistent and is very important to us. “After deciding that we are Republicans or Democrats, we start to also call ourselves conservatives or liberals, even if we have little understanding of what those terms mean.” All of which explains why conservatives are now called RINOs (Republicans In Name Only) for expressing conventionally conservative views the current Republican president rejects.
We are inveterate herders. As the great Stephen Stills sang (and wrote):
What a field day for the heat
A thousand people in the street
Singing songs and carrying signs
Mostly say, ‘hooray for our side’
Hooray! Everybody loves a winner. And we eagerly jump on board.
The movie comedy Mike and Dave Need Wedding Dates (language very NSFW) opens with a montage of the title characters at various events, laughing, hanging out, and having a good time. Everyone around them is enjoying their funny antics.
When their parents stage an intervention of sorts and tell them how “you two ruin” every family gathering, the boys are baffled as they relate how everyone loves having them around. In response, their parents show a video of the real (rather than misremembered) events wherein the brothers ended up causing injury and massive property damage while destroying the family get-togethers. Mike and Dave are genuinely astonished by this, asking “where are the epic tracking shots of smiling faces” that they remember.
Mike and Dave have been taken in by self-serving bias, our propensity to attribute positive outcomes to skill and negative outcomes to luck. In other words, our successes reward our efforts while our failures are someone else’s fault or simply bad luck. Thus, as with the fundamental attribution error, successes are judged by outcomes and failures by our intentions.
In the same way, coaches, players, and fans attribute wins to great coaching, great players, and excellent preparation. Losses are caused by bad officiating (language NSFW).
The Houston Astros cheated like crazy in 2017 and won the World Series. Insanely, the Astros’ owner insists that because his team won it all, the cheating didn’t impact the outcome. Somehow. He couldn’t have been any more self-serving.
Ray Dalio is a hedge fund billionaire who sees himself as a genius and a hero. His firm is bizarre, secretive, and hugely successful, and its employees all rate one another’s believability pretty much constantly while algorithms control the firm’s trading. The essence of Dalio’s philosophy is that firms and groups need an “idea meritocracy” and “radical transparency” such that everybody is free to challenge everybody else without fear and “the best ideas win.”
I have no doubt that the leadership within Dalio’s firm routinely tells subordinates exactly what they think about their ideas. However, I have always been suspicious of how often Dalio’s “free exchange of ideas” works from the bottom up. No matter what Dalio insists, I wouldn’t expect (consistent with self-serving bias) those without power within the organization to challenge the elites very often. Now it seems clear my suspicions have a basis in fact. As Dalio recently told a subordinate who disagreed with him, “If you’re so smart, why aren’t you rich?”
A famous political figure has the same sort of problem.
Drivers on the freeways of Southern California, where I live, routinely deal with aggressive drivers. Even if I am paying careful attention, another driver will often enter my lane, seemingly out of nowhere, and cut me off. Sometimes these drivers will even share a not-so-friendly finger with me after they have cut me off. When something like that happens, I react poorly. I get angry. I rave. I lean on my horn.
The other drivers’ actions are all that matters. They are without excuse.
When I cut someone off, on the other hand, I have excellent justification and it controls. I didn’t see my turn or the other driver. Or I’m late. Or I got in the wrong lane by mistake (sort of).
When another driver cuts me off, the action itself establishes that he’s an idiot. When I cut somebody off, my intentions (or plausible intentions) control my moral evaluations and exonerate me — at least in my head.
That’s self-serving bias.
We all like to think that we live in Lake Wobegon, where all the women are strong, all the men are good-looking and all the children are above average. Men — good-looking or not — seem particularly prone to the overconfidence problem, based upon how often they hit on women who are way out of their league or how infrequently they ask directions.
In street parlance, “hero ball” is an epithet applied when a basketball player tries to take over a game as if he’s Michael Jordan (think Russell Westbrook). In the 1987 NBA Finals, renowned playmaker Magic Johnson – perhaps the greatest passer of all time – went “hero ball” against a triple-team of Hall of Famers to beat the Larry Bird-led Celtics, ignoring a wide-open Hall of Famer with a much better shot in the process.
Even though Magic made the shot, it’s a classic instance of overconfidence, even from a player with good reason to be supremely confident. It’s what legendary basketball impresario Pat Riley (also then the Lakers’ coach) calls “the disease of me.”1
Your mother may think you’re special. She probably does. You will almost certainly agree. Yet it’s highly unlikely that the world-at-large is quite so credulous.
Most of us think we are significantly better than average at most things (also known as illusory superiority). We’re also susceptible to the “endowment effect,” which describes the extra value we place on things just as soon as they become ours. One study even found that asking participants to imagine that a theory is their own biases them to believe in its truth. Charles Darwin nailed it: “ignorance more frequently begets confidence than does knowledge.”
A closely related problem is the Dunning-Kruger effect, whereby unskilled or even incompetent people fail to recognize their limitations. As David Wallace noted, “Many writers believe they are destined for greatness, and almost as many are wrong.”
The “fine” arts routinely offer up the likes of Raymond Roussel – “I was carrying the sun within myself and could do nothing to impede the tremendous light I was radiating” – and Florence Foster Jenkins as wonderful yet dreadful exemplars of our overconfidence.
Even if we aren’t quite as bad as Mrs. Jenkins, like Yogi, we all think we’re “smarter than the average bear” and better than we really are (for example, most millennials think that they will someday be rich). But all too often there’s precious little evidence to support it.
In one classic study, for example, 94 percent of professors rated themselves above average relative to their peers. Where were they when I was in college? In another, 93 percent of American drivers rated themselves above average, but are never on the freeway when I am.
Similarly, we all tend to think that our s*** doesn’t stink – figuratively and literally. ASAPScience confirmed in a blind smell test that we actually do like the smell of our own because the bacteria that create the smell are unique to each person. When you smell someone else’s, on the other hand, your brain detects it as something that is trying to harm you.
The wife of a famous mountaineer, killed in an avalanche, was asked if she and her husband had talked about the risks he faced. “Oh yes, we talked about it,” she said. “I was aware from the very beginning. I fully accepted the possibility that this could happen. But you can’t really prepare for it. There’s this belief that it’s not going to happen to you.”
That’s true, right up until it does happen to you, and that’s overconfidence bias.
1 Ironically, without sufficient confidence, and what is almost surely statistical overconfidence, nobody would start a business, run for president, or ask anyone out. As Wayne Gretzky famously said, you miss 100 percent of the shots you don’t take, which explains why a recent study found that users of online dating sites spend most of their time trying to contact people significantly out of their league. In too many instances, overconfidence trumps competence.
“I don’t remember the date of our title in high school. I don’t remember the date of when we won the NCAA tournament. But I can tell you when I got hurt. I can tell you when we lost. I can tell you the score. I can tell you, like, the play. I can see every play of what happened at the end of the Phoenix game when we lost on the last-second shot. Like, I can see every play that we made leading up to that. I don’t know. It just sticks out more.”
Fear and horror are what anthropologists call biocultural. We all carry the same sorts of fears. Note how Ray Bradbury described one of his characters: her heart “was a bellows forever blowing upon a little coal of fear … an ingrown light which her inner eyes stared upon with unwanting fascination.” As M.M. Owen noted, “we are creatures formed in no small part by the things to which we are averse.”
Fear rarely considers the long-term; it is usually fixated on the here and now and how it could all go badly wrong in a hurry. It is steeped in worry, especially of the unknown. Even good times are a calm before a terrible storm. No matter how good things may be, all isn’t as it seems. It’s easy to smell disaster and feel it lurking.
As Jeremy Siegel explains, “Fear has a greater grasp on human action than does the impressive weight of historical evidence.”
“Doctor, my eyes have seen the years / And the slow parade of fears.”
We all face fear. “Forty-five on the back of the jersey upon your soul.”
Every civilization believes itself on the brink of cataclysmic change. Every election is of the “Flight 93” variety. The apocalypse is always nigh. As Robert Louis Stevenson (is said to have) explained, “Sooner or later, everyone sits down to a banquet of consequences.”
Those overarching fears are at the root of our loss aversion – our preference for avoiding losses over acquiring equivalent gains. Duke Hall of Fame basketball coach Mike Krzyzewski adds, “Winners hate losing more than [they love] winning.”
PGA Tour golfers expend more effort on par putts than on birdie putts and, controlling for distance, have greater success with par putts. Being human, professional golfers are loss averse. Four-time major champion Brooks Koepka recognized: “Sometimes it’s just about how few bogeys and doubles you make.”
Per Kahneman, “the main contribution that Amos Tversky and I made during the study of decision making is a sort of trivial concept, which is that losses loom larger than gains. …As a very rough guideline, if you think two to one, you will be fairly close to the mark in many contexts” (some have questioned this finding, especially recently, but there are good reasons for thinking that questioning is in error, reasons that every financial advisor has seen among his or her clients).
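Kahneman’s rough two-to-one guideline is easy to sketch numerically. This is a toy illustration only, not the full Kahneman-Tversky prospect-theory value function (which also includes diminishing sensitivity to larger amounts), and the function name is mine:

```python
def subjective_value(outcome, loss_aversion=2.0):
    """Felt value of a dollar outcome; losses are weighted about 2-to-1."""
    if outcome >= 0:
        return outcome
    return loss_aversion * outcome  # a $100 loss feels like a $200 loss

# An even-money coin flip for $100 is exactly fair in dollars,
# but with losses weighted double it feels like a losing proposition:
felt = 0.5 * subjective_value(100) + 0.5 * subjective_value(-100)
print(felt)  # -50.0
```

That negative felt value is why people routinely turn down small, perfectly fair gambles, the pattern Kahneman says financial advisors see in their clients.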
As Tom Petty poignantly sang, “Yeah, and it’s over before you know it / It all goes by so fast / Yeah, the bad nights last forever / And the good nights don’t ever seem to last.”
Andre Agassi explained it well: “A win doesn’t feel as good as a loss feels bad, and the good feeling doesn’t last as long as the bad. Not even close.” David Letterman did too: “Maybe life is the hard way, I don’t know. When the show was great, it was never as enjoyable as the misery of the show being bad.”
Because we innately prefer safe to sorry, we often choose secure over sensational, leaving us both sorry and safe. That’s loss aversion.