Note: This post is a much re-worked and expanded version of prior posts.
Exactly eighteen years ago, on another day that lives in infamy, at a little before 6:00 a.m., Pacific Time, I was sitting in front of my Bloomberg terminal in downtown San Diego when the first, cryptic hints of trouble at the World Trade Center crawled across the bottom of my screens (I think). As the day’s events unfolded, I recalled having been on the phone on the cavernous Merrill Lynch fixed income trading floor at the World Financial Center, connected to WTC by underground walkways, doing a STRIPS trade with a client (really, an institutional “account”) who was sitting high atop the World Trade Center (2 WTC), when I heard and felt the February 26, 1993 World Trade Center bombing.
I had been scheduled to fly to New York on business on September 10, 2001 and had reservations for the week at the Marriott World Trade Center (3 WTC), which would be destroyed when the Twin Towers collapsed later that day. Instead, I decided to stay home and go to “Back to School Night” at my kids’ school.
I am really glad I didn’t get on that plane to New York.
My little story is insignificant within the context of the tragic losses, terrible evil, and timeless heroism of the “American epic” to which that day bore inexorable witness. But it is what happened to me. It was my story. It provides context and a framing device to help me remember, think about what transpired, and what it means. It is emotional to think about still. But many other stories are far more important.
The image reproduced above is central to several converging stories from that dreadful, terrible day.
The photograph, taken at the Brooklyn waterfront during the afternoon of September 11, 2001, by German photographer Thomas Hoepker, is now one of the iconic images of that day. The Observer New Review (London) republished it in 2011 as the 9.11 photograph. As you can see, it shows a group of people sitting in the sun in a park in Brooklyn while, in the background, a cloud of dust and smoke rises above lower Manhattan from the place where two hijacked airliners crashed into the Twin Towers that morning, soon causing them to collapse, killing nearly 3,000 people.
In Hoepker’s words, he saw “an almost idyllic scene near a restaurant – flowers, cypress trees, a group of young people sitting in the bright sunshine of this splendid late summer day while the dark, thick plume of smoke was rising in the background.” By his reckoning, even though he had paused for but a moment and didn’t speak to anyone shown in the picture, Hoepker was concerned that the people in the photo “were not stirred” by the events at the World Trade Center – they “didn’t seem to care.”
Hoepker published many images from that day, but he withheld this picture for over four years because, as he claimed much later, it “did not reflect at all what had transpired.” Still, in 2005, he chose it as the catalogue cover for his 9.11 retrospective in Munich. In 2006, Hoepker’s image was published here in the United States in David Friend’s book, Watching the World Change (as shown above). As Friend wrote, “it did not meet any of our standard expectations of what a September 11th photograph should look like” (emphasis in original).
As reported by The Wall Street Journal, Hoepker later began spinning a more expansive tale, comparing his photo to Bruegel’s “Landscape with the Fall of Icarus,” a Renaissance painting in which a peasant continues plowing, seemingly unaware of or unconcerned by the body of a boy (lower right in the image, below) plummeting from the sky into the sea. As W.H. Auden penned in his poem, Musée des Beaux Arts: “In Breughel’s Icarus, for instance: how everything turns away / Quite leisurely from the disaster.”
Later that year, Frank Rich wrote a 9.11 fifth anniversary column for The New York Times, framed by the Hoepker photograph, which he called “shocking.” Rich claimed that the five New Yorkers shown were “relaxing” and were already “mov[ing] on” from the attacks. Rich described them as being on “what seems to be a lunch or bike-riding break, enjoying the radiant late-summer sun and chatting away as cascades of smoke engulf Lower Manhattan in the background.” The Rich narrative then draws a remarkable conclusion, one that isn’t nuanced or qualified in the slightest.
“Mr. Hoepker’s photo is prescient as well as important – a snapshot of history soon to come. What he caught was this: Traumatic as the attack on America was, 9/11 would recede quickly for many. This is a country that likes to move on, and fast. The young people in Hoepker’s photo aren’t necessarily callous. They’re just American.”
In Rich’s view, America’s desire quickly to move on from the 9.11 tragedy, foreshadowed by Hoepker’s image, meant we didn’t learn any lasting lessons from it. It was an easy and plausible – if utterly speculative – interpretation, based upon the image alone and unsupported by any substantive evidence. More importantly, so interpreted, it framed Rich’s desired narrative perfectly.
Even though a picture may well be worth a thousand words (1,506 in Rich’s case), those words needn’t be accurate. Photographs necessarily capture just a moment in time and, as such, they need context and narrative to be understood, even if the understanding is utterly false. Moreover, unlike other artistic constructs, the Hoepker photograph has an actual context and shows real people in real life. It can be (and, in this case, was) interpreted erroneously.
As Henry David Thoreau put it, “The question is not what you look at, but what you see.” As Malachy Tallack explained, “For as long as people have been making stories, they have been inventing islands.” Or, as Jason Zweig pointed out, “In investing, as in life, too many people confuse wishes for beliefs [and] beliefs for evidence.”
Journalist David Plotz quickly came forward with an alternative interpretation of the photograph that disputed Rich, calling Rich’s reading of the image a “cheap shot.” In Plotz’s view, the five people in the picture had not ignored or moved beyond 9.11 but had “turned toward each other for solace and for debate.” To his credit, Plotz emphasized that he didn’t “really know” what the pictured people were doing or feeling and called upon them to contact him so as to set the record straight. Two did, and they repudiated Rich’s narrative in the strongest of terms.
The first to respond was Walter Sipser, a Brooklyn artist and the man on the far right in the shot. “A snapshot can make mourners attending a funeral look like they’re having a party,” he wrote. “Had Hoepker walked fifty feet over to introduce himself he would have discovered a bunch of New Yorkers in the middle of an animated discussion about what had just happened.”
Chris Schiavo, a professional photographer, Sipser’s then-girlfriend and second from the right in the photograph, also responded. She criticized both Rich and Hoepker for their “cynical expression of an assumed reality.” As a “third-generation native New Yorker, who knows and loves every square inch of this city,” whose “mother even worked for Minoru Yamasaki, the World Trade Center architect,” she stated that “it was genetically impossible for [her] to be unaffected by this event.”
Not only were Sipser and Schiavo turned into “national symbol[s] of moral disgrace” by Frank Rich, they suffered that fate under false pretenses. Hoepker and Rich, as we are all prone to do, interpreted the picture for their own purposes with a false certainty that neither the evidence nor common decency required or even supported. Their desired narratives simply carried the day, facts notwithstanding. A decade after 9.11, long after he should have been aware of his mistake, Rich was still pumping the same false narrative: “Now, ten years later, it’s remarkable how much our city, like the country, has moved on.”
Early critics of photography complained that the new technique was “too literal to compete with works of art” because it was unable to “elevate the imagination.” Not everyone shared that view. For the dissenters, now a huge majority, the camera has always been the pencil of art. Photographs are, of necessity, strictly accurate, but they needn’t be true because they are devoid of context and scope, thus offering an image no less subjective than the art of the Dutch realists or even Monet’s waterlilies. And it is no less amenable to narrative, as Hoepker and Rich should have recognized.
We love stories, true or not, almost from the cradle. Stories are crucial to how we make sense of reality. They help us to explain, understand, and interpret the world around us. They also give us a frame of reference we use to remember the concepts we take them to represent. Whether measured by my grandchildren begging for one (or “just one more”), television commercials, the book industry, data visualization, series television, journalism (which reports “stories”), the movies, the parables of Jesus, video games, cable news opinion shows, or even country music (“every song tells” one), story is perhaps the overarching human experience. It is how we think and respond. We are inveterate “storyvores,” desperate for a rich diet of narrative.
Narratives need heroes, villains, protagonists, character arcs, redemption, vindication, and meaning, all of which can overshadow or obscure what is unequivocally real, like facts and data. As the Nobel laureate Daniel Kahneman has noted, “Evidence is not all that compelling.” Near the end of her wonderful novel, Housekeeping, Pulitzer Prize winner Marilynne Robinson made the same point when she observed that, “Fact explains nothing. On the contrary, it is fact that requires explanation.”
The best stories are simple, easily communicated, easily grasped and easily remembered. Because we inherently prefer narrative to data, our understanding is routinely diminished. To do math, neither maturity nor knowledge of human nature and experience is required. All that is needed is the ability to perceive patterns, logical rules and linkages. But because of the enormous sets of random variables involved in real life, patterns, logical rules and linkages alone do not solve many actual puzzles. Correlation does not imply causation. Information may be cheap,1 but meaning is expensive and elusive. Insight is priceless.
There is a complex relationship between information and meaning, just as there is between science and philosophy, and between math and language. Language is needed to make sense of math, philosophy is required to put science into proper context, and meaning is required to make information useful. Thus, all are essential to human advance. Information, science, and math provide ways and means for answering what and how questions. Meaning, philosophy, and language aim for questions of why and the “so what.” Why do we ask the questions we do? Why can’t we answer certain questions? Why are questions and answers important in the first place? What does it mean? Why does it matter? What should we do about it? Meaning and metaphor – the essence of stories – are always central to who we are, what we think, what we believe, and what we do.
Whether we like it or not, stories are really, really powerful. National Geographic, for example, “believe[s] in the power of science, exploration and storytelling to change the world.” An oft-cited Ohio State study found that a message in story form is up to 22 times more memorable than disconnected facts alone.
As explained by mathematician John Allen Paulos, “There is a tension between stories and statistics, and one under-appreciated contrast between them is simply the mindset with which we approach them. In listening to stories we tend to suspend disbelief in order to be entertained, whereas in evaluating statistics we generally have an opposite inclination to suspend belief in order not to be beguiled.” The Significant Objects Project allowed researchers to sell worthless baubles on eBay for surprising amounts when they linked each one with a compelling story.
We are hardwired to respond to story such that a good story doesn’t feel like a story – it feels exactly like real life, but is most decidedly not like real life. It is heightened, simplified, and edited. We prefer rhetorical grace and an emotional charge to precise linearity and the work of hard thought. Because we are inveterate simplifiers, we prefer clean and clear narrative to complex reality. As Hannah Arendt explained, “storytelling reveals meaning without committing the error of defining it.”
Reality is messy. Stories? Not so much. Stories can never be truly ordinary or they wouldn’t resonate. There needs to be a narrative arc or we won’t stick around. Our lives are ordinary and we want to experience the extraordinary, to imagine being extraordinary. If the characters in a story were too real, it wouldn’t be a very good story.
Our often irrational, narrative-drunk brains aren’t all that interested in reality. The great actor and filmmaker Orson Welles threateningly intoned against the narrative overlay of pretty much all we do and are in the sort-of documentary F For Fake: “almost any story is almost certainly some kind of lie.” Echoing Daniel Defoe’s famous description of the novel as “lying like truth,” Robert McCrum says, “The moral drive of fiction is faithfully to ‘get it right’ through the contrivance of making it up.”
Karl Popper pretty well demolished the popular notion that history is a narrative, that it has a shape, a necessary progression, and follows laws of development. History is not heroic. But we believe that it is (or devoutly wish to believe that it is) nonetheless.
Stories are culture’s way of teaching us what is important. They are what allow us to imagine what might happen next – and beyond – so as to aspire to it or prepare for it. Indeed, we are always anxious to know what is going to happen (which goes a long way toward explaining binge-watching and will-they-or-won’t-they couples). As series co-creator David Lynch famously said, Twin Peaks never intended to solve the mystery of who killed Laura Palmer. Her murder served as the entry point to a much deeper and creepier story. When the killer was revealed partway through the second season at the behest of ABC – when we found out what happened – the show suffered a big drop in ratings and was cancelled.
Because it feels so true (“It can’t be wrong when it feels so right”), it isn’t hyperbole to say you’ve been lost in a story. Story turns us into willing students, eager to learn the story’s message. It’s how we sift through the raw data of our lives to ascertain what matters. Our brains are designed to analyze the environment, pick out the important parts, and use those bits to extrapolate linearly and simplistically about and into the future. The common reminder to public speakers that audiences won’t remember what they say but will remember how they were made to feel offers an important truth. We may learn facts and data but we feel stories, and that makes all the difference.
Aaron Sorkin has won an Oscar and multiple Emmys for his writing. He is one of the great storytellers of our time. As he has pointed out, “The value of storytelling is – in whatever form, whether it’s film, television, a play, a book, or a song – the same as it’s always been for thousands of years, which is that it is the most powerful delivery system ever invented for an idea.” As Kahneman emphasizes, “When you’re talking to the public at large, and you want to get action or you want to get something embraced and so on, you…have to have a story that is engaging that people can relate to.”
Great stories offer an illusion of accuracy and completeness, justified or not. Whether it’s with pro wrestling and politics or, less obnoxiously, in “based on a true story” movies or the arts generally, occasional inaccuracy (clarification, augmentation, creative license, etc.) isn’t quite denied. It is an essential part of the spectacle. Distinctions and differences from linear fact, as a whole, are a feature of good storytelling, not a bug. In other words, where literalists see red flags, artists see a parade. Still, there is a huge difference between, on the one hand, story, spin, or packaging and, on the other, falsehood, deception, or obfuscation.
Sometimes the story illusion is self-delusion. There is good evidence that narrative formation is one of the primary functions of consciousness. In this sense, we are all novelists, who put the best “faces” on our behavior. We generally try to force everything that happens into a single unique and coherent story – an autobiography – in which we are the hero. It is social media’s primary raison d’être.
Within this construct, narrative creation is a basic feature of our minds, the “inescapable frame of human existence.” We continually shape and re-shape our lives, ourselves, and our views of the world through the stories we tell ourselves and others. As the “poet laureate of medicine,” Oliver Sacks explained, “Biologically, physiologically, we are not so different from each other; historically, as narratives – we are each of us unique.” We all tend to think that we’re the only exception.
Released 20 years ago, Peter Weir’s brilliant film, The Truman Show, may be the work of art that best predicted the 21st century. It stars Jim Carrey as Truman Burbank, adopted and raised by a corporation inside the enclosed set of a reality television show built around his life, until he discovers the charade and decides to escape. The film is a taut, dark, yet funny masterpiece whose prescience only deepens with time: Truman’s entire existence since the womb has been televised, and all the people in his life have been paid actors. The show’s creator and director, Christof (Ed Harris), was a prophet when he proclaimed that, “We’re tired of actors giving us phony emotions … [The Truman Show] is not always Shakespeare, but it is real,” at least as real as most Instagram accounts.
As he discovers the truth about his existence, Truman fights to find an escape – through a door that looks like the sky – from those who have always controlled him and his phony reality. Like Truman, we all struggle with the narratives we seem to have had foisted upon us. We are all, as Devorah Goldman argues, balancing our roles as “creator and created, limited by circumstances and nature but bent on inventing ourselves and managing how we are perceived.” As Christof explains, “We accept the reality of the world with which we are presented. It’s as simple as that.”
The world to which Truman escaped (circa 1998) was significantly different from today’s world. It isn’t altogether clear that if Truman escaped to 2018 he would see much difference between his prison and his new life. When the film began showing in theaters, people doubted that anyone would watch Truman’s banal life. Today, many choose (or would love to choose) to be Truman. We are more anxious to live in a “storified” world today than ever before.
Our thinking is governed by and about beliefs, reasoning, motivations, expectations, planning, and pain. Our brains are jazz musicians of a sort, improvising what they think are the best thoughts and behaviors in and for the moment. We are incessant but inconsistent story-spinners. We are not detached reporters of either the outer or our inner worlds. Our brain acts far more as story-spinner than as objective reporter – more tabloid than academic research paper. However, we are such good story-spinners that we convince others – and ourselves too – that we’re reporting fact rather than creating fiction on the fly, making it up as we go along.
We think of our brains as linear processors of information. They aren’t. We are always wondering how the information we’re receiving fits together. Our stories provide coherence and overarching meaning, the basis for creating the desired connections. Accuracy is only an occasional by-product.
The keys to acquiring knowledge are collecting facts, drawing connections, making inferences, and predicting outcomes: the scientific method, broadly construed. The problem is that our brains insist on finding links and patterns whether they truly exist or not. We’re biologically inclined to reduce complex events to a simpler, more palatable, more easily understood pattern – in other words, a story. Accordingly, the availability heuristic pushes us to make predictions and inferences based on what most quickly comes to mind and what is most easily remembered. Hindsight bias pushes us to see past events as having been obvious and inevitable all along. Consistency bias causes us to reinterpret past events and behaviors to be consistent with new information. And so on.
Ultimately, the key to a good story isn’t just what happens or to whom it happens. As Roger Ebert so eloquently put it, broadened ever so slightly, a story “is not about what it’s about; it’s about how it’s about it.” Stories are about how the protagonist changes and how we react to those changes and ourselves change. We can “see” the world as it isn’t (yet) but as it might become.
The risks of turning everything into a story and the deceptions entailed by a good story have real consequences. As Nassim Taleb has thoughtfully explained, the narrative fallacy addresses our limited ability to look at sequences of facts without weaving an (often erroneous) explanation into them or, equivalently, forcing a logical link, an arrow of relationship upon them. Explanations bind facts together. They make them all the more easily remembered; they help them make more sense. Where this propensity often goes wrong is when it increases our impression of understanding. Or when we have the facts wrong.
Which brings us back to Frank Rich.
Five years after the towers came down, Frank Rich had a story to tell. It was a story of a “divided and dispirited” America that had lost touch with the horror of 9.11, of a forgetful nation desperate to move on, a divided nation insufficiently stirred. It was also and crucially the story of a callow, fear-mongering President with a selfish and secret partisan agenda far removed from committed sacrifice for the common good. It was a story of a once-great country that had moved on but not ahead. And he thought he had found the perfect picture to illustrate that story.
As a journalist, Rich had an obligation to check the facts supporting his story. By all appearances, he did not. Whether he tried to or not, the alleged facts upon which his story was predicated were blatantly and dreadfully wrong. That was an egregious error, an error that would make him look silly when the truth came out, as it so often does.
Many of our foibles (narrative and otherwise) are the result of our laziness. Sometimes the laziness is overt. Other times it is simply a function of the various shortcuts we take, sometimes understandably, to make life more manageable. Sometimes our stories are “too good to check.” Sometimes the point of the story matters so much to the author that its facts do not. Whatever the reason(s), Rich perverted the truth of what the Hoepker photograph actually portrayed. His “horrid” facts, necessary for his purported conclusion, were wrong, plain and simple.
Based on Jane Austen’s epistolary novel, Lady Susan, eventually published more than 50 years after her death, Love and Friendship is a terrific film. This witty black comedy of machinations and manners feels like it has more in common with Oscar Wilde than the author of Sense and Sensibility. “Facts are horrid things,” Lady Susan (played by the wonderful Kate Beckinsale) shrugs in one scene (a line taken directly from the book), dismissing culpability for her latest scandalous actions, which include sleeping with someone else’s husband. Hearing Lady Susan spinning her having been unmasked as a duplicitous scoundrel as everyone’s shame but her own is a masterful and, in context, hysterical tour de force. At least she didn’t read other people’s correspondence. Mere “horrid” facts will not deter her.
By comparison, consider the American patriot who would become our second president. When making his legal defense of British soldiers during the Boston Massacre trials in December of 1770, John Adams (played by the excellent Paul Giamatti in the clip below from the fine HBO miniseries, John Adams, based on David McCullough’s Pulitzer Prize-winning biography) offered a now famous insight: “Facts are stubborn things; and whatever may be our wishes, our inclinations, or the dictates of our passion, they cannot alter the state of facts and evidence.” In a similar vein, Sen. Daniel Patrick Moynihan once said that “[e]veryone is entitled to his own opinion, but not to his own facts.”
Facts, then, can be either “horrid” or “stubborn.” But which? And what of the difference?
David Wootton’s brilliant book, The Invention of Science, makes a compelling case that modernity began with the scientific revolution in Europe. In Wootton’s view, this was “the most important transformation in human history” since the Neolithic era and was in no small measure predicated upon a scientific mindset, which included the unprejudiced observation of nature, careful data collection, and rigorous experimentation. The “scientific way of thinking has become so much part of our culture that it has now become difficult to think our way back into a world where people did not speak of facts, hypotheses and theories, where knowledge was not grounded in evidence, where nature did not have laws.”
As Wootton explains, knowledge, as it was espoused in medieval universities and monasteries, was dominated by the ancients, the likes of Ptolemy, Galen, and Aristotle. It was widely believed that the most important knowledge was already known. Thus, learning was predominantly a backward-facing pursuit, about returning to ancient first principles, not pushing into the unknown.
Brilliant and important as he was, Aristotle posited, for example, that heavy objects fall faster than lighter objects and that males and females have different numbers of teeth, based upon some careful – though erroneous – reasoning. But it never seemed to have occurred to him that he ought to check.
It took Galileo doing the checking, centuries later, for Aristotle’s errors to be thoroughly refuted. Checking and then re-checking our ideas or work offers evidence that may tend to confirm or disprove them. Testing can also be reproduced by any skeptic, which means that you needn’t simply trust the proponent of any idea. You don’t need to take anyone’s word for things – you can check them out for yourself. That is the essence of the scientific endeavor.
To be sure, there remain historians of science (if not actual scientists) who argue in favor of a sort of relativism with respect to science. However, as Wootton suggests, one may maintain that there is no crucial difference between the Romans’ view that garlic disempowered magnets and current scientific understanding, but the garlic I can hold next to a magnet firmly “gripping” a paper clip demonstrates otherwise. This demonstration is a “killer fact,” a decisive bit of data (per John Adams, a “stubborn” fact) that, in Wootton’s words, “absolutely forces people to change their minds.”
Ponder a bit that, before the scientific revolution, the concept of truth came from the “top down” and was based upon authority, such as that of Aristotle or Rome. The great innovation of science was that it built its concept of truth from the “bottom up,” with its principles discovered rather than pronounced.
Frank Rich is emblematic of the pre-scientific past, when the concept of truth founded upon fact hadn’t yet been invented, or of the post-moderns, for whom the concept of truth is troublesome at best, an oxymoron at worst. For him, facts are “horrid” – obstacles to be overcome or ignored when spinning a favored narrative. Instead, we should aspire to be more like Adams, for whom facts were “stubborn” – realities that need accounting for if we are to obtain the best available approximation of the truth.
But that isn’t our usual modus operandi. Because stories are so powerful, we want the facts to be neatly packaged into a compelling narrative. Take a look at John Boswell‘s delightful send-up of this technique in the TED context below.
We crave “wonder, insight [and] ideas.” Facts? Not so much. As Evgeny Morozov put it:
“Today TED is an insatiable kingpin of international meme laundering – a place where ideas, regardless of their quality, go to seek celebrity, to live in the form of videos, tweets, and now e-books. In the world of TED – or, to use their argot, in the TED ‘ecosystem’ – books become talks, talks become memes, memes become projects, projects become talks, talks become books – and so it goes ad infinitum in the sizzling Stakhanovite cycle of memetics, until any shade of depth or nuance disappears into the virtual void. Richard Dawkins, the father of memetics, should be very proud. Perhaps he can explain how ‘ideas worth spreading’ become ‘ideas no footnotes can support.’”
To be clear, I am entirely and wholeheartedly in favor of using narrative to illustrate concepts. I am also in favor of making difficult concepts, and science in particular, more accessible. Moreover, there are many TED talks I find inspiring, illuminating, and useful, including some mocked in the video above. However, the issue and the danger lie in treating facts as “horrid” – forcing “stubborn” facts, what Morozov calls “messy reality,” into a glib narrative in ways that simply do not fit, or do not fit nearly so neatly (see Jonah Lehrer).
We are so susceptible to this problem (not to mention our overarching bias blindness generally) that we fall prey to it often and often don’t even recognize it. Indeed, Snopes would not exist without the narrative fallacy. Augustine had it right more than a thousand years before Descartes: Fallor ergo sum (“I err therefore I am”).
Artists looking to recreate and illuminate history – from Shakespeare’s brilliant recreation in Henry IV, Part 1 of the rise of Prince Hal, who would go on to become King Henry V, the “good Christian King” most famous (thanks in no small part to Shakespeare) for his improbable victory over France at the Battle of Agincourt, to Lin-Manuel Miranda’s astonishing retelling of America’s founding in Hamilton – have a particular burden here. The artistic impulse (in a bit of delicious irony, this idea is often falsely attributed to Twain) is never to let the facts get in the way of a good story.
Significantly, Hamilton is consistently wise to this problem in that the characters (and George Washington especially) are aware that who tells the story is crucial to how one is perceived and remembered.
As Hamilton approaches death, he contemplates the issue.
“Legacy. What is a legacy?
It’s planting seeds in a garden you never get to see.
I wrote some notes at the beginning of a song someone will sing for me.
America, you great unfinished symphony, you sent for me.”
When using narrative, the goal should be to avoid inaccuracy while illuminating existential truth. For art to be good, that combination may require taking certain liberties in telling the story. But that always risks telling a distorted (if perhaps “better”) story than the facts allow, or perhaps even a wholly different story.
We like to think that we are like judges, that we carefully gather and evaluate facts and data before coming to an objective and well-founded conclusion. Instead, we are much more like lawyers (and Frank Rich), grasping for any scrap of purported evidence we can exploit to support our preconceived notions and allegiances. Doing so is a common cognitive shortcut such that “truthiness” – “truth that comes from the gut” per Stephen Colbert – seems more useful than actual, verifiable fact. What really matters is that which “seems like truth – the truth we want to exist.” That’s because, as Colbert explained, “the facts can change, but my opinion will never change, no matter what the facts are.”
We shouldn’t be surprised, then, that when their favored narratives were falsified by the actual people in the photograph, Rich and Hoepker gave responses, such as they were, that offered little insight but, instead, heaping piles of vigorous self-justification. Need I add that they didn’t change their minds or their stories either? That they didn’t is further evidence (as if more were necessary) of what Yale’s Dan Kahan calls “identity-protective cognition.”
Ironically, the “truthiness” concept even “became a lexical prize jewel” for Rich himself (see here, for example), allowing him (of course) to criticize his political opponents for offering only “a thick fog of truthiness” such that they presented “a bogus alternative reality so relentless it can overwhelm any haphazard journalistic stabs at puncturing it.” Rich expounded on the idea a number of times in the press and on The Oprah Winfrey Show. He even wrote a book about it. Naturally, he always directs the analysis outward rather than inward.
Rich, isn’t it?2
Writing op-eds for The New York Times placed Rich in a heady and exclusive echo chamber. Yet it is an echo chamber nonetheless, not to mention a target-rich environment for confirmation bias. Keeping one’s analysis and interpretation of the facts of a story reasonably objective – since analysis and interpretation are required for data to be actionable – is really, really hard in the best of circumstances, even when one has gotten the facts right or close thereto.
Megan McArdle sums things up nicely.
“We like studies and facts that confirm what we already believe, especially when what we believe is that we are nicer, smarter and more rational than other people. We especially like to hear that when we are engaged in some sort of bruising contest with those wicked troglodytes – say, for political and cultural control of the country we both inhabit. When we are presented with what seems to be evidence for these propositions, we don’t tend to investigate it too closely. The temptation is common to all political persuasions, and it requires a constant mustering of will to resist it.”
Frank Rich, I’m looking at you. But I’m trying to look at myself too.
Once we have bought in to a particular narrative, it becomes increasingly difficult to falsify, even (especially!) when we are presented with contradictory facts. The more we repeat and reiterate our explanatory narratives, the harder it is to look for “killer” facts that might question or even overturn our preconceived notions and allegiances, and to recognize evidence that ought to cause us to re-evaluate our prior conclusions. The facts become less and less “stubborn,” more and more “horrid.”
By making it a careful habit skeptically to re-think our prior interpretations and conclusions, we can at least give ourselves a fighting chance to correct the mistakes that we will inevitably make. As with everything in science, each conclusion we draw must be tentative and subject to revision when the facts so demand. Still, and as ever, for too many people too much of the time, facts and evidence simply do not matter nearly so much as the stories to which we cling.
As John Maynard Keynes is famously said to have said, but probably didn’t, “When the facts change [when they are ‘stubborn’], I change my mind. What do you do, sir?”
Indeed, what do you do?
2 In the interest of demonstrating my nonpartisan intent, I should note that President Trump has his own favored and wholly fictional 9.11 narrative: “Hey, I watched when the World Trade Center came tumbling down. And I watched in Jersey City, New Jersey, where thousands and thousands of people were cheering as that building was coming down. Thousands of people were cheering.”