Two words. Sixteen years. Powerful emotions. Searing memories. Evocative stories.
Sixteen years ago, on Tuesday, September 11, 2001, I was sitting in front of a Bloomberg terminal when the first, cryptic hints about trouble at the World Trade Center crawled across the bottom of my screens (I think). I had been scheduled to fly to New York the day before and had reservations at the Marriott World Trade Center (3 WTC), which would be destroyed when the Twin Towers collapsed. Instead, I decided to stay home and go to a “Back to School Night” presentation at my kids’ school. As the day’s events unfolded, I recalled having been on the Merrill Lynch fixed income trading floor at the World Financial Center doing a STRIPS trade when I heard and felt the February 26, 1993 World Trade Center bombing. I was really glad I didn’t get on that plane to New York.
My little, not so evocative story is insignificant within the context of the tragic losses, horrible evil and incredible heroism of the “American epic” to which that day bore inexorable witness. But it is what happened to me. It provides context and a framing device to help me remember and think about what transpired and what it means. It is emotional to think about still. But many other stories are far more important.
The image reproduced below is central to several other converging stories from that dreadful day.
This photograph, taken at the Brooklyn waterfront in the afternoon by German photographer Thomas Hoepker, is now one of the iconic images of September 11. In fact, the Observer New Review (London) republished it in 2011 as the 9.11 photograph. It shows a group of people sitting in the sun in a park in Brooklyn while, in the background, a cloud of dust and smoke rises above lower Manhattan from the place where two hijacked airliners crashed into the Twin Towers that same morning and caused them to collapse, killing nearly 3,000 people.
In Hoepker’s words, he saw “an almost idyllic scene near a restaurant — flowers, cypress trees, a group of young people sitting in the bright sunshine of this splendid late summer day while the dark, thick plume of smoke was rising in the background.” By his reckoning, even though he had paused for but a moment and didn’t speak to anyone shown in the picture, Hoepker was concerned that the people in the photo “were not stirred” by the events at the World Trade Center – they “didn’t seem to care.” Hoepker published many images from that day, but he withheld this picture for over four years because, in his view, it “did not reflect at all what had transpired.” Still, in 2005, he chose it as the catalogue cover for his 9.11 retrospective in Munich.
In 2006, the image was published here in the States in David Friend’s book, Watching the World Change (as shown above). As Friend wrote in his book, “it did not meet any of our standard expectations of what a September 11th photograph should look like” (emphasis in original). As reported by The Wall Street Journal, Hoepker then began comparing his photo to Bruegel’s “Landscape with the Fall of Icarus,” a Renaissance painting in which a peasant continues plowing, seemingly unaware of or unconcerned by the body of a boy plummeting from the sky into the sea (see below).
Later that year, Frank Rich wrote a 9.11 fifth anniversary column for The New York Times, framed by the Hoepker photograph, which he called “shocking.” Rich claimed that the five New Yorkers shown were “relaxing” and were already “mov[ing] on” from the attacks. Rich described them as being on “what seems to be a lunch or bike-riding break, enjoying the radiant late-summer sun and chatting away as cascades of smoke engulf Lower Manhattan in the background.” The Rich explanatory narrative then draws a remarkable conclusion, one that isn’t nuanced or qualified in the slightest.
“Mr. Hoepker’s photo is prescient as well as important — a snapshot of history soon to come. What he caught was this: Traumatic as the attack on America was, 9/11 would recede quickly for many. This is a country that likes to move on, and fast. The young people in Hoepker’s photo aren’t necessarily callous. They’re just American.”
In Rich’s view, America’s desire quickly to move on from this tragedy, foreshadowed by Hoepker’s image, meant we didn’t learn any lasting lessons from it. It was a plausible – if utterly speculative – interpretation based upon the image alone, supported by no substantive evidence. More importantly, however, it framed Rich’s desired narrative perfectly. Still, even though a picture may well be worth a thousand words (1,506 in this case, to be exact), those words needn’t be accurate. Photographs necessarily capture just a moment in time and, as such, they need context and narrative to be understood, even if the resulting understanding is entirely false. Moreover, unlike other artistic constructs, this photograph has an actual context and shows real people in real life. It can be (and, in this case, was) interpreted erroneously.
David Plotz quickly came forward with an alternative interpretation that disputed Rich, calling Rich’s reading of the image a “cheap shot.” In Plotz’s view, the five people in the picture had not ignored or moved beyond 9.11 but had “turned toward each other for solace and for debate.” To his credit, Plotz emphasized that he didn’t “really know” what the pictured people were doing and feeling and called upon them to contact him so as to set the record straight. Two did, and they repudiated Rich’s narrative in the strongest of terms.
The first to respond was Walter Sipser, a Brooklyn artist and the man on the far right in the shot. “A snapshot can make mourners attending a funeral look like they’re having a party,” he wrote. “Had Hoepker walked fifty feet over to introduce himself he would have discovered a bunch of New Yorkers in the middle of an animated discussion about what had just happened.”
Chris Schiavo, a professional photographer, Sipser’s then-girlfriend and second from the right above, also responded. She criticized both Rich and Hoepker for their “cynical expression of an assumed reality.” As a “third-generation native New Yorker, who knows and loves every square inch of this city,” whose “mother even worked for Minoru Yamasaki, the World Trade Center architect,” she stated that “it was genetically impossible for [her] to be unaffected by this event.”
Not only were Sipser and Schiavo turned into “national symbol[s] of moral disgrace” by Frank Rich, they suffered that fate under false pretenses. Hoepker and Rich, as we are all prone to do, interpreted the picture for their own purposes with a false certainty that neither the evidence nor common decency required or even strongly supported. Their desired narratives simply carried the day, facts notwithstanding.
Narratives need heroes, villains, protagonists, character arcs, redemption, vindication and meaning, all of which can overshadow or obscure what is unequivocally real, like facts and data. Perhaps worse, whether you like it or not, stories are really, really powerful. An oft-cited Stanford study found that a message delivered as a story can be up to 22 times more memorable than facts alone. As explained by mathematician John Allen Paulos, “There is a tension between stories and statistics, and one under-appreciated contrast between them is simply the mindset with which we approach them. In listening to stories we tend to suspend disbelief in order to be entertained, whereas in evaluating statistics we generally have an opposite inclination to suspend belief in order not to be beguiled.”
In his fine book, A Wealth of Common Sense, Ben Carlson uses the wonderful example of The Significant Objects Project, in which researchers were able to sell worthless baubles on eBay for surprising amounts when they linked each one with a compelling story. “The problem with narratives is that they often fail as investment ideas,” he writes. “This is because those ideas are usually baked into the price.”
We love stories, true or not, almost from the cradle. Stories are crucial to how we make sense of reality. They help us to explain, understand and interpret the world around us. They also give us a frame of reference we can use to remember the concepts we take them to represent. Whether measured by my grandchildren begging for one (or “just one more”), the book industry, data visualization, television, journalism (which reports “stories”), the movies, the parables of Jesus, video games, or even country music (“every song tells” one), story is perhaps the overarching human experience. It’s how we think and respond. We always want (and think we need) to know what happens next.
Stories are culture’s way of teaching us what is important. They are what allow us to imagine what might happen next – and beyond – so as to prepare for it. We are hardwired to respond to story such that a good story doesn’t feel like a story – it feels exactly like real life, but most decidedly is not like real life. It is heightened, simplified and edited. We prefer rhetorical grace and an emotional charge to the work of hard thought. Because we are inveterate simplifiers, we prefer clean and clear narrative to messy reality. A famous book by Karl Popper, The Poverty of Historicism, pretty well demolished the popular notion that history was a narrative, that it had a shape, a necessary progression, and followed laws of development. But we believe that it does (or devoutly wish to believe that it does) anyway.
Still, because it feels so true (“It can’t be wrong when it feels so right”), it isn’t hyperbole to say you’ve been lost in a story. Story turns us into willing students, eager to learn the story’s message. It’s how we sift through the raw data of our lives to ascertain what matters. Our brains are designed to analyze the environment, pick out the important parts, and use those bits to extrapolate linearly and simplistically about and into the future. We may learn facts and data but we feel stories, and that makes all the difference.
Ultimately, the key to a good story isn’t just what happens or to whom it happens. As Roger Ebert so eloquently put it, broadened ever so slightly, a story “is not about what it’s about; it’s about how it’s about it.” Stories are about how the protagonist changes and how we react to those changes and ourselves change. We can “see” the world as it isn’t (yet) but as it might become.
The best stories are simple, easily communicated, easily grasped and easily remembered. Perhaps most significantly, we inherently prefer narrative to data – often to the detriment of our understanding. To do math, neither maturity nor knowledge of human nature and experience is required. All that is needed is the ability to perceive patterns, logical rules and linkages. But because of the enormous sets of random variables involved in real life, patterns, logical rules and linkages alone do not solve many actual puzzles. Correlation does not imply causation. Information may be cheap but meaning is expensive and elusive. Insight is priceless.
As Nassim Taleb thoughtfully explains in The Black Swan, the narrative fallacy addresses our limited ability to look at sequences of facts without weaving an (often erroneous) explanation into them or, equivalently, forcing a logical link, an arrow of relationship upon them. Explanations bind facts together. They make them all the more easily remembered; they help them make more sense. Where this propensity often goes wrong is when it increases our impression of understanding. Or when we have the facts wrong.
Which brings us back to Frank Rich.
Five years after the towers came down, Frank Rich had a story to tell. It was a story of a “divided and dispirited” America that had lost touch with the horror of 9.11, of a forgetful nation desperate to move on, a divided nation insufficiently stirred. It was also and crucially the story of a callow, fear-mongering President with a selfish and secret partisan agenda far removed from committed sacrifice for the common good. It was a story of a once-great country that had moved on but not ahead. And he thought he had found the perfect picture to illustrate that story.
As a journalist, Rich had an obligation to check the facts of his story. By all appearances, he did not. Perhaps he thought it was “too good to check.” If so, he was dreadfully and blatantly wrong. Anyone practicing rudimentary journalism would have at least made an effort to contact the people involved before purporting to know their thoughts, feelings and motives. Perhaps he tried and was unsuccessful or decided that Hoepker’s description was enough to go on. If so, he didn’t try hard enough and also had an obligation to be forthright about what he knew and what was mere speculation. That he did not was an egregious error, an error that would make him look silly when the truth came out, as it so often does.
Many of our foibles (narrative and otherwise) are the result of our laziness. Sometimes the laziness is overt. Other times it is simply a function of the various shortcuts we take, sometimes reasonably, to make life more manageable. Rich took a variety of shortcuts in writing his story, shortcuts that perverted the truth of what the Hoepker photograph actually portrayed. His “horrid” facts, necessary for his purported conclusion, were wrong, plain and simple.
Based on Jane Austen’s epistolary novel, Lady Susan, eventually published more than 50 years after her death, Love & Friendship is a terrific film. This witty black comedy of manners and machinations feels like it has more in common with Oscar Wilde than with the author of Sense and Sensibility. “Facts are horrid things,” Lady Susan (played by the wonderful Kate Beckinsale) shrugs in one scene (a line taken directly from the book), dismissing culpability for her latest scandalous actions, which included sleeping with another woman’s husband. Hearing Lady Susan spin her unmasking as a duplicitous scoundrel into everyone’s shame but her own is a masterful and, in context, hysterical tour de force. At least she didn’t read other people’s correspondence. Mere “horrid” facts will not deter her.
By comparison, consider the man who would become our second president. When making his defense of British soldiers during the Boston Massacre trials in December of 1770, John Adams (played by the excellent Paul Giamatti in the clip below) offered a now famous insight: “Facts are stubborn things; and whatever may be our wishes, our inclinations, or the dictates of our passion, they cannot alter the state of facts and evidence.” Legal Papers of John Adams, 3:269. In a similar vein, Sen. Daniel Patrick Moynihan once said that “[e]veryone is entitled to his own opinion, but not to his own facts.” Adams’ speech is recreated below in the fine John Adams miniseries from HBO based on David McCullough’s Pulitzer Prize-winning biography.
Facts, then, can be either “horrid” or “stubborn.” But which? And what of the difference?
David Wootton’s brilliant book, The Invention of Science, makes a compelling case that modernity began with the scientific revolution in Europe, book-ended by Danish astronomer Tycho Brahe’s identification of a new star in the heavens in 1572, which proved that the heavens weren’t fixed, and the publication of Isaac Newton’s Opticks in 1704, which drew conclusions based upon experimentation. In Wootton’s view, this was “the most important transformation in human history” since the Neolithic era and was in no small measure predicated upon a scientific mindset, which includes the unprejudiced observation of nature, careful data collection, and rigorous experimentation. In his view, the “scientific way of thinking has become so much part of our culture that it has now become difficult to think our way back into a world where people did not speak of facts, hypotheses and theories, where knowledge was not grounded in evidence, where nature did not have laws.”
The scientific approach was truly a new way of thinking (despite historical antecedents). Wootton shows that when Christopher Columbus came to the New World in 1492, he didn’t have a word to describe what he had done (or at least appeared to have done, with apologies to the Vikings). It was the Portuguese, the first global imperial power, who introduced the term “discovery” in the early 16th Century. There were other new words and concepts that were also important when trying to understand the scientific revolution, such as “fact” (only widely used after 1663), “evidence” (incorporated into science from the legal system) and “experiment.”
As Wootton explains, knowledge, as it was espoused in medieval universities and monasteries, was dominated by the ancients, the likes of Ptolemy, Galen, and Aristotle. Accordingly, it was widely believed that all of the most important knowledge was already known. Thus learning was predominantly a backward-facing pursuit, about returning to ancient first principles, not pushing into the unknown. Indeed, Wootton details the emergence of fact and evidence as previously unknown terms of art. What makes science modern, he argues, is the “formation of a critical community capable of assessing discoveries and replicating results.”
In its broadest context, science is the careful, systematic and logical search for knowledge, obtained by examination of the best available evidence and always subject to correction and improvement upon the discovery of better or additional evidence. That is the essence of what has come to be known as the scientific method, which is the process by which we, collectively and over time, endeavor to construct an accurate (that is, reliable, consistent and non-arbitrary) representation of the world. Otherwise (per James Randi), we’re doing magic, and magic simply doesn’t work.
Aristotle, brilliant and important as he was, posited, for example, that heavy objects fall faster than lighter objects and that males and females have different numbers of teeth, based upon some careful – though flawed – reasoning. But it never seemed to have occurred to him that he ought to check. It took Galileo, doing the checking centuries later, to refute Aristotle’s errors thoroughly. Checking and then re-checking our ideas or work offers evidence that may tend to confirm or disprove them. By collecting “a long-term data set,” per field biologist George Schaller, “you find out what actually happens.” Testing can also be reproduced by any skeptic, which means that you needn’t simply trust the proponent of any idea. You don’t need to take anyone’s word for things — you can check it out for yourself. That is the essence of the scientific endeavor.
To be sure, there remain historians of science (if not so many actual scientists) who argue in favor of a sort of relativism with respect to science. For example, in Leviathan and the Air-Pump, authors Shapin and Schaffer assert that the success of experimental science depends upon its proponents’ “political success…. He who has the most, and the most powerful, allies wins” (evidence be damned). As Wootton suggests in an exchange with Philip Ball, although Daryn Lehoux’s What Did the Romans Know? maintains that there is no crucial difference between the Romans’ view that garlic disempowers magnets and current scientific understanding, a piece of garlic held next to a magnet firmly “gripping” a paper clip demonstrates otherwise. Such a demonstration is a “killer fact,” a decisive bit of data (a “stubborn” fact) that, in Wootton’s words, “absolutely forces people to change their minds.”
Ponder a bit that, before the scientific revolution, the concept of truth came from the “top down” and was based upon authority, such as that of Aristotle or the Church. The great innovation of science was that it built its concept of truth from the “bottom up,” with its principles discovered rather than pronounced.
That innovation is inherently limiting, however. We want deductive proof in the manner of Aristotle, but have to settle for induction. That’s because science can never fully prove anything. It analyzes the available data and, when the force of the data is strong enough, it makes tentative conclusions. But these conclusions are always subject to modification or even outright rejection based upon further evidence gathering. The great value of facts and data is not so much that they point toward the correct conclusion (even though they do), but that they allow us to show that some things are conclusively wrong.
Scientific progress comes not via verification (which can only be inferred) but by falsification (which, if established and itself verified, provides relative certainty only as to what is not true). Thank you, Karl Popper. Accordingly, in the investment world, as in science generally, we need to build our investment processes from the ground up, with hypotheses offered only after a careful analysis of all relevant facts and tentatively held only to the extent the facts and data allow. Crucially, we need always to be on the look-out for disconfirming evidence, even though doing so is counterintuitive pretty much all the time.
Frank Rich is emblematic of the pre-scientific past, when the concept of truth founded upon fact hadn’t yet been invented, or of the postmoderns, for whom the concept of truth is troublesome at best, an oxymoron at worst. For him, facts are “horrid” – obstacles to be overcome or ignored when spinning a favored narrative. Instead, we should aspire to be more like Adams, for whom facts were “stubborn” – realities that need accounting for if we are to obtain the best available approximation of the truth.
But that isn’t our usual modus operandi. Because stories are so powerful, we want the facts to be neatly packaged into a compelling narrative. Take a look at John Boswell’s delightful send-up of this technique in the TED context below.
We crave “wonder, insight [and] ideas.” Facts? Not so much. As Evgeny Morozov puts it:
“Today TED is an insatiable kingpin of international meme laundering — a place where ideas, regardless of their quality, go to seek celebrity, to live in the form of videos, tweets, and now e-books. In the world of TED — or, to use their argot, in the TED ‘ecosystem’ — books become talks, talks become memes, memes become projects, projects become talks, talks become books — and so it goes ad infinitum in the sizzling Stakhanovite cycle of memetics, until any shade of depth or nuance disappears into the virtual void. Richard Dawkins, the father of memetics, should be very proud. Perhaps he can explain how “ideas worth spreading” become “ideas no footnotes can support.”
Felix Salmon’s excellent discussion of this argument in the context of Jonah Lehrer’s sad case (interestingly put into context here), which decries the use of “remixed facts in service of narrative,” establishes clearly (if unsurprisingly) that the facts are frequently too stubborn to fit neatly into a narrative-driven format, whether TED talk, blog post or bestseller. According to Seth Mnookin and reiterated by Salmon, “Lehrer had ‘the arrogance to believe that he has the right to rejigger reality to make things a little punchier, or a little neater.’” Felix perhaps goes beyond Morozov to argue “that TED-think isn’t merely vapid, it’s downright dangerous in the way that it devalues intellectual rigor at the expense of tricksy emotional and narrative devices.”
To be clear, I am entirely and wholeheartedly in favor of using narrative to illustrate concepts. I am also in favor of making difficult concepts, and science in particular, more accessible. Moreover, there are many TED-talks I find inspiring, illuminating and useful, including some mocked in the video above. However, the issue and the danger is in the “horrid” facts, forcing “stubborn” facts, what Morozov calls “messy reality,” into a glib narrative in ways that simply do not fit.
We are so susceptible to this problem (not to mention our overarching bias blindness generally) that we fall prey to it often and often don’t recognize it. Indeed, Snopes would not exist without our propensity for not letting facts get in the way of a good story (in a bit of delicious irony, this idea is often falsely attributed to Mark Twain). Augustine had it right more than a thousand years before Descartes: Fallor ergo sum (“I err therefore I am”).
Artists looking to recreate and illuminate history – from Lin-Manuel Miranda’s astonishing retelling of America’s founding in Hamilton to Shakespeare’s brilliant recreation, in Henry IV, Part 1, of the rise of Prince Hal, who would go on to become King Henry V, the “good Christian King” most famous (thanks in no small part to Shakespeare) for his improbable victory over France at the Battle of Agincourt – have a particular burden here. The artistic impulse, per Mark Twain, is never to let the facts get in the way of a good story.
Ironically, Hamilton is consistently aware of this dilemma in that the characters (George Washington especially) are always aware that who tells the story is crucial to how one is perceived and remembered. As Hamilton approaches death, he contemplates the problem.
“Legacy. What is a legacy?
It’s planting seeds in a garden you never get to see
I wrote some notes at the beginning of a song someone will sing for me
America, you great unfinished symphony, you sent for me”
The artistic goal should be to avoid inaccuracy and to illuminate existential truth, which may require taking certain liberties in telling the story. But that always risks telling a different (if perhaps better) story than the facts allow or even a wholly different story.
We like to think that we are like judges, that we carefully gather and evaluate facts and data before coming to an objective and well-founded conclusion. Instead, we cut straight to the chase. We are much more like lawyers (and Frank Rich), grasping for any scrap of purported evidence we can exploit to support our preconceived notions and allegiances. Doing so is a common cognitive shortcut such that “truthiness” – “truth that comes from the gut” per Stephen Colbert – seems more useful than actual, verifiable fact. What really matters is that which “seems like truth – the truth we want to exist.” That’s because, as Colbert puts it, “the facts can change, but my opinion will never change, no matter what the facts are.” We shouldn’t be surprised, then, that when their favored narratives were falsified by the actual people in the photograph, Rich and Hoepker offered responses, such as they were, that provided little insight but heaping piles of vigorous self-justification. Need I add that they didn’t change their minds or their stories either? That they didn’t is further evidence (as if more were necessary) of what Yale’s Dan Kahan calls “identity-protective cognition.”
The “truthiness” concept even “became a lexical prize jewel” for Rich himself (see here, for example), allowing him (of course) to criticize his political opponents for offering only “a thick fog of truthiness” such that they presented “a bogus alternative reality so relentless it can overwhelm any haphazard journalistic stabs at puncturing it.” Rich expounded on the idea a number of times in print and on The Oprah Winfrey Show. He even wrote a book about it. Of course, he always directs the analysis outward rather than inward.
Rich, isn’t it?
“Truthiness” captures how, as cognitive psychologist Eryn Newman explains, “smart, sophisticated people” can go astray on matters of fact. Newman’s research has shown that the less effort it takes to process a factual claim, the more accurate it seems. In one classic study, for example, people were more likely to think a statement was true when it was written in high color contrast as opposed to low contrast. Easy-to-pronounce ticker symbols (such as KAR) perform better in the markets than their difficult-to-pronounce counterparts (such as RDO), even after just one day of trading. And, astonishingly, claims attributed to people with easy-to-pronounce names were deemed more credible than those attributed to people with difficult-to-pronounce names. As summarized by Slate: “When we fluidly and frictionlessly absorb a piece of information, one that perhaps snaps neatly onto our existing belief structures, we are filled with a sense of comfort, familiarity, and trust. The information strikes us as credible, and we are more likely to affirm it — whether or not we should.”
Due to our affinity for like-minded people, we seek out people like us to provide echo chambers for our own claims, claims that perpetuate themselves every time we hear them reverberated back to us. Consider what your Facebook and Twitter feeds look like, especially during election season, for example. We are neuro-chemically confirmation bias addicts. As such, we tend to reach our conclusions first. Only thereafter do we gather purported facts and, even then, see those facts in such a way as to support our pre-conceived conclusions. When it fits with our desired narrative, so much the better.
Writing op-eds for The New York Times provided Rich with a heady and exclusive echo chamber, but an echo chamber nonetheless. Keeping one’s analysis and interpretation of the facts of a story reasonably objective — since analysis and interpretation are required for data to be actionable — is really, really hard in the best of circumstances, even when one has gotten the facts right or close thereto.
Megan McArdle sums things up nicely.
“We like studies and facts that confirm what we already believe, especially when what we believe is that we are nicer, smarter and more rational than other people. We especially like to hear that when we are engaged in some sort of bruising contest with those wicked troglodytes — say, for political and cultural control of the country we both inhabit. When we are presented with what seems to be evidence for these propositions, we don’t tend to investigate it too closely. The temptation is common to all political persuasions, and it requires a constant mustering of will to resist it.”
Frank Rich, I’m looking at you. But I’m trying to look at myself too.
Once we have bought in to a particular narrative, it becomes increasingly difficult to falsify, even (especially!) when we are presented with contradicting facts. Take the example of parents who choose not to vaccinate their children and the pediatricians who try to convince them otherwise. When pediatricians presented unequivocal evidence that autism and vaccination are not linked, the strategy backfired: the parents became even more set in their views. In other words, the disconfirming facts essentially (and in effect) turned up the volume inside the echo chamber such that the truth could not be heard.
The more we repeat and reiterate our explanatory narratives, the harder it is to look for “killer” facts that might question or even overturn our preconceived notions and allegiances and to recognize evidence that ought to cause us to re-evaluate our prior conclusions. The facts become less and less “stubborn,” more and more “horrid.” By making it a careful habit skeptically to re-think our prior interpretations and conclusions, we can at least give ourselves a fighting chance to correct the mistakes that we will inevitably make. As with everything in science, each conclusion we draw must be tentative and subject to revision when the facts so demand. Still, and as ever, for too many people too much of the time, facts and evidence simply do not matter. As John Maynard Keynes is famously said to have said, but probably didn’t, “When the facts change [when they are ‘stubborn’], I change my mind. What do you do, sir?”
Indeed, what do you do?
Note: This post is a re-worked version of prior posts, here, for example.