We all suffer from the ongoing delusion that we’re the center of the universe. We focus first and foremost on ourselves and tend to see everything and everyone else only as they relate to us, both literally and figuratively.
The great writer David Foster Wallace spoke eloquently, even movingly, about this egocentric delusion in a fantastic commencement address delivered at Kenyon College in 2005, one that just might loosen the delusion’s hold on those of us able to hear what he had to say. The speech is delightfully summarized and excerpted in the short film from The Glossary embedded below. Please take the time to watch it in full. Please. The voice you hear is Wallace’s.
According to Wallace, if we don’t make a conscious decision about how to think and what to pay attention to, we’re going to be miserable a lot of the time. That’s because our natural default setting is to be self-centered and to see everybody else as being in our way. And it isn’t even a choice (it’s a default setting).
I’ll even take Wallace’s point a step further. If you are automatically sure you know what reality looks like and what is truly important, you will miss out on outrageous opportunities to learn, grow and be a blessing to others. If we have learned how to think and to pay attention, we know that we have other options. It’s a way to redeem our time. That’s the freedom offered by real education – the option to choose how we’re going to see life, with the added bonus that we just might see things a bit more clearly.
Much religious thought is authoritarian – telling us what to believe and how to act. The choosing begins and ends with the chosen observance. Wallace refutes that. Modern scientific thought gets to the same end-point via other means. It claims that we have no opportunity for choosing because of the brute determinism by which cause and effect relentlessly and remorselessly govern not only our lives, but the entire universe. Wallace refutes that too. If and when we choose to live differently, to overcome the defaults which poison our thinking, we can do better and make things better.
For Wallace, “The only thing that’s capital-T True is that you get to decide how you’re going to try to see it. You get to consciously decide what has meaning and what doesn’t. You get to decide what to worship.…” Scientism doesn’t even allow us the choosing. Make no mistake. Here, as elsewhere, Bob Dylan was right (just like David Foster Wallace). We all Gotta Serve Somebody. In Wallace’s words, “There is no such thing as not worshipping. Everybody worships. The only choice we get is what we worship.”
It is delightfully counterintuitive to think that mere choosing is such a subversive and powerful act. Yet like so much of life, all of this is obvious in retrospect but painfully difficult to do even once in a great while. Moreover, it isn’t just our default settings that conspire against us.
Amazon has long been the best place to find the books you’re looking for. It’s simple and easy to use. It keeps improving, too: when you look at or buy a particular book, you get instant recommendations of similar ones – people who bought x also bought y.
In our increasingly data-soaked and algorithm-dominated world, our tastes and preferences are increasingly known, catered to and carefully influenced. It’s like the vendor who knows what you like and keeps selling it to you over and over again with only enough variation to justify the label of “new.”
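Mechanically, “people who bought x also bought y” is, at its simplest, co-purchase counting: for a given book, surface whatever else appears most often in the same shopping baskets. Here is a minimal sketch of that idea – the orders and titles below are invented for illustration, and Amazon’s actual system is vastly more sophisticated:

```python
# A toy "people who bought x also bought y" recommender built on
# co-purchase counts. Orders and titles are invented; real systems
# layer on ratings, browsing history and machine-learned similarity.
from collections import Counter

orders = [
    {"Annals of the Former World", "Coming into the Country"},
    {"Annals of the Former World", "The Pine Barrens"},
    {"Coming into the Country", "The Pine Barrens"},
]

def also_bought(book, orders):
    """Rank other books by how often they share an order with `book`."""
    counts = Counter()
    for order in orders:
        if book in order:
            counts.update(order - {book})   # count each co-purchased title
    return [title for title, _ in counts.most_common()]

print(also_bought("Annals of the Former World", orders))
# -> ['Coming into the Country', 'The Pine Barrens'] (each co-bought once)
```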
The best brick-and-mortar bookstores challenge our default settings and easy assumptions. Instead of recommending Annie Dillard because we’re fond of John McPhee, pointing out John le Carré if we like Ruth Rendell, or suggesting E.J. Dionne because we bought Frank Rich – worthwhile and justifiable as those foisted choices might be – the best bookstores (and those who run them) point us to new authors with new ideas and approaches. One crucial delight of a good bookstore is being able to find what we’re not looking for…and having it turn out great. But of course we need to try these new books and try on these new ideas.
In 2006, Publishers Weekly called The Spy Who Came in from the Cold “the best spy novel of all time,” 43 years after its publication. As David Denby pointed out in The New Yorker, John le Carré’s international Cold War best-seller “is fiendishly clever, as Arthur Conan Doyle might have said, and morally alert in a way that puts it way above the usual run of espionage fiction.” The author made the startling (for its time) decision to portray the intelligence methods of both Western and Communist countries as vile and morally senseless. Significantly, the plot depends on a series of reversals—as you read, you have to keep revising your understanding of what’s going on. That’s much of the fun of such books, of course, but much of the difficulty of real life. We need to keep revising our understanding of what’s going on but…(you know the drill).
So while Amazon is unparalleled at allowing us to find what we’re looking for, if we are going to do better and be better, we’re going to need to find what we’re not looking for. But that’s fiendishly difficult. As I have argued repeatedly, while information is cheap and getting cheaper, meaning is increasingly expensive. We are beset by confirmation bias, our tendency to look for and accept evidence that supports what we already think we know and to ignore the rest. Per motivated reasoning, we tend to reject new evidence when it contradicts our established beliefs. Sadly (and counterintuitively), the research is clear that the smarter we are, the more likely we are to deny or even oppose data that seem to conflict with ideas we deem important.
Wallace cautions against a kind of selfish, arrogant intellectual dogmatism that takes the form of “blind certainty, a close-mindedness that amounts to an imprisonment so total that the prisoner doesn’t even know he’s locked up.” But he emphasizes that this endeavor “is not a matter of virtue—it’s a matter of my choosing to do the work of somehow altering or getting free of my natural hard-wired default setting, which is to be deeply and literally self-centered, and to see and interpret everything through this lens of self.” It takes constant effort to reframe, contextualize and tolerate reality in order to get from information to meaning.
Ironically, all of this is directly at odds with a pervasive university culture that seeks to impose a groupthink against anything different, difficult or uncomfortable, what Jonathan Chait describes as “jeering student mobs expressing incredulity at the idea of political democracy.” These “protests” confirm that an alarming number of students and their professors are unwilling to take up Wallace’s challenge by questioning their assumptions, exercising self-awareness and judgment and rejecting default intolerance and arrogant certainty, even though universities ought to be the best of places for doing so.
The list of things – really important and seemingly well-established and researched things – that the collective “we” have been wrong about despite being sure is a long one. We thought that ulcers were caused by stress until Barry Marshall and Robin Warren showed that the bacterium H. pylori is the actual cause (and won a Nobel Prize for doing so). We were sure of the existence of ether throughout the universe, the medium through which light was thought to travel. But the celebrated Michelson-Morley experiment provided hard evidence that ether did not exist. Pretty much everyone (wrongly) believes that Walter White was good at chemistry or that using a cell phone while pumping gas can cause an explosion.
We even have myths about myths. The belief that people once thought the world was flat is itself a modern myth about prior scientific belief. Those who bothered to investigate have known since at least the time of Eratosthenes that the Earth is spherical – and even roughly how big it is.
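Eratosthenes’s method is simple enough to check on the back of an envelope: at noon on the summer solstice the sun was directly overhead at Syene but cast shadows at an angle of about 7.2 degrees at Alexandria, roughly 5,000 stadia to the north. Since 7.2 degrees is 1/50 of a full circle, the Earth’s circumference must be about 50 times that distance:

```python
# Eratosthenes's back-of-the-envelope measurement, using the commonly
# cited ancient figures (the modern length of a stadion is uncertain).
shadow_angle = 7.2    # degrees: sun's angle at Alexandria at solstice noon
distance = 5_000      # stadia between Alexandria and Syene
circumference = (360 / shadow_angle) * distance
print(circumference)  # -> 250000.0 stadia; accuracy depends on the stadion's true length
```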
Because we’re wrong so often, diversity of thinking is imperative if we’re going to interpret the available information meaningfully. As Harvard’s Cass Sunstein recently reported via Bloomberg View, a new study focusing on Facebook users provides strong evidence that the reason falsehoods, misstatements and ridiculous conspiracy theories thrive on social media is confirmation bias (again, it’s our tendency to seek out information that confirms our beliefs and to ignore contrary information). Thus Facebook users tend to choose and share stories containing messages they accept and to neglect those with views and conclusions they reject.
Accordingly, if a story fits with what people already believe, they are far more likely to be interested in it and thus to spread it. In other words, we tend to live in echo chambers (per the study, “homogeneous, polarized clusters”) – communities of like-minded people where our own views are reinforced and opposing views, to the extent they are considered at all, are some combination of ignored, denigrated and shouted down. Per the study, “users mostly tend to select and share content according to a specific narrative and to ignore the rest,” with the consequence being the “proliferation of biased narratives fomented by unsubstantiated rumors, mistrust, and paranoia.”
Even worse, as Sunstein points out, is group polarization. When like-minded people communicate, they tend to end up thinking a more extreme version of what they already believed, irrespective of what the data show. Thus whenever people spread misinformation within these homogeneous clusters, they also intensify one another’s commitment to the misinformation. Seeing that other people share your views (no matter how wacky) intensifies your commitment to them and tends to add to the disdain in which you hold those who think otherwise.
We all suffer from these cognitive and behavioral biases that poison any hope for objective analysis, especially about those things that are closest to us and in which we have the most invested. If we are at all aware, we will frequently recognize it in others – especially the most egregious examples. But we will almost never recognize it in ourselves. That’s because everybody else is expressing opinions while we are stating facts, or so it seems. That reality – that failing – is bias blindness. It’s our inability or unwillingness, even if and when we see it in others, to see the biases that beset us. Bias is everywhere. So is bias blindness, no matter how willing – and even eager – we are to deny it. On our best days we might grudgingly concede that we hold views that are wrong. The problem is in providing current examples.
Individuals thus need to broaden their perspectives, which, per Wallace, takes constant effort. As Philip Tetlock outlines in his wonderful new book, Superforecasting: The Art and Science of Prediction, good forecasting, like good decision-making generally, requires rigorous empiricism, probabilistic thinking, a recognition that absolute answers are extremely rare, regular reassessment, accountability, and an avoidance of too much precision. More fundamentally, we need more humility about our information sources and more diversity both in those sources and among those contributing to our decisions. We need to be concerned more with process and with improving our processes than with outcomes, important though they are. “What you think is much less important than how you think,” says Tetlock. Thus we’re best off if we regard our views “as hypotheses to be tested, not treasures to be guarded.” As he told my friend Jason Zweig of The Wall Street Journal, most people “are too quick to make up their minds and too slow to change them.”
Most importantly, perhaps, Tetlock encourages us to hunt and to keep hunting for evidence and reasons that might contradict our views and to change our minds as often and as readily as the evidence suggests. One “superforecaster” went so far as to write a software program that sorted his sources of news and opinion by ideology, topic and geographic origin, then told him what to read next in order to get the most diverse points of view. If you regularly watch one cable channel for news and information, you might want to change the channel at least once in a while.*
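Tetlock doesn’t publish that program, but the underlying idea is easy to sketch: tag each source by ideology, topic and geographic origin, then always reach for the unread source that differs most from what you have already consumed. A minimal, hypothetical version (the outlets and tags are invented) might look like this:

```python
# Hypothetical sketch of a "read the most different thing next" queue,
# in the spirit of the superforecaster's program Tetlock describes.
# All outlet names and tags are invented for illustration.
from dataclasses import dataclass

@dataclass(frozen=True)
class Source:
    name: str
    ideology: str   # e.g., "left", "right", "center"
    topic: str      # e.g., "politics", "economics"
    region: str     # e.g., "US", "Europe"

def distance(a, b):
    """Count how many of the three tags differ between two sources."""
    return sum(x != y for x, y in [(a.ideology, b.ideology),
                                   (a.topic, b.topic),
                                   (a.region, b.region)])

def next_most_diverse(read, unread):
    """Pick the unread source farthest, in total, from everything read."""
    return max(unread, key=lambda s: sum(distance(s, r) for r in read))

# Toy usage: after two like-minded reads, the picker reaches for contrast.
read = [Source("Outlet A", "left", "politics", "US"),
        Source("Outlet B", "left", "politics", "US")]
unread = [Source("Outlet C", "left", "economics", "US"),
          Source("Outlet D", "right", "politics", "Europe")]
print(next_most_diverse(read, unread).name)  # -> Outlet D
```

The design choice that matters is the objective: the queue rewards distance from your recent reading rather than similarity to it – precisely the opposite of the “people who bought x also bought y” default.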
The best decision-makers are all curious, humble and self-critical; they give weight to multiple perspectives and feel free to change their minds often. In other words, they are not, to use Isaiah Berlin’s iconic description (harkening back to Archilochus), “hedgehogs,” who explain the world in terms of one big unified theory, but rather “foxes,” which, Tetlock explains, “are skeptical of grand schemes” and “diffident about their own forecasting prowess.”
But as Tim Richards explains (David Foster Wallace too), we are inclined, both by design and by culture, to be anything but humble in our approach to investing (and everything else). We invest in the certainty that we’ve picked winners and sell in the certainty that we can re-invest our capital to make more money elsewhere. But we are usually wrong, often spectacularly wrong. These tendencies come from hard-wired biases and from emotional responses to our circumstances. But they also arise out of cultural requirements that demand we show ourselves to be confident and decisive at all times. Though we should reward those who show caution and humility in the face of uncertainty, we rarely do, as our political process so depressingly demonstrates.
David Foster Wallace opened his Kenyon commencement address with a parable of sorts.
There are these two young fish swimming along and they happen to meet an older fish swimming the other way, who nods at them and says “Morning, boys. How’s the water?” And the two young fish swim on for a bit, and then eventually one of them looks over at the other and goes, “What the hell is water?”
This “water” is the reality we are immersed in yet fail to see, thanks to the insidious default settings that keep us from noticing what the world is really like.
Wallace harkens back to his parable in conclusion. “[T]he real value of a real education…has almost nothing to do with knowledge, and everything to do with simple awareness; awareness of what is so real and essential, so hidden in plain sight all around us, all the time, that we have to keep reminding ourselves over and over:
“‘This is water.’
“‘This is water.’”
We need to find what we’re not looking for. We need to start, per David Foster Wallace, by learning to think and to pay attention…by simple awareness. By trying to look.
__________
* Broader diversity is important too and for many of the same reasons. As James Surowiecki of The New Yorker points out in his examination of the tech industry’s astonishing male-centeredness, “Promoting diversity isn’t, as many techies think, pure do-gooderism. It’s genuinely good for business, since a large body of evidence suggests that making organizations more diverse can also make them perform better.” He notes that while tech companies may believe they are meritocracies, unconscious biases influence their hiring and promotion habits. “Subverting these biases requires more than training. Instead, companies should be looking for . . . ‘bias interrupters’: systems that identify bias and intervene to mitigate it.” Josh Bersin recently completed a two-year research study, High-Impact Talent Management, and the findings are compelling. Among the 120-plus different talent practices examined, those that predict the top performing companies all concentrate on what the researchers describe as an “inclusive talent system.”