On November 24, 1951, Princeton defeated Dartmouth, 13-0, to win its 22nd straight football game and complete a second consecutive undefeated season for what the great writer John McPhee described as “Phi Beta Football.” In those days, Princeton still used an already old-fashioned, direct-snap, pure power offense called the single wing, even though most college teams were “mating the quarterback to the center of the line in the formation called ‘T.’” It was also the final game for Princeton tailback and legend Dick Kazmaier, the “Maumee Menace,” a future College Football Hall of Fame inductee and McPhee’s roommate. “Kaz” had been pictured on the cover of Time magazine that week and would soon win the Heisman Trophy (the last Ivy League player to do so) in a landslide. But the game that day is not primarily remembered for having capped off an outstanding season and a brilliant career.
Instead, the legacy of that brisk late autumn afternoon contest rests upon two seemingly unrelated matters: allegations of intentionally dirty play by Dartmouth and our inability to perceive reality with any degree of objective accuracy, especially where we have a major emotional investment. Based upon various sources, the primary narrative from the game is that Dartmouth set out to injure Princeton players – particularly Kazmaier – and that after the Princeton star was injured and forced from the game in the second quarter, matters turned increasingly fractious. But that wasn’t the only proffered narrative.
As I have often argued, we like to think that we see the world as it truly is. Instead, we tend to see the world as we really are. Sadly, that reality is both literal and figurative.
As Nature has reported, if you take a look at the gif below, you’ll discover something genuinely strange about yourself and how your brain works.
[Animation: motion-induced blindness illusion. Source: Prof. Michael Bach, University of Freiburg]
Look at any one of the yellow dots as the figure moves; it remains present and stationary. If you concentrate on all three yellow dots, they remain in place too. But if you concentrate on the central green dot instead, one or more of the yellow dots will seem to disappear and then reappear intermittently, even though they are really there the whole time. Your brain simply doesn’t register their presence sometimes. This optical illusion, called motion-induced blindness, affects nearly everyone.
Over a career that has so far spanned four decades, Phil Smith has routinely drawn concertgoers willing to pay a lot of money to hear him play the trumpet. The long-time principal trumpet of the New York Philharmonic retired this summer after 36 years in the orchestra. In his first professional audition, while still a student, he won a place in the Chicago Symphony. While still in his 20s, he came to New York following just his second professional audition. According to The New Yorker, “For the past thirty-six years, Smith has presided over orchestral trumpet playing, with a resonant, clarion sound and a reputation for never missing a note.” He has been, inarguably, one of the world’s great performers.
Noah Smith (@Noahpinion on Twitter) made an interesting assertion yesterday about the purpose of argument. Smith began by noting Boston University economist Laurence Kotlikoff’s op-ed in Forbes, in which Kotlikoff acts as a concern troll toward New York Times columnist (and noted economist himself) Paul Krugman because Krugman allegedly called Congressman Paul Ryan stupid. To be clear, Krugman’s primary point was not that Ryan is stupid, but that he is crooked, especially as it pertains to his budget proposals. Smith uses this context to look at arguments in general, and he makes an excellent point.
[A]s a society, we use arguments the wrong way. We tend to treat arguments like debate competitions — two people argue in front of a crowd, and whoever wins gets the love and adoration of the crowd, and whoever loses goes home defeated and shamed. I guess that’s better than seeing arguments as threats of physical violence, but I still prefer the idea of arguing as a way to learn, to bounce ideas off of other people. Proving you’re smart is a pointless endeavor (unless you’re looking for a job), and is an example of what Stanford University psychologist Carol Dweck calls a “fixed mindset.” As the band Sparks once sang, “Everybody’s stupid — that’s for sure” [even though nobody wants to be called stupid]. What matters is going in the right direction — becoming less stupid, little by little.
But I think Smith’s ideal isn’t all that practical. To begin with, as Megan McArdle emphasizes, by calling one who disagrees with you stupid (even implicitly) “you have guaranteed that no one who disagrees with you will hear a word that you are saying.” Thus “calling people stupid is simply a performance for the fellow travelers in your audience” as well as a means of asserting superiority.
My sense is that the key element to this discussion is that most partisans see “their side” as not just true, but obviously true. It’s a by-product of bias blindness, or selective perception. We tend to see bias in others but not in ourselves. Therefore, our strongly held positions aren’t really debatable — they’re objectively and obviously true. After all, if we didn’t think our positions were true, we wouldn’t hold them. And (our thinking goes) since they are objectively true, anyone who makes the effort to try should be able to ascertain that truth. Our opponents are thus without excuse.
If you were wrong about something important, how quickly would you want to know and how quickly would you want to do something about it? Unfortunately, the answer isn’t nearly as obvious as we’d like to think.
Mark Twain is said to have sagely noted that a lie can travel halfway around the world while the truth is putting on its shoes. That’s because the truth is so much messier. Lies are created to be believable. They cater to our prejudices, whims, desires and hopes even when the truth cannot. Lies offer a good story when the truth does not. They are plausible when the truth is not. We often resist and even deny the truth. It is inherently unwieldy. It requires a careful sifting and analysis of facts in order to be discerned — we want deduction but are limited to induction most of the time. The truth is simply very hard to handle.
Of course, if we’re talking about relatively trivial matters (perhaps the distance from the earth to the moon) or about something we’re predisposed to believe anyway, we adjust our beliefs quite readily. But when truth doesn’t fit with what is important to us — when it matters — our perception of it gets caught up in who we perceive ourselves to be and in our vested interests. In those instances, attacking false information with data and related evidence often backfires, having the opposite of the desired effect. We like to think that straightforward education overcomes falsehoods, but things aren’t nearly that simple. This horrifying phenomenon — the backfire effect — was demonstrated once again recently in a study of the responses of parents to various forms of reporting that vaccines are not dangerous.
Nearly every high school choral organization routinely performs anthems based upon some version of a familiar trope. The piece is designed to be musically trendy, even though it is invariably more than a bit behind the times (when I was in school, each had an obligatory “hard rock” section). Meanwhile, the lyrics are an earnest and perhaps cloying ode to the ability of the young to create a better and brighter tomorrow. One such title from my school days was in fact “Hope for the Future.”
Unfortunately, the promise always seems better than the execution.
Despite the enormous (and most often negative) impact that our behavioral and cognitive biases have on our thinking and decision-making, the prevailing view is that we can’t do very much about them. In his famous 1974 Caltech commencement address, the great physicist Richard Feynman emphasized the importance of getting the real scoop about things, but lamented how hard it can be to accomplish: “The first principle is that you must not fool yourself – and you are the easiest person to fool.” Even Daniel Kahneman, Nobel laureate, the world’s leading authority on this subject and probably the world’s greatest psychologist, has concluded that we can’t do much to help ourselves in this regard.
But today — maybe — there might just be a tiny glimmer of hope (for the future).
My December Research magazine column is now available online. Here’s a taste.
Innovation in financial planning typically starts with an idea. If enough people (or the right people) think it might be a good idea, it then moves to evidence-gathering for confirmation. But the entire endeavor—designed to try to confirm if the idea makes sense—is inherently prone to confirmation bias. We should be systematically and consistently looking to disprove the idea. Without a devil’s advocate with the specific mission to try to show why the idea is a bad one, without recrimination or criticism for doing so, many bad ideas will seem to be confirmed.
I hope you will read the whole thing.
Who’s the Easiest Person to Fool?
All nine of this year’s American Nobel Prize laureates shared a stage this week. Not surprisingly, economists Robert Shiller and Eugene Fama sniped at each other, if good-naturedly. But it was the winners from the harder sciences who got the best digs in at the economists. Martin Karplus, a chemist from Harvard, queried, “What understanding of the stock market do you really have?” He even went so far as to note that economics – “if one wants to call it a science” – seemed completely unable to explain market movement. There were others, but you get the point.
The obvious takeaway is a common theme — that economics is conflicted, flawed, divided, wildly inconsistent, worthless — a point that has been made by Nassim Taleb, Jeremy Grantham and many others. The idea is that economics desperately wants to be a science, with “physics envy,” but simply isn’t up to the task. And it’s hard to contest the point.
My disagreement is not so much with the main point (even though I think academic economics has a good deal to offer, with Shiller’s work being Exhibit A in my evidence cache), but with its corollary and the casual certainty with which that corollary is expressed. “Hard” scientists increasingly express a belief that their domain is the only bastion of reality available. They see a great divide between things we can know (science) and everything else, which is opinion at best and largely worthless. That the expression of it is so often humorless and irony-challenged makes it all the worse.
Grantland has an excellent long-form piece up about the lingering racism of Valdosta, Georgia as it relates to the mysterious death of a young athlete there. It is worth reading for multiple reasons. For our purposes, I was particularly struck by the following two paragraphs.
As regular readers are all too well aware, I am committed to data-driven analysis and investing. We’re suckers for stories, of course, and are ideological through-and-through, but the goal is to make sure that our investment decisions are based on real, quantitative evidence (at least to the extent possible).
That’s easier said than done, of course. We are prone to all sorts of cognitive and behavioral biases — perhaps most prominently confirmation bias — all of which threaten our analysis. We are also highly susceptible to bias blindness, the inability to see our own biases even when others’ are crystal clear. And now comes further evidence that our reasoning abilities are even worse than we thought.