It is a classic break-up excuse – “It’s not you, it’s me.” These five small words reek of phony compassion and are usually used to draw attention away from the real reason the relationship is being ended, often because the true reason is emotionally painful. The real message is typically much closer to “I don’t find you sufficiently attractive, but I can’t say that because then I’ll feel guilty. Oh, and by the way, I don’t really still want to be friends, either, so good riddance. I’m off to find someone as perfect as I am.” This excuse has become such a cliché that almost nobody buys it anymore. Why should they? We’ve all heard perpetrators of INYIM turn right around and tell anyone who will listen what the ex’s faults and failings were as soon as he or she is out of earshot.
But George has a point when he says, “Nobody tells me it’s them not me, if it’s anybody it’s me.” The sad truth is that a shocking amount of the time the crux of the problem – any problem – is us and not them.
Ben Roethlisberger is a great quarterback. He was selected 11th overall in the 2004 NFL Draft by the Pittsburgh Steelers and was immediately touted as the team’s franchise quarterback by then coach Bill Cowher. He became the youngest Super Bowl-winning quarterback in NFL history, leading the Steelers, in only his second professional season, to victory over the Seattle Seahawks in Super Bowl XL at the age of just 23. He later orchestrated one of the great late-game comebacks in Super Bowl history in Super Bowl XLIII.
In all, Big Ben has led the Steelers to 20 playoff games, three Super Bowls and two Super Bowl championships. He has been selected to five Pro Bowls, including the last three in a row. Roethlisberger has the fourth highest career winning percentage as a starter among quarterbacks with a minimum of 100 starts. He is top ten all-time in both passing touchdowns and passing yards.
In college, Roethlisberger was a three-year starter and star at QB for Miami of Ohio. In his three years there, he threw for 84 touchdowns and 10,829 yards, including a Mid-American Conference record 4,486 yards as a junior before declaring early for the NFL Draft. As a RedHawk, Roethlisberger broke 10 single-game, single-season and career records. He also tied the mark for most touchdown passes in a game with five, which he did twice. He closed out his college career in style, too, throwing for 376 yards and four touchdowns to lead his 14th-ranked team to its 13th straight win, a GMAC Bowl victory over Louisville.
Roethlisberger was a great quarterback with enormous potential at Findlay High in Ohio too. As a 6’5” high school senior, he threw for over 4,000 yards and an astonishing 54 touchdowns, including eight in one game.
In a road game at Napoleon that season, the home team scored to take the lead over Findlay and, when Ben took over with 24 seconds to go, many fans were heading for the parking lot. But Roethlisberger threw a 50-yard pass up the left sideline and then a 17-yarder into the right corner of the end zone for the win with 1.4 seconds left. He was a terrific high school basketball and baseball player too.
So why did Big Ben end up playing his college football at Miami of Ohio, a local mid-major, instead of a traditional football power? Two obvious potential explanations don’t hold up to any sort of examination. Ben was not a late bloomer, and his coach, now an Ohio state senator, was not an idiot. In fact, his coach had an excellent record and understood quarterbacking, having played QB in college at Kentucky.
“In high school, you don’t know who’s going to be a pro yet,” said Cliff Hite, Roethlisberger’s high school coach. “You still have kids that are 5’7” and weigh 145 pounds that will give you every ounce of energy and everything that you could possibly desire.
“In Ben’s case, you knew Ben was going to be special.”
But here’s the thing. Roethlisberger was barely recruited before his senior year of high school. And by that time, the big-time schools already have their QB recruits locked down.
So what happened? The explanation is simple. Big Ben was not recruited before his senior year because he did not play quarterback until then, even though he had been a prolific JV passer. For his junior season, he was moved to wide receiver and, not surprisingly, played it very well, catching 57 passes and making the All-District team.
That sounds unbelievable, that one of the great QBs in the world did not get on the field at that position until his final year of high school, until you realize that the Findlay High QB Roethlisberger’s junior year was the coach’s son, who is a year older. Once more for emphasis: Ben Roethlisberger, two-time Super Bowl winner and Pro Bowl quarterback of the Pittsburgh Steelers, did not play QB in high school until his senior year because the coach’s son – one year older – was the starter. Ironically, the coach’s son went on to have a fine college career at Division III Denison as (get this) a wide receiver.
Roethlisberger’s coach was no doubt certain that he was making an entirely objective evaluation and doing what was best for the team. “I was brutal to my son because I thought he had to prove he earned it,” Hite asserted. He still thinks he made the right call (of course). When we possess the added information of who the starting QB was in relation to the coach, the whole situation makes perfect sense. Indeed, in that context, the otherwise apparent absurdity surprises precisely none of us.
Over the past three decades or so, behavioral economics has done a terrific job of beginning to outline what our various behavioral and cognitive risks look like. We have a much better understanding of ourselves today, even if we do not necessarily like it much.
By this point, every investment professional and would-be professional has at least a passing knowledge of behavioral finance and its lead actor, confirmation bias, whereby (as it is usually posited) we see what we want to see, accept these desires as truth and act accordingly. Others acting to correct these misperceptions can actually reinforce them, via something called the “backfire effect.” Similarly, attempts to debunk false but widely believed myths can also reinforce them, because the debunking keeps repeating the untruth. It seems that people remember the false claim and forget that it’s a lie. If people feel attacked, especially about something they care deeply about, they tend to resist the facts all the more. Sadly, people will resist abandoning a false belief unless and until they have a compelling alternative explanation – in other words, a better story. We quite naturally and inherently prefer a false model of reality to an incomplete or uncertain but more accurate one.
Because of this confirmation bias, when we see information that confirms what we already think or believe, we ask ourselves if the information might be true. But if the information is disconfirming, we ask if it must be true, a wholly different standard. Due to our affinity for like-minded people, we seek out people like us to provide echo chambers for our own stories and claims, claims that perpetuate themselves every time we hear them reverberated back to us. We are neuro-chemically addicted to confirmation bias. As such, we tend to reach our conclusions first. Only thereafter do we gather purported facts and, even then, see those facts in such a way as to support our preconceived stories and conclusions.
In short, for all of us far too often, believing is seeing.
Confirmation bias comes in three primary flavors. Its standard expression, as noted above, is our tendency to notice and accept that which fits our preconceived notions and beliefs. Last autumn’s presidential campaign season provided daily examples. We routinely accept or explain away the foibles of the candidate we support while jumping all over those of the opposition. One person’s depravity and slander is another’s obvious fact. Each side thinks they have chosen the right hero in a fraught morality play with the highest of possible stakes.
However, confirmation bias also includes seeing what we expect to see (as when we try to proofread something multiple times and read right over an otherwise obvious error) and seeing that which is in our interest to see. This last expression is often called “motivated reasoning.” The shocking and famous Semmelweis Reflex is a reflection of this phenomenon, while Upton Sinclair offered perhaps its most popular expression: “It is difficult to get a man to understand something, when his salary depends upon his not understanding it!”
Confirmation bias is supplemented and made worse by the fact that most of us think we are significantly better than average at most things (also known as illusory superiority or the Lake Wobegon Effect), and because we’re also susceptible to the “endowment effect,” which describes the extra value we place on things as soon as they become ours. A new study has even found that simply asking participants to imagine that a theory is their own biases them to believe in its truth.
Traditional economic theory insists that we humans are rational actors making rational decisions amidst uncertainty in order to maximize our marginal utility. Sometimes we even try to believe it. But we aren’t nearly as rational as we tend to assume. A wealth of research has established clearly that homo economicus is a myth – a powerful and satisfying myth to be sure, but a myth nonetheless. Our reasoning is suffused with emotion, what researchers often call “affect,” to such an extent that it is virtually impossible to parse the difference. Indeed, these overriding feelings about people, things, and ideas arise much faster than our conscious thoughts, well before we are aware of them. Reasoning comes later and requires careful deliberation, but emotional overlay remains, if often implicit. And the more we care, the more emotion remains.
Believing is seeing.
We all are and remain biased, ideological and inherently tribal. It can be dangerously difficult to bear in mind that we are rarely as right and our motives as pure as we tend to assume. As I have written many times, we like to think that we are like judges, that we carefully gather and evaluate facts and data before coming to an objective and well-founded conclusion. Instead, we cut straight to the chase. We are much more like lawyers, grasping for any scrap of purported evidence we can exploit to support our preconceived notions and allegiances. Perhaps worse, the more educated and intelligent we are, the more likely we are to reject facts – because we are smart and sophisticated enough to come up with seemingly plausible counter-arguments.
For example, consider a clever 2012 study. Experimenters played a video of protesters being dispersed by police and asked viewers whether the protesters were peaceful or violent. When the experimenters described the protesters as anti-abortion activists, viewers with liberal leanings saw the protesters’ actions as violent, whereas the more conservative subjects saw them as peaceful. On the other hand, when the experimenters said the (very same) protesters (in the very same video) were gay rights proponents, the liberals saw a peaceful protest and the conservatives saw a violent one. Every perception we have, no matter how objective it seems to the witness, is inherently infused with personal commitments.
As Meir Statman puts it, “People in standard finance are rational. They are not confused by frames, they are not affected by cognitive errors, they do not know the pain of regret, and they have no lapses of self-control. People in behavioral finance may not always be rational but they are always normal. Normal people are often confused by frames, affected by cognitive errors and know the pain of regret, and the difficulty of self-control.”
On our best days, when wearing the right sort of spectacles, squinting and tilting our heads just so, we can be observant, efficient, loyal, assertive truth-tellers. However, on most days, most of the time, we’re delusional, lazy, partisan, arrogant confabulators. That is the reality of being human.
Nobody really disagrees with that overview anymore. The research support for it is too strong and too comprehensive. Unfortunately, however, we do not think that we are susceptible to these problems personally. We can all see how nuts it was, but Coach Hite still thinks he made the right call by starting his son over Ben freakin’ Roethlisberger. Thus the most dangerous of all our behavioral biases and cognitive weaknesses is this bias blindness, our ready capacity to see the flaws in others while being utterly blind to them in ourselves. It is our overarching human problem. As one prominent piece of research puts it:
“We cannot attribute [our adversaries’] responses to the nature of the events or issues that elicited them because we deem our own different responses to be the ones dictated by the objective nature of those events or issues. Instead … we infer that the source of their responses must be something about them.”
In other words, if we believe something to be true, we quite naturally assume those who disagree have some sort of problem. Our beliefs are deemed merely to reflect the objective facts because we think they are true.
Believing is seeing.
Of course, that line of thinking does not convince anybody else. The research again:
“We are not particularly comforted when others assure us that they have looked into their own hearts and minds and concluded that they have been fair and objective.”
Of course not – they are biased (but I’m not). It is the same kind of thinking that allows us to smile knowingly when friends tell us how smart, talented and attractive their children are while remaining utterly convinced as to the objective truth of the amazing attributes of our own kids. The problem is even more acute when the “answer” is counterintuitive, and good investing is often wildly counterintuitive (it is hard to sell when we are euphoric or buy when we are terrified, for example).
Daniel Kahneman won the Nobel Prize in Economics in 2002 for his foundational research into behavioral economics. His brilliant 2011 book, Thinking, Fast and Slow, is part memoir and part research history. As he explains in the opening chapter, “The premise of this book is that it is easier to recognize other people’s mistakes than our own.” In fact, humans tend to foster a fundamental bias against other people’s biases. Researchers from five major universities created a measurement method to determine people’s level of bias toward others and themselves. The study results showed that only one out of 661 adults admitted to being more biased than his or her peers.
Our bias is pervasive and our bias blindness is worse. INYIM indeed.
I have been working for some time on how we can try to counteract our bias blindness. I would be most interested in hearing what conclusions readers have reached in this regard, what processes readers have developed to deal with bias blindness, and what the results have been. Any takers?