What Do You Believe In?

We readily recognize that facts are not the same thing as truth. Facts are true by definition (or they wouldn't be facts), but they require analysis, understanding and interpretation to become useful and actionable, to become truth. Logically, we should decide what the facts are as objectively as we can and then interpret those facts to come to a consistent set of beliefs about them. But that's not how it usually works. Instead, we try to jam the facts into our preconceived notions and commitments, or we simply miscomprehend reality and accept a view, no matter how implausible, built upon a different set of alleged facts, "facts" that are used (again) to support what we already believe. Since we quite readily see poor thinking in others but fail to see it in ourselves (on account of bias blindness), and since we live in a highly polarized society in which commonly accepted facts are increasingly rare, beliefs (at least the beliefs of people who disagree with us) are not generally held in high regard.

I see at least two primary, although not necessarily contradictory, narratives concerning the nature and utility of belief in the modern world. On the one hand, we see people (always other people) who are hell-bent on claiming that belief (usually manifested ideologically) can and does trump all, that it can deny facts and evidence. This narrative plays out in religion, politics (right and left), sports and even investing — everywhere that ideology exists (which is essentially everywhere) — for we are ideological through and through. It is also well represented within popular culture.

American Hustle, nominated for ten Academy Awards, including Best Picture, began as a script by writer and producer Eric Singer entitled "American Bulls–t." "Who is conning whom?" is a constant underlying question, with the major theme expressed by a stripper from the American Southwest turned would-be English aristocrat, played by Amy Adams: "The key to people is what they believe and what they want to believe. So I want to believe that we were real." Of course, nothing in the film seems real at all.

Truth is irrelevant to “get[ting] over on all these guys” because “people believe what they want to believe.”

The movie is consistent with an ongoing trend in American thought suggesting that beliefs are generally powerful concoctions designed to fool others and ourselves, the province of knaves and charlatans. The great American musical The Music Man, for example, is a paean to the power of belief as wielded by a con man. Personally (though this is usually ascertained only by others), we tend to test-drive purported beliefs as solutions until we find a set (based on facts or not) that works for us, until we find one that works better, or until we reach the limits of our wallets, the law or common sense. Given the extent of our behavioral and cognitive biases, that last limit is not often reached.

Jesus had a good handle on the problem two millennia ago: "Why do you look at the speck of sawdust in your brother's eye and pay no attention to the plank in your own eye?" In a famous 1726 sermon, Bishop Joseph Butler asked a penetrating question that we like to think is rhetorical: "Things and actions are what they are, and the consequences of them will be what they will be: why, then, should we desire to be deceived?" The good Bishop, referencing the prophet Balaam, recognized "that strong passions, some kind of brute force within, prevails over the principle of rationality" and that we are all prone to "self-deceit," to "a peculiar inward dishonesty." Thus the problem isn't new, and there isn't any evidence that it's going away anytime soon.

Sadly, recent evidence suggests that being smarter, more aware or more educated doesn't help us make these sorts of decisions more effectively. Indeed, those traits may actually make things worse. A recent study suggests that, in many instances, smarter people are more vulnerable to thinking errors, even basic ones. Moreover, even "people who were aware of their own biases were not better able to overcome them."

Expecting people (including ourselves) to be convinced by the facts is contrary to, well, the facts. W. Edwards Deming, perhaps the original data scientist, is famously credited with the demand: "In God we trust; all others must bring data." But even good data doesn't always overcome entrenched beliefs. It's easy for us to see that other people believe all sorts of stupid things but almost impossibly difficult even to consider the idea that we believe all sorts of stupid things.

Some of us believe in more egregiously stupid stuff than others, of course. As Isaac Asimov (among so many others) has pointed out, "there is a cult of ignorance in the United States, and there always has been. The strain of anti-intellectualism has been a constant thread winding its way through our political and cultural life, nurtured by the false notion that democracy means that 'my ignorance is just as good as your knowledge.'" The idea that beliefs are bulls—t has strong evidential support (if only acknowledged in others).

Where the first narrative fails and truth cannot simply be proclaimed, a second narrative suggests that truth may readily and easily be uncovered. Indeed, we eagerly embrace the idea that there is hope for a better process and better outcomes via science and its rigorous method, specifically designed to root out error. This narrative suggests that science is the be-all and end-all of human knowledge and advance. Instead of belief dictating action, science does (although it's deemed no less plug-and-play).

A recent essay by Steven Pinker provides a helpful example of this narrative in full flower: “the worldview that guides the moral and spiritual values of an educated person today is the worldview given to us by science.” Predictably (thanks to confirmation bias), the worldview Pinker believes that science has uncovered looks quite remarkably like his own. Imagine that.

But science offers no worldview and no easy answers. It is but a mechanism – a spectacularly successful mechanism to be sure, but a mechanism nonetheless. Pinker disagrees. As he would have it, humanism (the belief in “principles that maximize the flourishing of humans and other sentient beings”) “is inextricable from a scientific understanding of the world.”

How quickly we forget. Is and ought are quite different things, and there is no rational way to derive one from the other. Moreover, the facts simply do not support Pinker's contention. For example, it was what Engels described as Marx's "scientific socialism" that gave the world gulags, repression, the Cultural Revolution, immense poverty, corruption and scores of millions of horrible and avoidable deaths. To be clear, science was not responsible for these atrocities (evil humans were), but it did not stand in the way either. No mere mechanism, no process, can. People committed to science can be just as evil and just as good as anyone else. Science and talented scientists were responsible for both the eradication of smallpox and the proliferation of nuclear weaponry. Progress is not always forward.

Moreover, it should be immediately obvious to even the most casual observer that science — a relatively recent invention — is hardly the only mode of human thought that can be made to work for us. If it were, it wouldn’t be so difficult to set up safeguards against the behavioral and cognitive biases that beset all of us. Science, and even explaining science, is really hard precisely because the scientific mindset isn’t remotely close to the “natural” mode of human thought.

There is much more to the human experience than measurement and the acquisition of knowledge about physical processes. Moreover, knowing about those processes offers no certain, sure-fire answers. Other ways of thinking have also offered valuable insights into the human condition. Indeed, the American Experiment was founded upon beliefs that cannot be proven or even evidenced very well (if at all): ideals like "all men are created equal," that "taxation without representation is tyranny," and that we possess "certain unalienable Rights."

None of this is to downplay science or its importance. It is the best (and perhaps only) means we have to ascertain objective fact. Even so, most of us think of scientific advance as an essentially linear progression. The idea is that science develops by the careful gathering, accumulation and analysis of evidence such that new truths are built upon old truths (or theories increasingly approximate the truth), with the correction of past errors only in the rare and unusual case. This advance might accelerate in the hands of a particularly great scientist, but ongoing progress itself is thought to be all but guaranteed by the scientific method. It's like those multi-initialed television procedurals in which science always saves the day by clearly and unambiguously identifying the killer in the final 15 minutes of the show. Ironically, the real-life evidence doesn't support this sort of heroic advance. As the great physicist Max Planck wryly noted, science progresses one funeral at a time.

Scientists routinely acknowledge, at least in theory, that they get things wrong (although I recall a long-ago internet debate with a guy who claimed, with an apparently straight face, that he had never made an irrational decision), but they also hold fast to the idea that these errors get corrected over time as other scientists try to build upon earlier work. And to a very large extent, that's what happens. However, John Ioannidis of Stanford has shown that, as a matter of statistical logic, "most published research findings are probably false," and subsequent reviews support that claim. In a commentary in Nature last year, scientists at Amgen disclosed that they had been unable to replicate the vast majority (47 of 53) of the landmark pre-clinical research studies they had examined. In a similar study, researchers at Bayer HealthCare reported that they had reproduced the published results in just a quarter of 67 seminal studies. Some of the most highly cited papers in stem cell biology cannot be replicated either. Despite rigorous protocols and a culture designed to promote the aggressive rooting out of error, scientists get things wrong – really important things – a lot more often than we'd like to think.
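
Ioannidis's point is as much arithmetic as sociology. Here is a minimal sketch of the statistical logic in Python (the numbers are illustrative assumptions of mine, not figures from his paper):

```python
# A minimal sketch of the logic behind "most published research findings
# are probably false": when few of the hypotheses being tested are true
# to begin with, even conventional power and significance levels produce
# a literature full of false positives. Illustrative numbers only.

def positive_predictive_value(prior, power=0.8, alpha=0.05):
    """Share of statistically significant results reflecting real effects."""
    true_positives = power * prior          # real effects correctly detected
    false_positives = alpha * (1 - prior)   # null effects that clear p < 0.05
    return true_positives / (true_positives + false_positives)

for prior in (0.5, 0.1, 0.01):
    ppv = positive_predictive_value(prior)
    print(f"prior = {prior:<4}: {ppv:.0%} of significant findings are true")

# prior = 0.5 : 94% of significant findings are true
# prior = 0.1 : 64% of significant findings are true
# prior = 0.01: 14% of significant findings are true
```

Add bias, underpowered studies and selective reporting to that baseline and the picture only degrades, which is exactly Ioannidis's argument.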

For Pinker, the advocates of evil movements offering at least a veneer of scientific basis were simply not truly scientific, a classic "no true Scotsman" move. That's a handy escape route, allowing him to take credit for all the good while avoiding blame for what's bad, but it's not remotely credible. While Pinker is right that science is not our enemy, it isn't our friend either. Science is an extremely useful and powerful but dangerous tool that must be used carefully and monitored closely.

Facts are inherently messy things and often get in the way of our favored preconceived notions. Even scientists fall prey to this tendency. They are prone to cognitive and behavioral bias, individually and collectively, just like the rest of us, despite the best efforts of a scientific process specifically designed to find and eliminate error. In his famous 1974 Caltech commencement address, the great physicist Richard Feynman talked about the scientific method as the best means to achieve progress. Even so, notice his emphasis: "The first principle is that you must not fool yourself – and you are the easiest person to fool."

Thus the narrative of constant scientific advance toward readily accessible true beliefs isn't ultimately compelling either. But happily, there is a whole lot more to the overall story. And an interesting and compelling story it is. Beliefs are dangerous and contentious, but also inherently necessary and potentially of great value.

For example, heroism of any sort involves somebody taking a big risk for a belief. Hitler would not have been defeated without contrary beliefs powerful enough to overcome fear and even death. More fundamentally, no matter how committed to data-based reasoning and analysis we are, facts can only take us part-way home. Facts demand analysis to become useful or to be actionable. The conclusions we reach and the positions we take are ultimately beliefs – beliefs that should be well-supported by the evidence of course, but still beliefs.

We are ideological creatures through and through. We consistently prefer stories to data, no matter how fanciful the story or how rigorous the data. It can be maddeningly difficult for us to separate fact from belief. Thus it is very dangerous business indeed to lose sight of what works in order to massage our egos, feel comforted or score ideological points.

Investing is hard enough without trying to foist an ideological overlay onto it. As my friend Tom Brakke puts it, "[A]s an industry we waste a lot of time borrowing and reinforcing conceptual structures that are already formed rather than trying to shoot holes in them. And most strive too hard to find analogies when anomalies are staring them in the face and going unrecognized or unexamined." Both markets and ideologies can be unforgiving. We can readily lose at both, but we can't win at both over the long term. We should believe in, and focus only upon, what works.

Even so, like most of us I think, I desperately want the Truth with a capital-T. I want to follow the instructions and get the desired result. Every. Single. Time. I want to put my money into the machine and get my soda in return. I want to take the puzzle pieces out and have them all fit together perfectly to create a beautiful picture with none left over. I want to take the red pill. I want the Truth.

We so badly “want to know what it is.”

Making decisions in the midst of uncertainty, with outcomes that remain uncertain despite our best efforts, is a terrifying but very real prospect. As Neo learns in The Matrix, the Truth is a costly and difficult master. Even worse, we don't have a red pill option. In our world, at least most of the time, truth is small-t truth even when we think we've found some. We want deduction but are stuck with induction. We want certainties but are left with no more than probabilities.
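
To make the induction point concrete, here is a minimal Bayesian sketch (my illustration, with assumed likelihoods, not anything drawn from the works cited above): each confirming observation strengthens a belief, but no finite run of evidence ever turns a probability into a certainty.

```python
# Why induction leaves us with probabilities rather than certainties:
# repeated confirming evidence pushes a posterior belief toward 1.0 but
# never reaches it. The likelihoods below are assumed for illustration.

def update(prior, p_evidence_if_true=0.9, p_evidence_if_false=0.3):
    """One application of Bayes' rule after observing confirming evidence."""
    numerator = p_evidence_if_true * prior
    denominator = numerator + p_evidence_if_false * (1 - prior)
    return numerator / denominator

belief = 0.5  # start agnostic about the hypothesis
for i in range(1, 6):
    belief = update(belief)
    print(f"after observation {i}: belief = {belief:.4f}")

# after observation 1: belief = 0.7500
# after observation 2: belief = 0.9000
# after observation 3: belief = 0.9643
# after observation 4: belief = 0.9878
# after observation 5: belief = 0.9959
```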

Investing, like much of life, combines elements of science and art and is infinitely infuriating and interesting as a result. Zach Lowe of Grantland made this general point in a basketball context earlier this week:

The interplay between math and team personnel is part of what makes basketball interesting. Math can give you answers that are correct in absolute terms, but the construction of a team plays a role in dictating whether a specific roster can execute those mathematically optimal plans.

In the real world, truth tends to be tentative, provisional, necessarily subject to revision, and in need of art to interpret. That's partly because, as noted, facts are inherently messy things. We argue about what the facts are and then argue about the interpretation of those facts. Yet the leading modern narratives about the nature and utility of belief (that beliefs are a priori and trump all; that the right beliefs are readily ascertainable via science) both seem to reject this reality, at least in practice.

I don’t.

Out of this messiness we must cobble together our investment beliefs in order to create a coherent investment process. I'd love to be able to act upon facts alone, of course, assuming I could come to a fair approximation of them. But facts without interpretation are still useless. So over the next few weeks I will be trying to decipher and analyze what my investment beliefs are and should be. Good investing demands no less, even though the process is extremely difficult. Lots of seemingly sacred cows will be at risk. I hope you'll enjoy the ride with me.

6 thoughts on "What Do You Believe In?"

  1. In reference to truth in science: this, like all else, is more complex than you might think. There are many things at play, little of which has to do with the search for capital-T Truth.

    1.) Scientists in most research institutions rely on 'soft money' to fund their labs. In other words, the university tells them how much they can pay themselves, but it's up to the researcher to come up with the money, primarily by getting grants. Unfortunately, this means a researcher needs to publish results with positive findings (e.g., you can't publish a paper showing that a cancer drug doesn't work better than the standard of care). This puts huge pressure on principal investigators (PIs) to publish something, anything. With grant funding under ever-increasing downward pressure, the situation just keeps getting more desperate. The 'pay-line,' the percentage of submitted grants that actually get funded, is around 7% right now, and even if you get funded there will likely be reductions in your funding later on. Most scientists are in the business of getting grants, not doing science. Sort of how politicians are in the business of getting donations, rather than legislating.

    2.) The journals that publish results are by and large commercial ventures that want to publish ‘sexy’ results to keep their ‘impact factor’ high, which directly correlates to advertising costs. Michael Eisen has fought against this for years now, and may be making progress. See a recent interview at simply statistics (http://simplystatistics.org/2013/12/12/simply-statistics-interview-with-michael-eisen-co-founder-of-the-public-library-of-science/).

    3.) The peer-review process is a joke. You submit your manuscript, and the editor at your journal of choice (after deciding the paper is sexy enough) passes the manuscript out to 3-4 people who will do a review. In practice, the people who do reviews are either grad students or post-docs in big labs (the PI isn’t going to do it), or somebody else that the editor could find via bulk email searches. Most people don’t have the time to really critically review a paper, or they don’t have the expertise, or their expertise only covers a small portion of what the paper covers (e.g., a cancer biologist might get a cancer clinical trial paper, but she almost certainly won’t understand the statistics. Or a statistician gets the paper, but doesn’t understand the biology), so the reviewer just looks at the part they understand and then sends in their comments. The author makes some changes, and the manuscript gets published. Once a paper is published, unless there is something really egregious, it becomes part of the canon. And if it gets cited repeatedly it becomes even more entrenched, even if there are multiple papers later on that clearly disprove the results.

    One suggestion that Michael Eisen has made that makes sense to me is an ongoing ‘rating’ system for published papers, sort of like an Amazon review of a product.

    4.) Studies performed by big pharma (or academics who get money from pharma) have the same drawback as #1 above. The pressure to develop a new drug is intense, and the more money spent on a given drug, the higher the pressure for success. Rudimentary game theory will tell you how that plays out.


  2. "But facts without interpretation are still useless."

    – No, it's just the other way around. It's your interpretation that makes truth an illusion of your mind. Don't interpret; just react to the facts.

    • The fact that, for example, a stock made a big move today tells us absolutely nothing about what, if anything, we should do in response. Your plan to “just react on the facts” is meaningless unless and until you have a framework and a process for analyzing the facts.

