Confirming the Academic Stereotype

If it hadn’t happened to me, I don’t think I would have believed it.  It seemed like a caricature that SNL or even Rush Limbaugh might invent.  It was almost funny at the time, but upon further reflection isn’t funny in the least.  It’s pretty scary, actually.

I attended the annual Los Angeles Times Festival of Books once again over the weekend. As always, it was engaging and interesting.  I learned more about ideas and authors I already knew I wanted to explore and discovered some new ones too.  But one bizarre experience (sadly) takes the cake.



A New Paradigm?

Fifty years ago this month, Thomas Kuhn’s The Structure of Scientific Revolutions was published, and it remains one of the more influential books of our time.  It is also one of the most cited academic books of all time. If you haven’t read it, or haven’t read it recently, you might pick up a copy of the new 50th anniversary edition.

If you have ever heard or used the term “paradigm shift” then you have been influenced by Kuhn.  Before Kuhn, our views of science were dominated by ideas about how it ought to develop (the “scientific method”) together with a sense of narrative, of science marching forward inexorably and heroically. 

Where the standard account saw steady, cumulative “progress,” Kuhn saw movement and discontinuities – a set of alternating “normal” and “revolutionary” phases in which communities of specialists in particular fields are plunged into periods of difficulty and uncertainty. These revolutionary phases (e.g., the transition from Newtonian mechanics to quantum physics) correspond to great conceptual breakthroughs which are often ignored or rejected for long periods prior to ultimate acceptance and which form the foundation for succeeding phases in which the breakthrough has become the consensus. That this version of history seems no-big-deal now demonstrates how powerful his ideas have become. 

Per Popper, good science is distinguished by the fact that it focuses upon refuting rather than confirming its theories. However, and consistent with more recent discoveries of our behavioral flaws and biases, “normal” scientists in reality spend most of their time trying to confirm what they already think — their paradigm.  We shouldn’t be surprised whenever confirmation bias rears its head. 

That Kuhn deemed his book a mere “sketch” (only 172 pages) is part of its charm and its power.  It is simple, straightforward and easy to understand.  It just makes sense.

A “paradigm” is an intellectual framework that makes research possible.  It’s clear (to me at least) that the finance world needs a new paradigm.  Its model-driven alleged rationality just plain doesn’t work very well.  It doesn’t fit the data and is inconsistent with experience at nearly every level.  The best of cutting-edge financial, economic and scientific research today is data- rather than theory-driven.  Being data-driven is a focus of this blog (see, e.g., here, here, here and here), as the masthead proclaims. I hope that our next financial paradigm (not to mention economic and political paradigms) is predicated not upon some overarching theory, but rather upon that which can be demonstrated to work.  That’s more than enough of an intellectual framework for me.

Thomas Kuhn deserves as much.

Crackpots Work Alone

In conversation over the weekend at the Los Angeles Times Festival of Books, American virologist David Baltimore, who won the Nobel Prize for Medicine in 1975 for his work on the genetic mechanisms of viruses, commented that over the years (and especially while he was president of Caltech) he had received many manuscripts claiming to have solved some great scientific problem or to have overthrown the existing scientific paradigm to provide some grand theory of everything.  Most prominent scientists have drawers full of similar submissions, almost always from people who work alone and outside of the scientific community.

Unfortunately, none of these offerings has done anything remotely close to what was claimed, and Dr. Baltimore offered some fascinating insight into why he thinks that’s so.  At its best, he noted, good science is a collaborative, community effort. On the other hand, crackpots work alone.

I suspect that this idea is as true in our business as it is in science generally.  Working collaboratively and in community lessens the likelihood that our work will be fraught with the bias to which we are so prone and increases the likelihood that all analysis and any conclusions drawn therefrom will be questioned, checked and tested.

Of course, in this context “collaborative” must mean that there is a free and uninhibited exchange of ideas, viewpoints and conclusions.  It most decidedly is not a bunch of people agreeing vehemently with the boss or those with a seal of approval from the boss. In both the scientific and the investment process, principled, data-driven questioning and even disagreement should not only be tolerated, it needs to be encouraged.  Only then can some approximation of truth (or at least reality) be an expected outcome.

As Dr. Baltimore emphasized, “Science is about changing our understanding of something by investigating its behavior.” In this sense, investigating includes analysis, testing, reaching tentative conclusions and then repeating – perhaps again and again.  If what we think cannot be substantiated by data, evidence and experience, even after the most rigorous attempts to falsify it, we have no business claiming it to be true or investing as if it were true, no matter how intuitive, elegant or sales-worthy it may be.

Addendum:  This post came about out of a wonderful conversation with Tom Brakke, of the outstanding blog, The Research Puzzle, over lunch today.  Thanks, Tom.

Benford’s Law and Banks

Jialan Wang of Washington University in St. Louis has written a fascinating post about Benford’s Law.  Benford’s Law describes a striking regularity in the leading digits of many real-world data sets: more numbers begin with 1 than with 2, more with 2 than with 3, and so on. It holds that the first digit of a number is 1 about 30 percent of the time, and that larger digits occur as the leading digit with lower and lower frequency, to the point where 9 as a first digit occurs less than one time in twenty. 
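As a quick illustration of how this plays out in practice, here is a minimal Python sketch (the function and variable names are my own, not from Prof. Wang’s post). It computes the leading-digit frequencies that Benford’s Law predicts — log10(1 + 1/d) for each digit d — and compares them against the observed leading digits of the powers of 2, a sequence known to conform to the law:

```python
import math
from collections import Counter

def benford_expected(d):
    """Expected frequency of leading digit d under Benford's Law."""
    return math.log10(1 + 1 / d)

def leading_digit(n):
    """First (most significant) digit of a positive integer."""
    return int(str(n)[0])

# Powers of 2 are a classic Benford-conforming sequence.
data = [2 ** n for n in range(1, 1001)]
counts = Counter(leading_digit(x) for x in data)
freqs = {d: counts[d] / len(data) for d in range(1, 10)}

for d in range(1, 10):
    print(f"digit {d}: observed {freqs[d]:.3f}, expected {benford_expected(d):.3f}")
```

The expected frequency for a leading 1 works out to log10(2) ≈ 0.301 — nearly one number in three — while a leading 9 is expected only log10(10/9) ≈ 0.046 of the time, under one in twenty. Fraud detection based on Benford’s Law amounts to measuring how far observed frequencies in a company’s reported figures drift from these expected values.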

This result has been found to apply to a wide variety of data sets, including electricity bills, street addresses, stock prices, population numbers, death rates, lengths of rivers, physical and mathematical constants, and processes described by power laws (which are very common in nature).  Accordingly, Benford’s law is used to detect corporate fraud, in that deviations from it may indicate that a company’s books have been cooked.  Prof. Wang’s post provides fodder for three particularly interesting inferences. 

  1. Deviations from Benford’s law have increased substantially over time, suggesting that accounting statements and thus company reporting are becoming less and less reliable;
  2. Deviations from Benford’s law are compellingly correlated with known financial crises, bubbles, and fraud waves, suggesting that bad data may be a bigger cause of such events than generally assumed; and
  3. There is currently a high level of deviations from Benford’s law within finance, suggesting that those of us who are suspicious of bank balance sheet problems well in excess of what is claimed within the industry (those with a “trust deficit“) may be onto something.