Underestimating the Density of the Fog

The story has essentially attained the level of holy writ, at least among those committed to data and evidence, such that it now seems almost too good to be true. The quick-and-dirty version of the tale is that stats geeks with computers, like those whom former player and broadcaster Tim McCarver called “drooling eggheads,” outsmarted and outmaneuvered the stupid yet arrogant “traditional baseball men” who ran our most traditional sport at the professional level and who thought they knew all there was to know about the game. Thus, it is said, everything the old-time baseball men thought they knew about evaluating players and teams has been found wanting, not that those whose day has passed, committed as they are to wizardry and witchcraft, have recognized it.

This revolution – as shocking as it has been comprehensive – is said to have brought about the ultimate revenge of the nerds. The geeks now run the baseball show, having taken the level of analytical precision involved in running teams and evaluating players from zero to sixty in a flash. The new breed of “baseball men” aren’t grizzled scouts looking for “five-tool guys” but, rather, Ivy League-educated experts in computer modeling and statistical analysis who use those skills to determine whom to scout, whom to sign, whom to play, and how to play. The prevailing narrative describes this new contingent as dominating professional baseball at every level, down to the depths of the independent minor leagues.

Is the analytics overhaul of baseball proper as complete and comprehensive as the telling claims? No. The real story is much more interesting and enlightening than that.

Baseball is particularly amenable to the use of statistical analysis because it offers large sample sizes, discrete individual-performance measures (such as plate appearances, pitches, and the like), and ease of identifying positive results (such as winning, home runs, and the like). However, when humans are involved – and baseball is as human as can be – interpretation of the underlying data is highly complicated.
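
To make that point tangible, here is a minimal sketch (my own illustration, with made-up plate-appearance data rather than anything drawn from a real data set) of how a standard rate stat like on-base percentage falls straight out of discrete, labeled events. Computing the number is trivial, which is precisely why the hard part lies in interpreting it.

    # Toy sketch: because baseball events are discrete and labeled, a rate
    # stat like on-base percentage (OBP) reduces to simple counting.
    # The sample data below are invented for illustration only.

    from collections import Counter

    # Each plate appearance is recorded as a single discrete outcome.
    plate_appearances = [
        "single", "strikeout", "walk", "flyout", "home_run",
        "groundout", "hit_by_pitch", "double", "strikeout", "sac_fly",
    ]

    ON_BASE = {"single", "double", "triple", "home_run", "walk", "hit_by_pitch"}
    AT_BAT = {"single", "double", "triple", "home_run",
              "strikeout", "flyout", "groundout"}  # excludes walks, HBP, sacrifices

    counts = Counter(plate_appearances)

    on_base = sum(n for outcome, n in counts.items() if outcome in ON_BASE)
    at_bats = sum(n for outcome, n in counts.items() if outcome in AT_BAT)
    walks = counts["walk"]
    hbp = counts["hit_by_pitch"]
    sac_flies = counts["sac_fly"]

    # OBP = (H + BB + HBP) / (AB + BB + HBP + SF)
    obp = on_base / (at_bats + walks + hbp + sac_flies)
    print(f"OBP over {len(plate_appearances)} plate appearances: {obp:.3f}")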

Great interpretation of difficult data sets, especially those involving human behavior, involves more sculpting than tracing. It requires great skill, imagination, and even a bit of whimsy, as well as collaboration over whether the various interpretive choices are acceptable (not to say the right) ones. That’s why we understand reality better in the natural sciences than in the social sciences. As ever, information is cheap but meaning is expensive.

To lead off, let’s recall that if it seems too good to be true, it usually is. To see what I mean by that in the context of our story will require some in-depth analysis of its own, starting with more than a bit of background information and history.

Horrid Facts, Stubborn Facts

September 11.

Two words. Powerful emotions. Searing memories. Evocative stories. Fifteen years.

Fifteen years ago, on Tuesday, September 11, 2001, I was sitting in front of a Bloomberg terminal when the first, cryptic hints about trouble at the World Trade Center crawled across the bottom of my screens (I think). I had been scheduled to fly to New York the day before and had reservations at the Marriott World Trade Center (3 WTC), which would be destroyed when the Twin Towers collapsed. Instead, I decided to stay home and go to a “Back to School Night” presentation at my kids’ school. As the day’s events unfolded, I recalled having been on the Merrill Lynch fixed income trading floor at the World Financial Center doing a STRIPS trade when I heard and felt the February 26, 1993 World Trade Center bombing. I was really glad I didn’t get on that plane to New York.

My little, not so evocative story is insignificant within the context of the tragic losses, horrible evil and incredible heroism of the “American epic” to which that day bore inexorable witness. But it is what happened to me. It provides context and a framing device to help me remember and think about what transpired and what it means. It is emotional to think about still. But many other stories are far more important.

The image reproduced below is central to several other converging stories from that dreadful day.

[Image: 9-11-1]

Evidence is Not Enough

Note: I will be attending the terrific Evidence-Based Investing Conference on November 15, 2016 in New York. You should too.

There is a new and growing movement in our industry toward so-called evidence-based investing (which has much in common with evidence-based medicine). As Robin Powell puts the problem, “[a]ll too often we base our investment decisions on industry marketing and advertising or on what we read and hear in the media.” Evidence-based investing is the idea that no investment advice should be given unless and until it is adequately supported by good evidence. Thus evidence-based financial advice involves life-long, self-directed learning and faithfully caring for client needs. It requires good information and solutions that are well supported by good research as well as the demonstrated ability of the proffered solutions actually to work in the real world over the long haul (which is why I would prefer to describe this approach as science-based investing, but I digress).

The obvious response to the question about whether one’s financial advice ought to be evidence-based is, “Duh!” Then again, investors of every sort – those with a good process, a bad process, a questionable process, an iffy process, an ad hoc process, a debatable process, a speculative process, a delusional process, or no process at all – all think that they are evidence-based investors already. They may not describe it that way specifically. But they all tend to think that their process is a good one based upon good reasons. Nothing to see here. Move right along.

Nearly as problematic is the nature of evidence itself. The legal profession has been dealing with what good and relevant evidence is for centuries. According to the Federal Rules of Evidence (Rule 401): “Evidence is relevant if: (a) it has any tendency to make a fact more or less probable than it would be without the evidence; and (b) the fact is of consequence in determining the action.” That’s a really low bar, which explains why so much more than merely evidence is implicit within the rubric of evidence-based investing.

And therein lies the problem. Committing to an evidence-based approach is a great start, a necessary start even, to sound investing over the long term. But it’s not enough…not by a long shot. As philosophers would say, it’s necessary but not sufficient. Most fundamentally, that’s because:

  • The evidence almost always cuts in multiple directions;
  • We don’t see the evidence clearly; and
  • We look for the wrong sorts of evidence.

I’ll examine each of these related issues in turn.

Alternatives to Being an Evidence-Based Financial Advisor

There is a new and growing movement in our industry toward so-called evidence-based investing (which has much in common with evidence-based medicine). As Robin Powell puts the problem, “[a]ll too often we base our investment decisions on industry marketing and advertising or on what we read and hear in the media.” Evidence-based investing is the idea that no investment advice should be given unless and until it is adequately supported by good evidence. Thus evidence-based financial advice involves life-long, self-directed learning and faithfully caring for client needs. It requires good information and solutions that are well supported by good research as well as the demonstrated ability of the proffered solutions actually to work in the real world over the long haul (which is why I would prefer to describe this approach as science-based investing, but I digress).

The obvious response to the question about whether one’s financial advice ought to be evidence-based is, “Duh!” But since all too few in the financial world practice evidence-based investing, we ought carefully to look at the possible alternatives to being an evidence-based advisor. Here is a baker’s dozen of them for your thoughtful consideration. If I’ve missed any I’d appreciate your letting me know.