I occasionally write for outside publications. When those publications are digital iterations of traditional media outlets, their editors tend to be especially vigilant about keeping posts to 400-600 words (as compared with my 4,000-word monstrosity at The Big Picture today). That’s because their research says that few readers will go beyond that length, no matter how good the content (which is a real problem for me, since I tend to favor and write long and fairly involved pieces). Accordingly, information can readily be provided. But meaning, which takes careful sifting and evaluation of evidence, must remain rare indeed. Continue reading
Source: Larry Swedroe (CBS MoneyWatch)
On Monday, the United Nations’ Intergovernmental Panel on Climate Change issued an important report on climate change. From the report: “Impacts from recent climate-related extremes, such as heat waves, droughts, floods, cyclones, and wildfires, reveal significant vulnerability and exposure of some ecosystems and many human systems to current climate variability.” Moreover, for “countries at all levels of development, these impacts are consistent with a significant lack of preparedness for current climate variability in some sectors.” In other words, climate change is a big problem and we’re ignoring it. That’s important stuff, if not exactly news.
John Tamny of Forbes (already notorious for an appearance on The Daily Show — video here) vehemently disagrees. His article explaining why provides a veritable treasure trove of examples of how not to do investment analysis, or analysis of any sort for that matter. Continue reading
I have often noted (see here too) that we generally suck at math, to our great detriment. I have also noted that we are especially poor at dealing with probabilities. If a weather forecaster says that there is an 80 percent chance of rain and it remains sunny, instead of waiting to see if, in the aggregate, it rains 80 percent of the times when his or her forecast called for an 80 percent chance of rain, we race to conclude — perhaps based upon that single instance — that the forecaster isn’t any good. The data trumps our lyin’ eyes, but we don’t routinely see it that way (and may even deny its efficacy).
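The point about judging forecasters in the aggregate can be sketched in a few lines of Python (a minimal simulation; the forecaster, the 1,000-day sample, and the fixed seed are all illustrative assumptions, not data from any real forecast record):

```python
import random

random.seed(42)

# Simulate a well-calibrated forecaster: on every day she calls an
# "80 percent chance of rain," it actually rains with probability 0.80.
forecast_days = 1000
rainy_days = sum(1 for _ in range(forecast_days) if random.random() < 0.80)

hit_rate = rainy_days / forecast_days
print(f"Rained on {hit_rate:.0%} of the 80%-forecast days")

# In the aggregate, the hit rate sits very close to 80 percent --
# yet any single such day is sunny one time in five, so one sunny
# day tells us almost nothing about the forecaster's skill.
```

Run it and the aggregate hit rate lands near 80 percent, even though roughly 200 of those individual forecasts "failed" — which is exactly what a good forecaster should produce.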
Further evidence – as if it were needed – in support of my thesis has been offered this week in the reaction to Nate Silver’s projection that Republicans have a very real chance of gaining control of the Senate later this year. This forecast (“a Republican gain of six seats, plus or minus five”) is hardly earth-shattering to anybody who has been paying attention. The configuration of seats up for election favors Republicans and the Democratic President’s approval ratings are dreadful. There isn’t much reason to expect an upswing in Democratic support either, even though (obviously) almost anything could happen over the next few months. Dealing with probabilities necessarily means being wrong sometimes.
Investing is a probabilistic enterprise. Since certainty is even rarer than high risk-free returns, we’re left trying to make the best decisions we can based upon the knowledge we have. If we do that extremely well, we might be right most of the time, but still a long way from all of the time. The improbable — the highly unlikely even — happens and happens surprisingly often.
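A bit of arithmetic shows why the improbable happens so often: expose yourself to enough independent low-probability events and the chance that none of them occurs shrinks fast. A quick sketch (the 1 percent figure and the count of 100 events are illustrative assumptions):

```python
# Each of 100 independent events has only a 1% chance of occurring.
p_single = 0.01
n_events = 100

# P(at least one occurs) = 1 - P(none occur)
p_at_least_one = 1 - (1 - p_single) ** n_events
print(f"P(at least one 1%-event in {n_events} tries) = {p_at_least_one:.1%}")
# -> roughly 63.4%
```

So a portfolio (or a season, or a news cycle) facing a hundred separate "one-in-a-hundred" risks should expect to be surprised more often than not.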
Take yesterday’s NFL action, for example. More specifically, consider the astonishing Vikings v. Ravens game in snowy Baltimore. Continue reading
The New York Times has a piece out trumpeting competitive balance in baseball and highlighting the fact that “the 2013 baseball playoffs include more teams from the bottom 10 in payrolls than the top 10.” Big spenders include the Dodgers, Red Sox and Tigers while the Pirates, Indians, Rays and Athletics are playoff teams despite being relative penny-pinchers. The Yankees, perennial spending leaders with a payroll in the range of $230 million this season, are watching at home while the Rays (spending very roughly a quarter of that amount) are still playing. The conclusion from the Times: “Market size matters, but not as much as shrewd management.” That analysis isn’t quite wrong, but it is more than a bit incomplete with respect to the data and leaves out a crucial element of the story: luck. Continue reading
As regular readers are all too well aware, I am committed to data-driven analysis and investing. We’re suckers for stories, of course, and are ideological through-and-through, but the goal is to make sure that our investment decisions are based on real, quantitative evidence (at least to the extent possible).
That’s easier said than done, of course. We are prone to all sorts of cognitive and behavioral biases — perhaps most prominently confirmation bias — all of which threaten our analysis. We are also highly susceptible to bias blindness, the inability to see our own biases even when others’ are crystal clear. And now comes further evidence that our reasoning abilities are even worse than we thought. Continue reading
A number of years ago, during George W. Bush’s second term and by sheer happenstance, I ended up playing a round of golf with a Navy SEAL Commander (half the SEALs train here in San Diego). Obviously, much of his job was classified and he was very circumspect in what he shared. However, when I asked where or how I could become better informed about foreign policy, he recommended Ron Suskind’s book, The One Percent Doctrine.
The “one percent doctrine” (also called the “Cheney doctrine”) was established shortly after 9/11 in response to worries that Pakistani scientists were offering nuclear weapons expertise to Al Qaeda. Here’s the money quote from Vice President Dick Cheney: “If there’s a 1 percent chance that Pakistani scientists are helping al-Qaeda build or develop a nuclear weapon, we have to treat it as a certainty in terms of our response. It’s not about our analysis … It’s about our response.”
Thus in Cheney’s view and per subsequent policy, the war on terror required and empowered the Bush administration to act without the same level of evidence or analysis as might otherwise be necessary. Continue reading
The “Semmelweis Reflex” is a metaphor for our reflex-like tendency to reject new evidence or new knowledge because it contradicts our established norms, beliefs or paradigms. It is named for Ignaz Semmelweis, a Hungarian obstetrician who found lasting scientific fame, but only posthumously.
Semmelweis discovered that the often-fatal puerperal fever (“childbed fever”), common among new mothers in hospitals, could essentially be eliminated if doctors simply washed their hands before assisting with childbirth. After observing that a particular obstetrical ward suffered an unusually high incidence of the disease and that doctors there often worked in the morgue right before aiding in childbirth but had not washed their hands in between, Semmelweis speculated that “cadaverous material” could be passed from doctors’ hands to patients, causing the disease. He thereupon initiated a strict regimen at his hospital whereby all who would assist in a birthing must first wash their hands with a chlorinated solution. As a consequence, death rates plummeted.
Semmelweis expected a revolution in hospital hygiene as a consequence of his findings. But it didn’t come.
In a recent post I made the obvious point that randomness has a very significant impact on investment performance. Of course, we’re all too ready to acknowledge bad luck when things go wrong, but when we succeed we want all the credit. In any event, my friend Cullen Roche also published the post at Pragmatic Capitalism, where it got a comment that I find interesting and deserving of a bit more examination. Continue reading