In Liz Weston’s most recent Money Talk column, which I read in the Sunday Los Angeles Times, a reader was concerned about retirement income and spending:
In a recent column you repeat advice I have often read that withdrawing about 3% of my investment capital will reduce the chances of my running out of money in retirement. But that doesn’t make sense to me. I have been retired for over 19 years and I have sufficient data now to extrapolate that I could live for 100 more years with so meager a drawdown because, through good and bad times, my earnings after inflation and taxes always exceed 3%. If I am missing something, I must be extraordinarily lucky because it hasn’t hurt me yet, and at age 77 I think it unlikely to do so in my remaining years. Can you explain this discrepancy between my experience and the consequences of your advice?
Ms. Weston gives the appropriate answer: “Sure. You got extraordinarily lucky.” She properly distinguishes between those who retire into good markets (lucky) and those who retire into tough markets (unlucky) and correctly notes why and how the early retirement years are particularly important. While I think her implicit suggestion that “3%-to-4%” withdrawals are safe is too aggressive, her approach and her broad-brush analysis are correct.
What particularly interested me was a phrase in this sentence, which I have highlighted: “I have been retired for over 19 years and I have sufficient data now to extrapolate that I could live for 100 more years with so meager a drawdown because, through good and bad times, my earnings after inflation and taxes always exceed 3%.”
The reader doesn’t have “sufficient data” after 19 years – not even close. We would need far more data than that to draw even tentative conclusions. And the reader’s mistake threatens all of us: we are pretty lousy at math generally, and we truly suck at probability analysis.
As most of you know (and as my masthead proclaims), I advocate a data-driven approach to investing and retirement planning. A major problem with this approach (as I have written before; see here and here too) is that we all have difficulties engaging with issues mathematically and probabilistically.
We are all prone to innumeracy, which is “the mathematical counterpart of illiteracy,” according to Douglas Hofstadter. It describes “a person’s inability to make sense of the numbers that run their lives.” Although Hofstadter coined the term, mathematician John Allen Paulos popularized the concept with his book, Innumeracy: Mathematical Illiteracy and Its Consequences. While illiteracy strikes mostly the uneducated, innumeracy afflicts the educated and uneducated alike.
For example, most people would consider it an unlikely coincidence for any two people to share the same birthday in a room of 23. The typical reasoning runs like this: since one would need 366 people (in a non-leap year) in a room to be certain of finding two people with the same birthday, it seems to make sense that there is only a 6.28% chance of a match with only 23 people in the room (23 divided by 366). In fact, 50% probability is reached with only 23 people, and 99% probability with just 57 (see more on the “birthday problem” here).
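The counterintuitive numbers above are easy to verify directly. Here is a minimal sketch that computes the exact probability of a shared birthday among n people, assuming 365 equally likely birthdays (the helper name is mine, not from the column):

```python
# Exact "birthday problem" probability: the chance that at least two of
# n people share a birthday, assuming 365 equally likely birthdays.
def shared_birthday_prob(n: int) -> float:
    p_all_distinct = 1.0
    for i in range(n):
        p_all_distinct *= (365 - i) / 365  # each new person avoids all prior birthdays
    return 1.0 - p_all_distinct

# 23 people already gives better-than-even odds; 57 people gives ~99%.
print(shared_birthday_prob(23))  # ~0.507
print(shared_birthday_prob(57))  # ~0.990
```

The trick is to compute the complement (everyone distinct) and subtract from one, which is far easier than counting collisions directly.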
In the investment world, we intuitively tend to think that if we start with $1,000 and suffer a 50 percent loss on Day 1 but make 50 percent back on Day 2 (day-to-day volatility being exceptionally high, donchaknow), we’re back to even. However, were that to happen, our $1,000 would be reduced to a mere $750 (more on the “arithmetic of loss” here). Similarly, a sum of money growing at 8 percent simple interest for ten years is the same as 6 percent (6.054 percent to be exact) compounded over that same period. Most of us have trouble thinking in those terms.
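Both of these can be checked with a few lines of arithmetic. A quick sketch, using the same dollar amounts and rates as above:

```python
# The "arithmetic of loss": a 50% loss followed by a 50% gain does not
# get you back to even, because the gain applies to a smaller base.
portfolio = 1_000.00
portfolio *= 0.50  # Day 1: lose 50% -> $500
portfolio *= 1.50  # Day 2: gain 50% -> $750, not $1,000

# Simple vs. compound: 8% simple interest for 10 years grows money by 80%
# in total, which is the same ending value as about 6.054% compounded
# annually over the same 10 years.
simple_growth = 1 + 0.08 * 10                    # 1.8x after ten years
equivalent_compound_rate = 1.8 ** (1 / 10) - 1   # ~0.06054, i.e. ~6.054%
```

The asymmetry of gains and losses is why a 50 percent drawdown requires a 100 percent gain just to recover.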
These examples are pretty (pardon the pun) simple. When things get more complicated we can really go off the rails, especially when the answer seems straightforward. To illustrate: if you have two children and one of them is a boy born on a Tuesday, what is the probability that you have two boys? If you do not answer 13/27 (about 0.481), as opposed to the intuitive 1/2, you’re wrong (to find out why, go here).
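If the 13/27 answer seems impossible, brute force settles it. The sketch below enumerates every equally likely two-child family, treating each child as a (sex, weekday) pair with Tuesday arbitrarily encoded as day 1:

```python
from itertools import product

# Each child is one of 14 equally likely (sex, weekday) combinations.
children = list(product(["boy", "girl"], range(7)))   # Tuesday encoded as day 1
families = list(product(children, repeat=2))          # 14 * 14 = 196 families

# Condition on the given information: at least one boy born on a Tuesday.
has_tuesday_boy = [f for f in families if ("boy", 1) in f]
both_boys = [f for f in has_tuesday_boy
             if f[0][0] == "boy" and f[1][0] == "boy"]

print(len(both_boys), "/", len(has_tuesday_boy))  # 13 / 27
```

Conditioning on such a specific detail (the Tuesday birth) shrinks the sample space unevenly, which is why the answer lands between 1/3 and 1/2 rather than at either intuitive endpoint.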
The inherent biases we suffer (as discovered by behavioral finance) make matters worse. For example, we’re all prone to the gambler’s fallacy – we tend to think that randomness is somehow self-correcting (the idea that if a fair coin is fairly tossed 9 times in a row and it comes up heads each time, tails is more likely on the tenth toss). However, as the commercials take pains to point out, past performance is not indicative of future results. On the tenth toss, the probability remains 50 percent.
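A simulation makes the fallacy concrete. The sketch below generates many ten-toss sequences, keeps only those whose first nine tosses all came up heads, and checks how the tenth toss behaves (the seed is fixed only so the run is reproducible):

```python
import random

# Gambler's-fallacy check: after nine heads in a row, is tails "due"?
random.seed(42)  # fixed seed for reproducibility
tenth_after_nine_heads = []
for _ in range(500_000):
    tosses = [random.random() < 0.5 for _ in range(10)]  # True = heads
    if all(tosses[:9]):                                  # first nine all heads
        tenth_after_nine_heads.append(tosses[9])

heads_rate = sum(tenth_after_nine_heads) / len(tenth_after_nine_heads)
# heads_rate hovers around 0.5 – the tenth toss has no memory of the streak
```

The coin has no memory: the conditioned tenth-toss heads rate stays near 50 percent, not below it.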
We also tend to suffer from availability bias and thus value our anecdotal experience over more comprehensive data (see the cartoon below, from XKCD). For example, the fact that most of your friends use MySpace is not enough evidence to conclude that it’s a good product (much less a good investment).
The conjunction fallacy is another common problem whereby we see the conjunction of two events as being more likely than either of the events individually. Consider the following typical example. A group of people was asked if it was more probable that Linda was a bank teller or a bank teller active in the feminist movement from the following data points: “Linda is 31 years old, single, outspoken, and very bright. She majored in philosophy. As a student, she was deeply concerned with issues of discrimination and social justice, and also participated in anti-nuclear demonstrations.” Fully 85 percent of respondents chose the latter, even though the probability of two things happening together can never be greater than that of the events occurring individually.
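The underlying rule – that a conjunction can never be more probable than either conjunct – holds in any population whatsoever. A minimal sketch with entirely made-up attribute frequencies (the 5 and 30 percent figures are illustrative assumptions, not data from the Linda study):

```python
import random

# Simulate a population with two independent, made-up attributes.
random.seed(0)
people = [
    {"teller": random.random() < 0.05,     # assumed 5% are bank tellers
     "feminist": random.random() < 0.30}   # assumed 30% are feminists
    for _ in range(100_000)
]

tellers = sum(p["teller"] for p in people)
feminist_tellers = sum(p["teller"] and p["feminist"] for p in people)

# The conjunction is a subset of either conjunct, so this always holds:
assert feminist_tellers <= tellers
```

No matter what probabilities you assume, the count of “feminist bank tellers” is a subset of the count of “bank tellers,” so the 85 percent of respondents choosing the conjunction were certainly wrong.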
Now suppose that Company X has a workforce that is only 20 percent female. Someone committing the base-rate fallacy would conclude from that number alone that the company is discriminatory. But further analysis is required. If the applicant pool was only 10 percent female, Company X might actually have an exemplary record of hiring women. If you want to learn more in this area, you might start with this paper on teaching statistics.
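Putting illustrative numbers on the Company X example makes the point plain. Assume (these figures are mine, chosen to match the percentages above) 1,000 applicants and 100 hires:

```python
# Base-rate check on the hypothetical Company X, with assumed totals:
# 1,000 applicants (10% female) and 100 hires (20% female workforce).
applicants_female, applicants_male = 100, 900
hires_female, hires_male = 20, 80

female_hire_rate = hires_female / applicants_female  # 0.20: 20% of women hired
male_hire_rate = hires_male / applicants_male        # ~0.089: ~9% of men hired

# Judged against the applicant pool, women were hired at more than twice
# the rate of men; the 20% workforce figure alone is misleading.
```

The lesson generalizes: a raw outcome percentage means little until it is compared against the relevant base rate.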
In his 1974 Caltech commencement address, the great physicist Richard Feynman talked about the scientific method as the best means to achieve progress. Even so, notice what he emphasizes: “The first principle is that you must not fool yourself–and you are the easiest person to fool.” The examples above make Feynman’s point. It’s easy to fool ourselves, especially when we want to be fooled – we all really like to be right and have a vested interest in our supposed rightness. If we are going to be data-driven (and that’s a very good thing), we need to check our work and our biases very carefully, precisely because we suck at math and are even worse at probability.