Among the many peculiar word constructions former Secretary of Defense Donald Rumsfeld is remembered for, there is one that applies to all forecasts of the future: the “unknown unknowns.” In his words, they are “the ones we don’t know we don’t know.” That phrase, incidentally, was already commonly used in the Defense Department and elsewhere in its shortened version, “unk-unk.” It refers to an uncertainty that is completely unanticipated, so that it can’t be factored into an analysis used to make a decision.
Unfortunately for those who participate in the investment markets, it is precisely the unk-unks that guide the outcomes. “It’s tough to make predictions, especially about the future,” is an apt saying attributed to famous baseball player and coach Yogi Berra. Even if he wasn’t the first to say it, its validity stands. Several studies of predictions by economists and security analysts have shown that their forecasts—based on expert manipulation of the best available evidence—performed no better in telling us the course of events than a simple coin flip.
As the advisors to the Tweedy, Browne mutual funds noted in their recent annual report, the eminent Financial Times of London carried blaring headlines in the spring and early summer of 2006 predicting a dire outcome for the stock market that year. Even as the last of those headlines appeared in July 2006, the market was already starting a new upswing that would lead to double-digit gains for the year. “And this is but a small sample of the volumes of advice investors have been and continue to be subjected to on a daily basis,” Tweedy, Browne’s managers wrote.
Another saying attributed to Berra spoke precisely to unexpected outcomes: “The future ain’t what it used to be.” President Bush once used the malapropism “misunderestimated,” a cross between misunderstood and underestimated. It is not a bad word, because it accurately describes our innate limitations in estimation, a skill vital to any ability to predict.
Nassim Nicholas Taleb discussed this phenomenon in his recent book, The Black Swan: The Impact of the Highly Improbable (Random House, 2007). Studies of how people estimate probabilities, coupled with how confident they are of their abilities to do so, have shown “we overestimate what we know, and underestimate uncertainty, by compressing the range of possible uncertain states,” Taleb wrote. In other words, our predictive minds can’t grasp the range of potential outcomes, and we almost certainly miss the rare but potentially surprising or disastrous occurrences that come along to upset our apple carts.
Taleb offers another reason for our inability to predict unusual events: we are used to the familiar, the things we have experienced, and have trouble with the abstract things we have not—but could—come across. “Randomness and uncertainty are abstractions,” he wrote. “We respect what has happened, ignoring what could have happened.” If that is the case, the best thing to do is lay off predictions altogether and be prepared for the best and the worst at all times.