A few years ago, I read Michael Mauboussin’s “The Success Equation: Untangling Skill and Luck in Business, Sports, and Investing.” It remains one of my dozen favorite investment-related books. I finally got around to reading his “Think Twice: Harnessing the Power of Counterintuition,” which examines how we make—and all too often mismanage—decisions (including, but not limited to, investment decisions). He explains that mistakes are made because we fall victim to simplified mental routines that prevent us from coping with complex realities. He also shows us how to recognize and avoid common mental errors.
In the introduction, Mauboussin highlights the importance of understanding that we often have to make decisions in the face of uncertainty, where, at best, we can only estimate the odds of potential outcomes. He writes that, sadly, especially when it comes to investments, too many judge the quality of a strategy based solely on the outcome—the strategy was good if the outcome was favorable. Instead, he says, we should judge the quality of the strategy by the quality of the decision-making process; and we must understand that quality decisions can lead to poor outcomes, and poor decisions can lead to good (but lucky) outcomes. The bottom line, he explains, is that “in a probabilistic environment, you are better served by focusing on the process by which you make a decision than on the outcome.” Developing this habit, he continues, “opens a world of insight into decision making.”
Each of the eight chapters focuses on common, identifiable issues and the mistakes associated with them, and also shows that the mistakes are preventable.
Chapter 1: While we often view our problem as unique, others have usually faced the same problem. We can learn from the results of their decisions. If you want to know how something is going to turn out for you, look at how it turned out for others. For example, in corporate mergers and acquisitions, you can examine how similar deals have performed. Mistakes are often made because we are overconfident in our abilities, believing we are above average—the illusion of superiority. A related problem is that we tend to be too optimistic—we see our future as brighter than that of others. Another problem is that we believe chance events are subject to our control. It’s important to assess the potential dispersion of outcomes, not just the outcome you forecast.
Chapter 2: We fail to consider enough alternative options because we have models in our head that oversimplify the world. We are subject to “representativeness bias,” giving too much weight to the probability of something if we have seen it recently or if it is vivid in our mind. The availability bias encourages us to ignore alternatives, which, while allowing us to make quick decisions, often causes us to leave out alternative choices that could be better. Incentives and unconscious anchoring on irrelevant information contribute to this tunnel vision. To avoid these mistakes, explicitly consider alternatives, seek dissent, keep track of previous decisions, avoid making decisions when under emotional distress, and understand incentives.
Chapter 3: While the research on prediction skills shows that experts in every field are not very expert at predicting the future when uncertainty is involved—the “wisdom of crowds” is more reliable—we often have an uncritical reliance on experts. “Experts” are people like us and are subject to all the same biases and mistakes. The evidence shows conclusively that the collective is most often better than even the best individual—as long as the collective isn’t acting as a crowd (in other words, there is diversity of opinion). Suggested solutions include seeking diversity of opinion and using technology—computers and collectives remain underutilized guides for decision making.
Chapter 4: “Situation influences our decisions enormously,” Mauboussin writes. We underestimate how much we are influenced not only by others but by our own feelings. One of the many interesting stories in the book is of an experiment selling French and German wines. When French music was played, 77 percent of the wines sold were French, but when German music was played, 73 percent of the wines sold were German. Yet 86 percent of buyers denied being influenced by the music.
Chapter 5: Cause-and-effect reasoning can fail when systems are complex because the whole is greater than the sum of the parts. Focusing on why individuals in a system do something—an investor in the market, an ant in a colony or a bird in a flock—does not help explain how the entire system performs. Understand the rules that govern the entire system rather than the rules that drive the individual participants.