We are bad at making predictions. Often no better than a chimpanzee throwing a dart. Whether it's predicting how well a stock will perform or when we'll launch a new product, predicting the future is very difficult.
We discussed this problem in this newsletter a few weeks ago.
It’s not just “normal” people, though. Even “experts” have a dismal record of forecasting the future. Why is that? And what can make us better?
These are some questions that Philip Tetlock and Dan Gardner explore in their book Superforecasting: The Art and Science of Prediction.
Overview
Superforecasting is based on the findings of the Good Judgment Project, a large-scale research study in which thousands of ordinary people were asked to make predictions about world events over the course of many years. The study revealed that a small subset of participants, whom the authors call "superforecasters," consistently make more accurate predictions than others, including experts with years of experience.
The book examines what sets these superforecasters apart, exploring their habits, thinking styles, and techniques. It argues that all of us can improve our forecasting abilities by adopting the right mindset and methods. It also emphasizes the importance of scrutinizing assumptions, adopting an outside perspective, gathering diverse information, being willing to change one's mind, and continuously refining predictions based on new evidence.
So what are some of the key takeaways we can apply?
Takeaways
Thinking in Probabilities
Superforecasters think in terms of probabilities rather than certainties. They avoid binary thinking (i.e., something will or will not happen) and instead assign likelihoods to different outcomes.
For example, instead of saying "there will be a recession next year," a superforecaster might say "there's a 35% chance of a recession next year." This allows them to adjust their predictions as new information becomes available.
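This kind of revision can be made mechanical with Bayes' rule. Here is a minimal sketch, not from the book, with made-up numbers: a 35% prior on a recession, updated after observing a (hypothetical) piece of evidence that is twice as likely in pre-recession years.

```python
# A sketch of revising a probabilistic forecast with Bayes' rule.
# All numbers are illustrative, not from the book.

def bayes_update(prior: float, p_evidence_if_true: float, p_evidence_if_false: float) -> float:
    """Return P(event | evidence) given a prior P(event)."""
    numerator = p_evidence_if_true * prior
    denominator = numerator + p_evidence_if_false * (1 - prior)
    return numerator / denominator

# Start with a 35% chance of a recession next year.
forecast = 0.35

# Suppose a warning signal appears that (hypothetically) shows up in 60%
# of pre-recession years but only 30% of other years.
forecast = bayes_update(forecast, p_evidence_if_true=0.6, p_evidence_if_false=0.3)
print(f"Updated forecast: {forecast:.0%}")  # Updated forecast: 52%
```

The point is not the arithmetic but the habit: the forecast moves from 35% to roughly 52%, rather than flipping from "no" to "yes."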
Thinking in probabilities isn’t natural, however. Our minds haven’t evolved to handle probabilities, as the book explains. We’ve evolved with a binary setting of “yes, that’s a lion and I need to run” or “no, that’s not a lion.” And sometimes a third setting of “maybe it’s a lion, so I’ll stay alert.”
The book gives the example of US intelligence and President Obama assessing the likelihood of Osama bin Laden being at a compound in Pakistan in 2011. While the intelligence analysts were all assessing the probabilities based on the data (anywhere from 30% to 80% likelihood), some leaders wanted certainty (100%) or they’d default to maybe (50%). They weren’t comfortable with a range of probabilities.
I’ve often found this to be the case as well. While we’re not making decisions at the same level as President Obama, many of the leaders I work with struggle when I talk in confidence levels or probabilities.

Much like the leaders weighing the intelligence data, corporate leaders crave certainty. When we say there is a 60% likelihood of delivery based on what we know, they’ll hear either a guaranteed delivery (it’s more than 50%) or a guaranteed miss (it’s not 100%).

Unfortunately, we can never reduce uncertainty to zero. But by making it explicit, and by thinking in probabilities, we can change how we operate and help others understand that we are dealing with uncertainty.
This is one of the key lessons in the book. Thinking and talking in probabilities changes how we work, how we forecast, and how we address uncertainty.
Active Open-Mindedness
Being open to changing one's mind is another key trait of superforecasters. They actively seek information that challenges their existing beliefs and are less prone to confirmation bias.
We’ve often talked about open-mindedness in this newsletter.
In the book, the authors give the example of flipping a question around to avoid confirmation bias. When superforecasters were asked whether South Africa would grant a visa to the Dalai Lama, they approached the question as asked, but also turned it around: "What is the probability that South Africa will not grant a visa to the Dalai Lama?"
We need to not only be open-minded, but to seek out views that challenge our own. Rather than looking for data that confirms what we believe, we should look for data that disproves our beliefs or hypotheses. If we can’t disprove them, we have a much stronger case in their favor.
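One way to operationalize the question-flipping trick is a coherence check: estimate the question both ways and see whether the two answers sum to roughly 1. This is a hypothetical sketch with made-up numbers, not a method from the book; a large gap suggests the framing, rather than the evidence, is driving the estimate.

```python
# Sketch: ask a question both ways and check that the two estimates are
# coherent (they should sum to ~1). Numbers are made up for illustration.

def framing_gap(p_yes: float, p_no: float) -> float:
    """How far the direct and flipped estimates are from summing to 1."""
    return abs((p_yes + p_no) - 1.0)

# "Will South Africa grant the visa?" vs. "Will it NOT grant the visa?"
p_grant = 0.30      # estimate when asked the question directly
p_not_grant = 0.55  # estimate when the question is flipped

gap = framing_gap(p_grant, p_not_grant)
print(f"Framing gap: {gap:.2f}")
```

A gap of 0.15 here signals that flipping the question surfaced doubt the direct framing hid, which is exactly the confirmation bias the exercise is meant to catch.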