
Superforecasting

by Philip E. Tetlock

A Summary by StoryShots

The illusion of knowledge is more dangerous than ignorance.

Introduction

Most expert predictions fail spectacularly. Economists miss recessions. Political pundits botch elections. Intelligence analysts miscalculate wars. Yet a small group of amateur forecasters consistently crushes professional analysts. That's the core finding of Superforecasting by Philip E. Tetlock and Dan Gardner. Certain people predict the future with shocking accuracy, and the techniques they use have nothing to do with genius.

Why Expert Predictions Fail More Often Than Random Chance

Experts with PhDs and security clearances perform barely better than random guessing in geopolitical forecasts. A twenty-year study tracked 82,361 predictions from political experts. The results were devastating: the more famous the expert, the worse their predictions. Experts fall in love with single narratives. They build careers on one big idea, then force every new event into that framework. The study also revealed something unexpected: a tiny subset of forecasters were significantly better than the experts, and they stayed better year after year. "The average expert was roughly as accurate as a dart-throwing chimpanzee." The difference came down to how they thought, not what they knew.
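Forecasting tournaments like Tetlock's typically grade predictions with the Brier score, the mean squared error between probability forecasts and what actually happened. The summary doesn't spell out the scoring rule, so this is a minimal sketch of the standard metric, with made-up numbers:

```python
def brier_score(forecasts, outcomes):
    """Mean squared error between probability forecasts (0..1) and
    binary outcomes (0 or 1). Lower is better; 0 is perfect."""
    return sum((f - o) ** 2 for f, o in zip(forecasts, outcomes)) / len(forecasts)

# A hedgehog who says "90% certain" on three events and is right once
# scores worse than a fox who admits 50/50 uncertainty on the same events.
hedgehog = brier_score([0.9, 0.9, 0.9], [1, 0, 0])
fox = brier_score([0.5, 0.5, 0.5], [1, 0, 0])
```

The score rewards calibration, not confidence: being loudly wrong costs more than being honestly unsure.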

The Foxes Beat the Hedgehogs Every Time

Thinkers divide into two categories. Hedgehogs know one big thing: a single grand theory they apply to everything. Foxes know many small things: they draw from multiple perspectives and constantly update their beliefs. Foxes destroyed hedgehogs in forecasting tournaments. When oil prices dropped, hedgehogs doubled down on existing theories. Foxes asked what evidence would make them change their minds, then changed them. The fox advantage comes from intellectual humility. They treat beliefs as hypotheses, not identities. "Beliefs are hypotheses to be tested, not treasures to be guarded." The method foxes use to update those beliefs is shockingly mechanical.
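The fox habit of treating beliefs as hypotheses maps naturally onto Bayesian updating: start with a prior, then revise it in proportion to how strongly new evidence favors one hypothesis over the other. This is an illustrative sketch, not a method from the book:

```python
def bayes_update(prior, p_evidence_if_true, p_evidence_if_false):
    """Return the posterior probability of a belief after observing
    evidence, given how likely that evidence is under each hypothesis."""
    numerator = prior * p_evidence_if_true
    denominator = numerator + (1 - prior) * p_evidence_if_false
    return numerator / denominator

# A fox holds a 60% belief, then sees evidence that is twice as likely
# if the belief is false (0.6) as if it is true (0.3) -- and updates down.
posterior = bayes_update(0.60, 0.3, 0.6)
```

The hedgehog failure mode, in these terms, is refusing to let `p_evidence_if_false` ever exceed `p_evidence_if_true`.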

The Fermi Estimation Method Turns Guesses into Forecasts

Superforecasters break impossible-sounding questions into solvable pieces. Asked to predict the likelihood of a coup in Thailand within six months, they don't guess. They ask: How many coups has Thailand had historically? What conditions preceded them? Are those conditions present now? They break the problem into base rates, reference classes, and specific evidence adjustments. This is Fermi estimation, named after physicist Enrico Fermi, who could estimate the number of piano tuners in Chicago by breaking the question into population, household pianos, and tuning frequency. The best forecasters update their predictions constantly as new information emerges. That mental habit separates amateur accuracy from expert failure. "Foxes update. Hedgehogs defend."
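Fermi's piano-tuner decomposition can be written out in a few lines. Every number below is an illustrative assumption, not a figure from the book; the point is that multiplying several rough estimates beats one wild guess:

```python
# Assumed inputs -- each a rough, defensible estimate, not a known fact.
chicago_population = 2_700_000
people_per_household = 2.5
piano_ownership_rate = 1 / 20        # assume 1 in 20 households owns a piano
tunings_per_piano_per_year = 1
tunings_per_tuner_per_day = 4
working_days_per_year = 250

# Chain the estimates: demand for tunings vs. one tuner's annual capacity.
households = chicago_population / people_per_household
pianos = households * piano_ownership_rate
tunings_needed = pianos * tunings_per_piano_per_year
tuner_capacity = tunings_per_tuner_per_day * working_days_per_year
estimated_tuners = tunings_needed / tuner_capacity  # on the order of dozens
```

Errors in the individual guesses tend to partially cancel, which is why the chained estimate usually lands within an order of magnitude of reality.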

Final Summary

The ten-step method that superforecasters use to unpack any prediction lives in the full version, along with the precise questions they ask before committing to any forecast. Superforecasting also reveals the psychological traps that destroy accuracy even when you're using the right method: scope insensitivity, availability bias, and the planning fallacy. This book is essential reading for anyone who makes decisions under uncertainty: strategists, investors, founders, or anyone tired of being blindsided by events they should have seen coming. We are putting together the full summary of Superforecasting by Philip E. Tetlock and Dan Gardner right now, with a visual infographic and animated video. You can follow the book in the StoryShots app to get it the moment it is ready.

Want a More Detailed Summary?

We don't have a detailed summary for "Superforecasting" yet. Vote for this book in the StoryShots app to help us prioritize creating a full summary with PDF, animations, and infographics!
