This book is about prediction and “Superforecasters” – people who are consistently better at forecasting than even above-average forecasters.
Turns out the biggest thing preventing you from becoming a Superforecaster is not your inability to do high school math, but rather your inability to let go of closely held beliefs.
People who are emotionally attached to their beliefs see things through a skewed lens (as we all do) but are then unable to course correct their predictions in light of new evidence.
And it turns out this frequent incremental updating is one of the key skills Superforecasters look to cultivate.
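The incremental updating Tetlock describes is essentially Bayesian: start with a probability, then nudge it up or down as evidence arrives instead of clinging to the original number. A minimal sketch (the function name and example numbers are mine, purely for illustration):

```python
def bayes_update(prior, p_evidence_if_true, p_evidence_if_false):
    """Return the posterior probability of a claim after seeing one
    piece of evidence, given how likely that evidence is under each
    hypothesis (Bayes' rule)."""
    numerator = prior * p_evidence_if_true
    denominator = numerator + (1 - prior) * p_evidence_if_false
    return numerator / denominator

# You believe a claim with 60% confidence; new evidence arrives that
# is twice as likely if the claim is true (80% vs 40%).
posterior = bayes_update(0.6, 0.8, 0.4)
print(round(posterior, 2))  # 0.75 - a modest nudge, not a flip-flop
```

The point of the exercise: the evidence moved the forecast from 60% to 75%, not to 100% or 0% – Superforecasters make many small updates like this rather than dramatic reversals.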
A Case Study – Me (not a Superforecaster):
This book confirmed three of my deeply held beliefs.
The first is that the national media is corrupt and evil. I’ve cultivated this belief by reading countless books on the subject, good luck changing my mind now.
Especially after this book revealed new information to me: the media actively avoids putting % chance predictions on vague assertions they make.
They use language like “a fair chance of…” or “more than likely” or “almost certainly” to justify whatever they’ve said afterwards no matter what happens – this allows them to get away with repeatedly overstating/understating the importance of issues to the public.
Showing a running prediction-accuracy score next to national news pundits while they’re speaking would be great (but they’ll never agree to that because they’re pure evil).
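This is why explicit percentages matter: once a pundit says “70%” instead of “a fair chance,” you can score them. Tetlock’s tournaments use the Brier score for this – the average squared gap between the forecast and what actually happened (0 is perfect, and hedging at 50% forever scores a mediocre 0.25). A minimal sketch, with made-up example forecasts:

```python
def brier_score(forecasts, outcomes):
    """Mean squared error between predicted probabilities (0.0-1.0)
    and actual outcomes (1 if the event happened, 0 if not).
    Lower is better; 0.0 is a perfect forecaster."""
    pairs = list(zip(forecasts, outcomes))
    return sum((p - o) ** 2 for p, o in pairs) / len(pairs)

# A confident pundit who is usually right vs. one who hedges at 50%:
sharp = brier_score([0.9, 0.8, 0.9], [1, 1, 0])
hedger = brier_score([0.5, 0.5, 0.5], [1, 1, 0])
print(round(sharp, 3))   # confident and mostly right
print(round(hedger, 3))  # 0.25 - the cost of never committing
```

Run this on a pundit’s vague “almost certainly” claims and you immediately see the problem: without a number attached, there’s nothing to plug in.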
The second is that decentralized command is an important leadership principle (Thanks Jocko)
The third is that all scientific knowledge is tentative.
Please recognize the massive risk I am taking for saying this in public.
There is a very long line of angry people (who didn’t do very well in high school science, consume a lot of IYI media in adulthood, and all of a sudden became scientists in the last 7 months) queuing up to give my advertisers a piece of their mind.
The worst part of this book is that Tetlock questions the infinite wisdom of Nassim Taleb.
It is difficult for me to contend with the idea that the Black Swan theory is overly complex (aren’t all events somewhat predictable beforehand?) while also deeply wanting to pay $3000 to attend his RWRI conference (with hopes that attending a week long lecture taught by people so smart I can’t understand them will undo the years of binge drinking and return me to the intellectual apex of society making me as smart and self righteous as I was in high school where I intellectually badgered the C student coronavirus experts who seek to cancel me).
If anyone wants to start a low stakes gambling ring on Predictit hmu
Book was good. Little too long (aren’t they all)