Statisticians rarely become superstars, but Nate Silver is getting close. This is the guy who writes the FiveThirtyEight.com blog for the New York Times and has correctly predicted the outcome of the last two presidential elections in virtually every one of the 50 states. But Silver is no political maven weaned on election trivia at his parents’ dinner table: he earned his stripes as a prognosticator by supporting himself playing Internet poker and by going Billy Beane of the Oakland A’s (of Moneyball fame) one better, developing an even more sophisticated statistical analysis of what it takes to win major league baseball games. And, by the way: Silver is just 34 years old as I write this post.
The Signal and the Noise: Why So Many Predictions Fail — But Some Don’t by Nate Silver
★★★★★ (5 out of 5)
The Signal and the Noise is Silver’s first book, and what a book it is! As you might expect from this gifted enfant terrible, the book is as ambitious as it is digestible. Written in an easy, conversational style, The Signal and the Noise explores the ins and outs of predicting outcomes not just in politics, poker, and sports (baseball and basketball) but also in the stock market, the economy, and the 2008 financial meltdown, weather forecasting, earthquakes, epidemic disease, chess, climate change, and terrorism.
Fundamentally, The Signal and the Noise is about the information glut we’re all drowning in now and how an educated person can make a little more sense out of it. As Silver notes, “The instinctual shortcut we take when we have ‘too much information’ is to engage with it selectively, picking out the parts we like and ignoring the remainder, making allies with those who have made the same choices and enemies of the rest.” What else could explain why Mitt Romney was “shell-shocked” and Karl Rove was astonished by Romney’s loss in a presidential election that every dispassionate observer knew was going Obama’s way?
Silver asserts that “our predictions may be more prone to failure in the era of Big Data. As there is an exponential increase in the amount of available information, there is likewise an exponential increase in the number of hypotheses to investigate . . . But the number of meaningful relationships in the data . . . is orders of magnitude smaller. Nor is it likely to be increasing at nearly so fast a rate as the information itself; there isn’t any more truth in the world than there was before the Internet or the printing press. Most of the data is just noise, as most of the universe is filled with empty space.”
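Silver’s point about hypotheses multiplying faster than truths can be made concrete with a little arithmetic of my own (an illustration, not a calculation from the book): if you track n data series, the number of candidate pairwise relationships you could test grows roughly quadratically, while the number of genuinely meaningful relationships stays fixed.

```python
# Illustrative sketch: the pool of testable pairwise hypotheses among
# n data series is n*(n-1)/2, which explodes as n grows even though
# the number of true relationships does not.
def candidate_pairs(n):
    """Number of distinct pairs among n data series."""
    return n * (n - 1) // 2

for n in (10, 100, 1000):
    print(n, candidate_pairs(n))
```

Ten series give you 45 pairs to test; a thousand give you nearly half a million, almost all of them noise.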
Sadly, it’s not just in politics that bias clouds judgment and leads to erroneous conclusions. “In 2005, an Athens-raised medical researcher named John P. Ioannidis published a controversial paper titled ‘Why Most Published Research Findings Are False.’ The paper studied positive findings documented in peer-reviewed journals: descriptions of successful predictions of medical hypotheses carried out in laboratory experiments. It concluded that most of these findings were likely to fail when applied in the real world. Bayer Laboratories recently confirmed Ioannidis’s hypothesis. They could not replicate about two-thirds of the positive findings claimed in medical journals when they attempted the experiments themselves.”
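The logic behind Ioannidis’s claim is essentially Bayesian, and a back-of-the-envelope version is easy to run. The numbers below are my own illustrative assumptions (a 5% false-positive threshold, 80% statistical power, and a guess at how many tested hypotheses are actually true), not figures from the paper:

```python
# A rough sketch of why "most published findings" can be false:
# when only a small share of tested hypotheses are true, the false
# positives from the many false hypotheses swamp the true positives.
def share_of_positives_that_are_true(prior, alpha=0.05, power=0.8):
    true_pos = prior * power          # true hypotheses that test positive
    false_pos = (1 - prior) * alpha   # false hypotheses that slip through anyway
    return true_pos / (true_pos + false_pos)

print(share_of_positives_that_are_true(0.10))  # ~0.64
print(share_of_positives_that_are_true(0.01))  # ~0.14
```

If one in ten tested hypotheses is true, about a third of the “positive” findings are false; if only one in a hundred is true, roughly six in seven are false, even with everyone following the rules.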
In general, Silver’s thesis runs, “We need to stop, and admit it: we have a prediction problem. We love to predict things — and we aren’t very good at it. . . We focus on those signals that tell a story about the world as we would like it to be, not how it really is. We ignore the risks that are hardest to measure, even when they pose the greatest threats to our well-being. We make approximations and assumptions about the world that are much cruder than we realize. We abhor uncertainty, even when it is an irreducible part of the problem we are trying to solve.”
There’s more: Silver relates the work of a UC Berkeley psychology and political science professor named Philip Tetlock, who categorizes experts as either foxes or hedgehogs (a nod to the ancient Greek poet Archilochus, who wrote, “The fox knows many little things, but the hedgehog knows one big thing.”). Hedgehogs traffic in Big Ideas and often hew to ideologies; these are the people who talk to the press and are frequently found on TV talk shows. Foxes are cautious types who carefully examine and weigh details before reaching conclusions. Not surprisingly, Tetlock found that “The more interviews that an expert had done with the press . . . the worse his predictions tended to be.”
In other words: be afraid. Be very afraid. If the people who supposedly know what they’re talking about often really don’t, how can the rest of us figure out what’s going on?
For further reading
You might also enjoy Science explained in 10 excellent popular books (plus dozens of others)
If you enjoy reading nonfiction in general, you might also enjoy: