My background is in Operational Research, which includes a lot on forecasting and mathematical prediction, so I was slightly disappointed that this isn't really covered here. Instead the book mostly gives us ways that we instinctively get predictions wrong, so it's arguably more a psychology book than a mathematical one. There have been quite a few others that tread the path of uncovering our biases, for example with a mathematical approach in Jordan Ellenberg's How Not to Be Wrong and with a more psychological twist in Richard Nisbett's Mindware. But Yates has a particular focus on our tendency to assume linearity - that things will broadly continue the way they always have. By bringing in plenty of examples where this isn't the case - as is very often true in reality - including chaotic systems, he gives us a fresh viewpoint.
For me, the best chapter was 'Reading between the lines', where Yates focuses most directly on non-linearity and really unpacks what's happening in some real-world examples. There were plenty of interesting examples and observations in the other chapters too - but I did have a few issues.
Occasionally Yates makes a statement that is hard to back up. Some of this, as is often the case with academics dipping a toe into popular science, was on historical matters - we are told 'It was well into the Middle Ages before the spherical view of the world became the predominant theory.' This just isn't true. I think he is also wrong about the millennium bug, which he presents as an example of a self-defeating prophecy: because of all the effort that was put in, there were few big problems, so people thought it was overhyped. I was consulting for the IT department of a global company at the time, and the reality was more nuanced. The analysis was that the bug genuinely was overhyped, in that far too much was spent on checking non-critical systems that could have failed relatively painlessly, when a more effective approach would have been to check only mission- or safety-critical systems and leave the rest to fail and be fixed if necessary.
On other occasions, Yates fails to provide enough explanation. For example, he introduces Benford's law (the counterintuitive observation that in many real-world datasets around 30 per cent of values have 1 as their leading digit) without telling us why it occurs. Some of the material was a little dull - I was particularly disappointed with the chapter on game theory, which failed to capture the intriguing nature of the subject and didn't explain enough for the reader to get their head around what was going on. Bearing in mind a lot of the book is based on psychological research, I was really surprised there was no mention of the replication crisis (surely in itself demonstrating a glaring lack of ability to predict the future) - I would be surprised if none of the studies he cites have failed to replicate, or if none were based on far too small a sample to be meaningful. At the very least, this should be discussed in a book based on such studies.
The linearity bias isn't the only one that Yates covers, though most of the others mentioned tie into it. As is always the case with books like this, it proved very interesting to read about, but I very rapidly forgot (again) what all the biases are, and found it difficult to think of practical applications of what I'd read. It's fine if you are a business or government wanting to deal with uncertainty (though even there, the book isn't a practical guide), but I think it's very unlikely to make much difference to the way we go about making predictions in our everyday lives, beyond 'don't bother'.
Overall, this is an interesting topic, and Yates takes a novel approach and does a good job of getting the reader to appreciate the dangers of assuming linearity. The book has a few issues, but is still well worth a read.
Review by Brian Clegg