
How to Expect the Unexpected - Kit Yates ****

The topic here is one everyone is interested in - getting a better handle on the future, and it's an interesting read. Arguably Kit Yates' title is a touch misleading. This isn't a 'how to' book - after reading it, you won't be any better at doing anything, but you may be less likely to make some popular errors.

My background is in Operational Research, which includes a lot on forecasting and mathematical prediction, so I was slightly disappointed that this isn't really covered here. Instead it gives us mostly ways that we instinctively get predictions wrong, so it's arguably more a psychology book than a mathematical one. There have been quite a few others that tread the path of uncovering our biases, for example with a mathematical approach in Jordan Ellenberg's How Not to be Wrong and with a more psychological twist in Richard Nisbett's Mindware. But Yates has a particular focus on our tendency to assume linearity - that things will broadly continue the way they always have. By bringing in plenty of examples where this isn't the case - as is very often true in reality - including chaotic systems, he gives us a fresh viewpoint.

For me, the best chapter was 'Reading between the lines', where Yates focuses most directly on non-linearity and really unpacks what's happening in some real-world examples. There were plenty of interesting examples and observations in other chapters too - but I did have a few issues.

Occasionally Yates makes a statement that is hard to back up. Some of this, as is often the case with academics dipping a toe into popular science, was on historical matters - we are told 'It was well into the Middle Ages before the spherical view of the world became the predominant theory.' This just isn't true. I think he is also wrong about the millennium bug, calling it a self-defeating impact from predictions. The idea is that because of all the effort that was put in, there were few big problems, so people thought it was overhyped. I was consulting for the IT department of a global company at the time, and the reality was far more nuanced - the analysis was that it genuinely was overhyped, in that far too much was spent on checking non-critical systems that could have failed relatively painlessly, where a more effective approach would have been only to check mission- or safety-critical systems and leave the rest to fail and be fixed if necessary.

On other occasions, Yates provides too little explanation. For example, he introduces Benford's law without telling us why it occurs. Some of the material was a little dull - I was particularly disappointed with the chapter on game theory, which failed to capture the intriguing nature of the subject and didn't explain enough for the reader to get their head around what was going on. Bearing in mind a lot of the book is based on psychological research, I was really surprised there was no mention of the replication crisis (surely in itself demonstrating a glaring lack of ability to predict the future) - I would be surprised if some of the studies he cites haven't failed to replicate, or weren't based on far too small a sample to be meaningful. At the very least, this should be discussed in a book based on such studies.
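As an aside (this is my illustration, not the book's): Benford's law says that in many naturally occurring datasets the leading digit d appears with probability log10(1 + 1/d), so about 30% of values start with 1 and under 5% start with 9. Anything that grows multiplicatively shows the pattern - powers of 2 are the classic demonstration, as this quick sketch confirms:

```python
import math
from collections import Counter

# Benford's law: leading digit d occurs with probability log10(1 + 1/d).
# Powers of 2 follow it because log10(2) is irrational, so the fractional
# parts of n*log10(2) spread evenly over [0, 1) - and the fractional part
# determines the leading digit.
N = 10000
counts = Counter(str(2 ** n)[0] for n in range(1, N + 1))

for d in range(1, 10):
    observed = counts[str(d)] / N
    predicted = math.log10(1 + 1 / d)
    print(f"digit {d}: observed {observed:.3f}, Benford predicts {predicted:.3f}")
```

Running this, the observed frequencies sit within a percentage point or so of the predicted ones - roughly 0.301 for a leading 1 down to roughly 0.046 for a leading 9.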

The linearity bias isn't the only one that Yates covers, though most of the ones mentioned tie into it. As is always the case with books like this, it proved very interesting to read about, but I very rapidly forgot what all the biases are (again), and found it difficult to think of practical applications of what I've read. It's fine if you are a business or government wanting to deal with uncertainty (though even there, the book isn't a practical guide), but I think it's very unlikely to make much difference to the way we go about making predictions about the future in our everyday lives, beyond 'don't bother'.

Overall, this is an interesting topic and Yates presents a novel approach and does a good job of getting the reader to appreciate the dangers of relying on linearity. The book does have a few issues, but is still well worth a read.

Review by Brian Clegg

