
A Mind at Play - Jimmy Soni and Rob Goodman ****

If you are familiar with the history of computing, there are a few names that you'll know well enough biographically to turn them into real people. Babbage and Lovelace, Turing and von Neumann, Gates and Jobs. But there's one of the greats who may conjure up nothing more than a name - Claude Shannon. If Jimmy Soni and Rob Goodman get this right, we're going to get to know him a lot better - and get a grip on his information theory, which sounds simple in principle, but can be difficult to get your head around.

If you haven't heard of Claude Shannon, you ought to have. He was responsible for two key parts of the theoretical foundations that lie beneath the computing and internet technology most of us use every day. Arguably, without Shannon's theory, for example, it would be impossible to slump down in front of Netflix and watch a video on demand.

I suspect one reason that Shannon's work is less familiar than it should be is that it lies buried deep in the ICT architecture. I was primarily a programmer for a number of years, but as someone writing applications - programs for people to use - I didn't have to give any thought to Shannon's theories. They were embodied by engineers at a lower level than I ever needed to access. In fact, I'm ashamed to say that when I was programming, though I could give you chapter and verse on Bill Gates, I'd never heard of Shannon, even though he was still alive back then.

What Soni and Goodman do really well is to give us a feel for Shannon, the man. The writing has an impressive ability to put us into the home town of Claude Shannon, or the corridors of Bell Labs as he rides his unicycle along them. At first glance, Shannon might seem quite similar to Richard Feynman in his combination of playfulness with amazing insight. But it soon becomes clear that Shannon was a far less likeable character - more introverted, dismissive of those he considered intellectually inferior, and with no real interest in helping his country in the war or with codebreaking unless he was offered something he found mentally stimulating. Soni and Goodman seem to find his obsession with juggling, unicycles and building strange contraptions endearing, but I'm not sure that's really how it comes across.

I am giving this book four stars for the biographical side, which works very well, but there are some issues. One is hyperbole - there is no doubt that Shannon was a genius who made a huge contribution to our understanding of information, but we really don't need to be told how incredible he was quite as often as this book tells us. At one point he is compared with Einstein - with Einstein arguably coming across as the less significant of the two - but this seems to miss that part of Einstein's genius was the breadth of his work, from statistical mechanics through relativity to quantum physics. While Shannon's personal interests were broad, his important work lacked that range.

The bigger issue was that I had hoped for a scientific biography, but I only really got a biography with a bit of science thrown in. The coverage of Shannon's information theory was (ironically) rarely very informative. I would have loved to have had the same level of exploration of the theory as we get of the person - but it's just not there. Of course, the theory isn't ignored, with a few pages given to each of the two big breakthroughs - but there could have been a whole lot more to make what can be a difficult concept more accessible.

I ought to stress that using the term hyperbole should not in any sense be taken to diminish the importance of Shannon's work. Hearing of Shannon's initial inspiration - that logic and electrical circuitry are equivalent - comes across rather like Darwin's (and Wallace's) inspiration on evolution by natural selection. It appears blindingly obvious once you are told about it, but it took a long time for anyone to see it - and it's hugely important. Shannon's second big step, which provides a generalised model for information transmission in the presence of noise and makes the whole understanding of information communication mathematical, was inspirational and up there with Turing's universal computer. What's more, it has applications well outside the IT world in the way it provides a link between information and entropy. If there were a maths Nobel prize, as Soni and Goodman suggest, Shannon definitely should have won one.
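To give a flavour of what the book only skims - and to be clear, this is my own sketch rather than anything spelled out by Soni and Goodman - Shannon measured the information produced by a source as its entropy, and showed that a noisy channel has a hard capacity ceiling below which essentially error-free communication is still possible:

$$H(X) = -\sum_{x} p(x)\,\log_2 p(x) \qquad\qquad C = B \log_2\!\left(1 + \frac{S}{N}\right)$$

The first puts a number, in bits, on how much information a source generates; the second, the Shannon-Hartley form of his capacity result, gives the number of bits per second a channel of bandwidth B and signal-to-noise ratio S/N can carry - which is ultimately why video on demand is possible at all.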

This is a man we needed to find out more about - and we certainly do. I just wish there had been more detail of the science in there too.



Review by Brian Clegg
