
A Mind at Play - Jimmy Soni and Rob Goodman ****

If you are familiar with the history of computing, there are a few names that you'll know well enough biographically to turn them into real people. Babbage and Lovelace, Turing and von Neumann, Gates and Jobs. But there's one of the greats who may conjure up nothing more than a name - Claude Shannon. If Jimmy Soni and Rob Goodman get this right, we're going to get to know him a lot better - and get a grip on his information theory, which sounds simple in principle, but can be difficult to get your head around.

If you haven't heard of Claude Shannon, you ought to have. He was responsible for two key parts of the theoretical foundations that lie beneath the computing and internet technology most of us use every day. Arguably, without Shannon's theory, for example, it would be impossible to slump down in front of Netflix and watch a video on demand.

I suspect one reason that Shannon's work is less familiar than it should be is that it lies buried deep in the ICT architecture. I was primarily a programmer for a number of years, but as someone writing applications - programs for people to use - I didn't have to give any thought to Shannon's theories. They were embodied by engineers at a lower level than I ever needed to access. In fact, I'm ashamed to say that when I was programming, though I could give you chapter and verse on Bill Gates, I'd never heard of Shannon, even though he was still alive back then.

What Soni and Goodman do really well is to give us a feel for Shannon, the man. The writing has an impressive ability to put us into the home town of Claude Shannon, or the corridors of Bell Labs as he rides his unicycle along them. At first glance, Shannon might seem quite similar to Richard Feynman in his combination of playfulness with amazing insight. But it soon becomes clear that Shannon was a far less likeable character - more introverted, dismissive of those he considered his intellectual inferiors, and with no real interest in helping his country with the war effort or with codebreaking unless the work offered him something he found mentally stimulating. Soni and Goodman seem to find his obsession with juggling, unicycles and building strange contraptions endearing, but I'm not sure that's really how it comes across.

I am giving this book four stars for the biographical side, which works very well, but there are some issues. One is hyperbole - there is no doubt that Shannon was a genius and made a huge contribution to our understanding of information, but we really don't need to be told how incredible he was quite as often as this book does. At one point he is compared with Einstein - with Einstein arguably coming across as the less significant of the two. This comparison seems to miss that part of Einstein's genius was the breadth of his work, from statistical mechanics through relativity to quantum physics. While Shannon's personal interests were broad, his important work lacked that range.

The bigger issue was that I had hoped for a scientific biography, but I only really got a biography with a bit of science thrown in. The coverage of Shannon's information theory was (ironically) rarely very informative. I would have loved to have had the same level of exploration of the theory as we get of the person - but it's just not there. Of course, the theory isn't ignored, with a few pages given to each of the two big breakthroughs - but there could have been a whole lot more to make what can be a difficult concept more accessible.

I ought to stress that using the term hyperbole should not in any sense reduce the importance of Shannon's work. Hearing of Shannon's initial inspiration - that logic and electrical circuitry are equivalent - comes across rather like Darwin's (and Wallace's) inspiration on evolution by natural selection. It appears blindingly obvious once you are told about it, but it took a long time for anyone to see it - and it's hugely important. Shannon's second big step, which provides a generalised model for information transmission in the presence of noise and makes the whole understanding of information communication mathematical, was inspirational and up there with Turing's universal computer. What's more, it has applications well outside the IT world in the way it provides a link between information and entropy. If there were a maths Nobel prize, as Soni and Goodman suggest, Shannon definitely should have won one.
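As a taster of what that mathematics looks like (the standard textbook formulation, not a quotation from the book): Shannon quantified the information content, or entropy, of a source whose symbols occur with probabilities $p_i$ as

$$H = -\sum_i p_i \log_2 p_i \quad \text{bits per symbol,}$$

and, for the classic case of a channel of bandwidth $B$ disturbed by Gaussian noise with signal-to-noise ratio $S/N$, showed that error-free communication is possible at any rate below the capacity

$$C = B \log_2\!\left(1 + \frac{S}{N}\right) \quad \text{bits per second.}$$

It is this second result that underpins everything from modems to streaming video.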

This is a man we needed to find out more about - and we certainly do. I just wish there had been more detail of the science in there too.



Review by Brian Clegg

