
A Mind at Play - Jimmy Soni and Rob Goodman ****

If you are familiar with the history of computing, there are a few names that you'll know well enough biographically to turn them into real people. Babbage and Lovelace, Turing and von Neumann, Gates and Jobs. But there's one of the greats who may conjure up nothing more than a name - Claude Shannon. If Jimmy Soni and Rob Goodman get this right, we're going to get to know him a lot better - and get a grip on his information theory, which sounds simple in principle, but can be difficult to get your head around.

If you haven't heard of Claude Shannon, you ought to have. He was responsible for two key parts of the theoretical foundations that lie beneath the computing and internet technology most of us use every day. Arguably, without Shannon's theory, for example, it would be impossible to slump down in front of Netflix and watch a video on demand.

I suspect one reason that Shannon's work is less familiar than it should be is that it lies buried deep in the ICT architecture. I was primarily a programmer for a number of years, but as someone writing applications - programs for people to use - I didn't have to give any thought to Shannon's theories. They were embodied by engineers at a lower level than I ever needed to access. In fact, I'm ashamed to say that when I was programming, though I could give you chapter and verse on Bill Gates, I'd never heard of Shannon, even though he was still alive back then.

What Soni and Goodman do really well is give us a feel for Shannon the man. The writing has an impressive ability to put us into the home town of Claude Shannon, or the corridors of Bell Labs as he rides his unicycle along them. At first glance, Shannon might seem quite similar to Richard Feynman in his combination of playfulness and amazing insight. But it soon becomes clear that Shannon was a far less likeable character: more introverted, dismissive of those he considered intellectual inferiors, and with no real interest in helping his country in the war or with codebreaking, undertaking such work only if he was offered something he found mentally stimulating. Soni and Goodman seem to find his obsession with juggling, unicycles and building strange contraptions endearing, but I'm not sure that's really how it comes across.

I am giving this book four stars for the biographical side, which works very well, but there are some issues. One is hyperbole - there is no doubt that Shannon was a genius and made a huge contribution to our understanding of information, but we really don't need to be told how incredible he was quite as often as this book does. At one point he is compared with Einstein, with Einstein arguably coming across as the less significant of the two. This seems to miss that part of Einstein's genius was the breadth of his work, from statistical mechanics through relativity to quantum physics. While Shannon's personal interests were broad, his important work lacked that range.

The bigger issue was that I had hoped for a scientific biography, but I only really got a biography with a bit of science thrown in. The coverage of Shannon's information theory was (ironically) rarely very informative. I would have loved to have had the same level of exploration of the theory as we get of the person - but it's just not there. Of course, the theory isn't ignored, with a few pages given to each of the two big breakthroughs - but there could have been a whole lot more to make what can be a difficult concept more accessible.

I ought to stress that using the term hyperbole should not in any sense diminish the importance of Shannon's work. Hearing of Shannon's initial inspiration - that logic and electrical circuitry were equivalent - comes across rather like Darwin's (and Wallace's) inspiration on evolution by natural selection. It appears blindingly obvious once you are told about it, but it took a long time for anyone to see it - and it's hugely important. Shannon's second big step, which provides a generalised model for information transmission in the presence of noise and puts the whole understanding of information communication on a mathematical footing, was inspirational - up there with Turing's universal computer. What's more, it has applications well outside the IT world in the way it provides a link between information and entropy. If there were a maths Nobel prize, as Soni and Goodman suggest, Shannon definitely should have won one.
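For readers who want a taste of why that information-entropy link matters, here is a minimal sketch (mine, not from the book) of Shannon's entropy formula, H = -Σ p·log₂(p), which measures the average information, in bits, carried by each symbol from a source:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H = -sum(p * log2(p)), measured in bits.

    Zero-probability outcomes contribute nothing, so they are skipped.
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally unpredictable: one full bit per toss.
print(shannon_entropy([0.5, 0.5]))   # 1.0

# A heavily biased coin is more predictable, so each toss
# carries less information (roughly 0.47 bits).
print(shannon_entropy([0.9, 0.1]))
```

The same quantity sets the limit on how far a message can be losslessly compressed, which is exactly why it sits beneath everything from ZIP files to video streaming.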

This is a man we needed to find out more about - and we certainly do. I just wish there had been more detail of the science in there too.



Review by Brian Clegg

