A Mind at Play - Jimmy Soni and Rob Goodman ****

If you are familiar with the history of computing, there are a few names that you'll know well enough biographically to turn them into real people. Babbage and Lovelace, Turing and von Neumann, Gates and Jobs. But there's one of the greats who may conjure up nothing more than a name - Claude Shannon. If Jimmy Soni and Rob Goodman get this right, we're going to get to know him a lot better - and get a grip on his information theory, which sounds simple in principle, but can be difficult to get your head around.

If you haven't heard of Claude Shannon, you ought to have. He was responsible for two key parts of the theoretical foundations that lie beneath the computing and internet technology most of us use every day. Arguably, without Shannon's theory, for example, it would be impossible to slump down in front of Netflix and watch a video on demand.

I suspect one reason that Shannon's work is less familiar than it should be is that it lies buried deep in the ICT architecture. I was primarily a programmer for a number of years, but as someone writing applications - programs for people to use - I didn't have to give any thought to Shannon's theories. They were embodied by engineers at a lower level than I ever needed to access. In fact, I'm ashamed to say that when I was programming, though I could give you chapter and verse on Bill Gates, I'd never heard of Shannon, even though he was still alive back then.

What Soni and Goodman do really well is to give us a feel for Shannon, the man. The writing has an impressive ability to put us into the home town of Claude Shannon, or the corridors of Bell Labs as he rides his unicycle along them. At first glance, Shannon might seem quite similar to Richard Feynman in his combination of playfulness with amazing insight. But it soon becomes clear that Shannon was a far less likeable character - more introverted, dismissive of those he considered intellectually inferior and with no real interest in helping his country in the war or with codebreaking, undertaking such work only if he was offered something he found mentally stimulating. Soni and Goodman seem to find his obsession with juggling, unicycles and building strange contraptions endearing, but I'm not sure that's really how it comes across.

I am giving this book four stars for the biographical side, which works very well, but there are some issues. One is hyperbole - there is no doubt that Shannon was a genius and made a huge contribution to our understanding of information, but we really don't need to be told how incredible he was quite as often as this book does. At one point he is compared with Einstein - with Einstein arguably coming across as the less significant of the two. This seems to miss that part of Einstein's genius was the breadth of his work, from statistical mechanics through relativity to quantum physics. While Shannon's personal interests were broad, his important work lacked that range.

The bigger issue was that I had hoped for a scientific biography, but I only really got a biography with a bit of science thrown in. The coverage of Shannon's information theory was (ironically) rarely very informative. I would have loved to have had the same level of exploration of the theory as we get of the person - but it's just not there. Of course, the theory isn't ignored, with a few pages given to each of the two big breakthroughs - but there could have been a whole lot more to make what can be a difficult concept more accessible.

I ought to stress that my use of the term hyperbole should not in any sense diminish the importance of Shannon's work. Hearing of Shannon's initial inspiration that logic and electrical circuitry were equivalent comes across rather like Darwin's (and Wallace's) insight into evolution by natural selection. It appears blindingly obvious, once you are told about it, but it took a long time for anyone to see it - and it's hugely important. Shannon's second big step, which provides a generalised model for information transmission with noise and makes the whole understanding of information communication mathematical, was inspirational and up there with Turing's universal computer. What's more, it has applications well outside the IT world in the way it provides a link between information and entropy. If there were a maths Nobel prize, as Soni and Goodman suggest, Shannon definitely should have won one.
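For readers curious what that information-entropy link actually looks like, here is a minimal sketch (my own illustration, not from the book) of Shannon's entropy formula, which measures the average information content of a source in bits:

```python
from math import log2

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)), skipping zero probabilities."""
    return -sum(p * log2(p) for p in probs if p > 0)

# A fair coin yields exactly one bit of information per toss.
print(shannon_entropy([0.5, 0.5]))      # 1.0

# A heavily biased coin is more predictable, so each toss carries less information.
print(round(shannon_entropy([0.9, 0.1]), 3))   # 0.469
```

The formula is deliberately identical in form to Gibbs's entropy in statistical mechanics, which is the mathematical heart of the link between information and thermodynamics that the review mentions.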

This is a man we needed to find out more about - and we certainly do. I just wish there had been more detail of the science in there too.

Hardback:  

Audio CD:  
Using these links earns us commission at no cost to you


Review by Brian Clegg
