
Entropy - James Binney ****

Most people are familiar with energy, both in its everyday sense as a resource used for heating and powering electrical equipment, and – if they were paying attention in school – as a fundamental property of the physical world that can be converted from one form into another but never created or destroyed. Entropy, the subject of this book, is a similarly fundamental physical property, but a much less well-known one. As James Binney says in the first chapter, ‘most people without a degree in physics or chemistry will not have heard of entropy, and probably few of those with relevant degrees could explain what entropy is with any clarity’.

It’s true that entropy is one of the more difficult concepts in physics, at least as regards its precise scientific meaning. At an intuitive level, however, even people who have never heard the word are probably aware of it. If you’ve ever had a sneaking suspicion that energy might not actually be conserved in the way science teachers say it is – because in your experience things are always running down and wearing out – that’s a consequence of entropy. In simple terms, entropy can be thought of as a measure of how much ‘useless’ energy there is in a closed system, in the sense of energy that can’t be exploited for any practical purpose. And entropy, unlike energy, isn’t conserved – it can increase, but it can never decrease. If conservation of energy is the good news, then entropy is the bad news. As Binney points out more than once in the book, energy is free – it’s in abundant supply all around us. What costs money, and causes so many headaches for politicians and the public alike, is low-entropy energy that can actually be used for something.
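For readers who like to see the maths behind that ‘bad news’, the standard textbook statements (my formulation, not a quotation from the book) are Clausius’s definition of an entropy change and the second law for an isolated system:

```latex
% Clausius definition: the infinitesimal change in entropy S when
% heat \delta Q is absorbed reversibly at absolute temperature T
\[
  \mathrm{d}S = \frac{\delta Q_{\mathrm{rev}}}{T}
\]
% Second law: in an isolated system, entropy never decreases
\[
  \Delta S \geq 0
\]
```

The ‘useless’ energy idea shows up in the Helmholtz free energy, F = U − TS: of the internal energy U, only F is available to do work, and the TS portion grows as entropy rises.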

Rewinding to the end of the first paragraph, you may spot something a little odd. If James Binney knows so few people have even heard of entropy, why has he used it as the single-word title of his book? The answer is quite simple: it’s part of OUP’s long-running ‘Very Short Introduction’ series, which operates within well-established constraints – one of them being that the title is always a straight description of the subject matter. As to the style of the books themselves, they are indeed short – this one is only a little over a hundred small pages – and they are ‘introductions’, in the sense of assuming the reader starts with zero knowledge of the subject. As a rule, though, they’re aimed at serious readers who need a quick but thoroughgoing crash course – whether to pass an exam or for a work-related project – rather than general readers looking for a few entertaining and memorable highlights. For a science subject like this one, the result isn’t really what you would understand by the term ‘popular science’, and was never meant to be.

As regards this specific book, casual readers will find it bombards them with far more technical detail than they want about the mathematical definitions of entropy, its relationship to other areas of physics, and the way scientists and engineers use it in their everyday work. Even so, there are some fascinating insights, particularly on the historical development of the subject – with various eminent scientists spending several decades of the nineteenth century trying to untangle their understanding of energy and entropy – as well as its relationship to cutting-edge areas of modern physics such as quantum entanglement and black holes. There are also a few interesting examples – though not as many as I would have liked – of what might be called ‘everyday’ applications of entropy, for example in heat pumps and image processing.

The very title of this book, and the fact that it’s part of this particular OUP series, mean that in practice anyone who buys it will have a pretty good idea what they’re letting themselves in for. It’s these readers that my 4-star rating is aimed at. If it happened to come into the hands of a more general popular science reader, on the other hand, they would likely find far more technical detail than they are used to. If they’re happy to skip over it, however, they’ll still find plenty of interesting material that isn’t widely covered elsewhere.

Review by Andrew May
