
The Demon in the Machine - Paul Davies *****

Physicists have a habit of dabbling in biology and, perhaps surprisingly, biologists tend to be quite tolerant of it. (I find it hard to believe the reverse would be true if biologists tried to do physics.) Perhaps one reason for that tolerance is Schrödinger’s lecture series and book What is Life?, which had a huge impact on molecular biology and with a reference to which, not surprisingly, Paul Davies begins his fascinating book. 

At the heart of The Demon in the Machine (we'll come back to that demon in a moment) is the relationship between life and information. In essence, Davies points out that trying to reduce life to its simple physical components is like trying to work with a computer that has no software. The equivalent of software here is information, not just in the best publicised aspect of the information stored in DNA, but on a far broader scale, operating in networks across the organism.

This information and its processing give life its emergent complexity, which is why, Davies suggests, Dawkins-style reductionism to the gene level entirely misses the point. What's more, the biological setup provides a particularly sophisticated relationship between information and the physical aspects of the organism, because the information can modify itself - it's as if a computer program could redesign itself as it went along.

The subtitle 'how hidden webs of information are solving the mystery of life' probably over-promises. As Davies makes clear, we still have no idea how life came into being in the first place. However, by bringing in this physical/information aspect we can at least get a better grip on the workings of the molecular machines inside organisms and how biology can do so much with so little. Here's where the demon in the title comes in. This is Maxwell's demon, the hypothetical miniature being dreamed up by the great nineteenth-century Scottish physicist.

Maxwell's demon has the remarkable ability to tweak the second law of thermodynamics allowing, for example, heat to flow from a colder to a hotter body or, to put it another way, providing a mechanism for entropy (the measure of disorder in a system) to spontaneously decrease. Entropy has a strong (negative) relationship with information and Davies shows how miniature biological systems act in a demon-like fashion to effectively manage information.
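The entropy-information link Davies draws on can be made quantitative via Landauer's principle, which the review doesn't spell out: erasing a bit of information carries an unavoidable thermodynamic cost, which is exactly the bookkeeping that prevents Maxwell's demon getting something for nothing. A minimal sketch, using only standard physical constants:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def landauer_limit(temperature_kelvin: float, bits: int = 1) -> float:
    """Minimum heat (in joules) dissipated by erasing `bits` of information,
    per Landauer's principle: E >= bits * k_B * T * ln(2)."""
    return bits * K_B * temperature_kelvin * math.log(2)

# At room temperature (~300 K), erasing a single bit costs roughly 3e-21 J -
# tiny, but non-zero, which is why the demon can't beat the second law.
print(landauer_limit(300.0))
```

The demon itself is, of course, hypothetical; the point is that any physical system that acquires and discards information, biological demons included, pays this entropy bill somewhere.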

There's lots to like here, from the best explanation I've seen of the relationship of information and entropy to fascinating coverage of how far we’ve gone beyond the selfish gene. This is not just about basic epigenetic processes (operating outside of genes, switching them on and off and so on) but how, for example, the electric field of a (biological) cell apparently has a role to play in ‘sculpting‘ the physical structure of an organism.

My only real complaint is that in part of the chapter Enter the Demon dealing with information engines and most of the chapter The Logic of Life, describing the relationship between living organisms and computation, Davies fails to put across clearly just what is going on. I read it, but didn't feel I gained as much information (ironically) as I needed from it. There was also one very odd statistic. We're told the information in a strand of DNA contains 'about 2 billion bits - more than the information contained in all the books in the Library of Congress.' There are about 32 million books in the Library of Congress, so that gives us on average 62.5 bits per book. Unless those are very short books, some information has gone astray.
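The reviewer's back-of-envelope check on that odd statistic is easy to reproduce, using exactly the figures quoted in the review:

```python
# The quoted claim: a strand of DNA holds 'about 2 billion bits - more than
# the information contained in all the books in the Library of Congress.'
bits_in_dna = 2_000_000_000    # ~2 billion bits, as quoted
books_in_loc = 32_000_000      # ~32 million books, as quoted

bits_per_book = bits_in_dna / books_in_loc
print(bits_per_book)  # 62.5 bits per book - about eight characters each
```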

Really interesting, then, from a transformed understanding of the importance of information in living organisms through to Davies' speculation on whether biological systems need new physical laws to describe them. But expect to come away feeling you need to read it again to be sure what it said.
Review by Brian Clegg

Comments

  1. Nick Lane argues in his book The Vital Question that biological systems are not really working to decrease entropy; they exist because they essentially don't violate the second law of thermodynamics. I wonder how that reconciles with the theory here.

  2. I find it difficult to follow some of this stuff. Not the fluffy conceptual models, but the simple Asperger's stuff.

    e.g. p95: the text describing Fig 11 says 'barred lines [indicate inhibition]', then seems to use a different term, 'loopy broken arrows', to denote self-inhibition. The change of term is confusing but not fatal.

    But Fig 11 contains no broken or barred loopy arrows, only solid ones, thus indicating self-activation, not self-inhibition. Again, I guess, not fatal, if the reader is expected to be alert for errors or misprints.

    But Table 2, p96 shows that in at least two cases the loop is self-suppression, so the loop should be broken or barred. Still not quite fatal.

    But then, following Table 2 and Fig 11 together, node G suddenly gets activated in step 8, although in Fig 11 there is no 'incoming [activation] arrow' pointing at it, except for its own self-activation loop, which can never be initiated.

    So what's going on? AFAIK just errors, which I'd hope not to have been included in this book.

    So, how do we read this book? Perhaps best just to gloss over all the clever diagrams & tables, say 'Gee, Whizz, that's all amazing!'

    OTOH if I'm really not understanding a perfectly valid exposition, I'd love to know.
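[Editor's note: the figure the commenter describes appears to be a Boolean gene-regulatory network, where nodes are on/off, solid arrows activate, barred lines inhibit, and the whole network is stepped synchronously. A toy sketch (the nodes and edges here are purely illustrative, not the book's actual Fig 11) shows why the commenter's objection holds: a node whose only input is its own self-activation loop can never switch itself on.]

```python
# Edges are (source, target, sign): +1 for activation, -1 for inhibition.
# Node G's only input is its own self-activation loop.
EDGES = [("A", "B", +1), ("B", "C", -1), ("G", "G", +1)]
NODES = ["A", "B", "C", "G"]

def step(state):
    """One synchronous update: a node turns on iff its summed input > 0;
    nodes with no incoming edges keep their current value."""
    new_state = {}
    for node in NODES:
        inputs = [(src, sign) for src, tgt, sign in EDGES if tgt == node]
        if not inputs:
            new_state[node] = state[node]
        else:
            total = sum(sign * state[src] for src, sign in inputs)
            new_state[node] = int(total > 0)
    return new_state

state = {"A": 1, "B": 0, "C": 1, "G": 0}
for _ in range(8):
    state = step(state)

# G starts off, so its self-loop contributes 0 every step: it stays off
# forever, matching the commenter's point that the loop 'can never be
# initiated' without some other incoming activation arrow.
print(state["G"])  # 0
```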


