
Infinitesimal - Amir Alexander ***

While some books have obscure titles, the combination of title and subtitle usually makes it plain what a book is about. But I can pretty much guarantee that most readers, seeing Infinitesimal - how a dangerous mathematical theory shaped the modern world, would leap to the same incorrect conclusion I did. The dangerous aspect of infinitesimals was surely going to relate in some way to calculus, but I expected the book to be about the great priority dispute between Newton and Leibniz, whereas in fact it concentrates on the precursors to their work - the long struggle to make infinitesimals, quantities that are vanishingly close to zero, acceptable in mathematics.

The book is in two distinct sections. The first focuses on the history of the Jesuits, from their founding to their intervention in the mathematical debate against those who wanted to use infinitesimals in maths. For the Jesuits, everything was cut and dried: where Aristotle's worldview and the geometry of Euclid had an unchanging nature that made them acceptable, the use of infinitesimals was far too redolent of change and rebellion. This was interesting, particularly in the way the history gave background on Galileo's rise and fall from a different viewpoint (while he was in the ascendancy, the Jesuits were temporarily losing power, and vice versa). However, this part goes on far too long and repeats the same points over and over again. This is, I can't help but feel, a fairly small book trying to look bigger and more important than it is by being padded.

The second section I found considerably more interesting, though this was mostly as a pure history text. I was fairly ignorant about the origins of the English Civil War and the impact of its outcome, and Amir Alexander lays this out well. He also portrays the intellectual battle between philosopher Thomas Hobbes and mathematician John Wallis in a very interesting fashion. I knew, for example, that Wallis had been the first to use the lemniscate, the symbol for infinity used in calculus, but wasn't aware how much he was a self-taught mathematician who took an approach that would horrify any modern maths professional, treating the subject more as an experimental science where induction was key than as a pure discipline where everything has to be proved.

Hobbes I only really knew as a name, associated with that horrible frontispiece of his 'masterpiece' Leviathan, which seems to the modern eye a work of madness, envisaging a state where the monarch's word is so supreme that the people become more like automata, cells in a body or bees in a hive than individual, thinking humans. What I hadn't realised is that Hobbes was also an enthusiastic mathematician who believed it was possible to derive all his philosophy from geometry - and geometry alone, with none of Wallis' cheating little infinitesimals. The pair attacked each other in print for many years, though Hobbes' campaign foundered to some extent on his inability to see that geometry was not capable of everything (he regularly claimed to have worked out how to square the circle, a geometrically impossible task).

Although I enjoyed finding out more about the historical context, it's perhaps unfortunate that Alexander is a historian rather than someone with an eye to modern science. The two sections effectively describe the victory of induction and experimentation over a view that expected mathematics to be a pure predictor of reality, and I felt they would have benefited hugely from being contrasted with modern physics, where some would argue that far too much depends on starting with mathematics and predicting outcomes, rather than starting with observation and experiment. An interesting book without doubt, but not quite what it could have been.


Review by Brian Clegg
