
Writers are human

More like a similar number of bacteria
One of my favourite things about being a science writer is getting emails from readers. Some say nice things about a book, others ask questions about science or writing, which I do my best to answer helpfully. However, every now and then there is an email pointing out an error in one of my books - and I have to admit that those are distinctly depressing.

Let's be clear: I can say with reasonable confidence that every book I've ever written has a mistake or two in it. This isn't because I know what all the mistakes are - if I did, I would have corrected them before publication - but rather because pretty well every book I've ever read, fiction or non-fiction, has mistakes in it. It's just a fact of life. A book typically has between 60,000 and 120,000 words in it - the chances of an error slipping through are pretty high.

Many of these are typos: spelling mistakes, missing words - things that should be spotted in a general edit, but that often slip through. The worst of these can seem glaring when you spot them as a reader, but the reality of proofreading is that it can be very difficult to catch them. Once I'm reading, I can easily get absorbed in the content and miss the detail. Although professional proofreaders are excellent, possessing an attention to detail that I could never match, even they miss some.

I have a suspicion that the move of proofs from paper to electronic form has not helped. I don't know if it's been researched in any detail, but there's a widely held belief that we are better at spotting errors on paper than on screen. Time was when publishers sent out paper proofs. I used to be distinctly nervous when a US publisher would send me proofs across the Atlantic that had already been marked up by a proofreader - if they ever got lost in the mail, all that work would have to be done again. I'm afraid there are some publishers now that seem to do hardly any proofreading (not my publishers, by the way), so I do occasionally get sent books littered with typos, but most still do a reasonable job.

However, the errors that are most likely to get my readers excited are factual errors. A historical observation that's wrong or a scientific 'fact' that doesn't hold true. Sometimes this reflects the gap between the book being written and the reader picking it up. Some of my books written, say, 20 years ago are still available. That's great - I'm not complaining. But science moves on. One example from a more recent title - in The Universe Inside You I mention that there are as many as ten times more bacteria than human cells in your body (see snippet above). This was the accepted figure at the time of writing, though it was based on a very vague guesstimate. The guess has now been revised to be a similar number of bacterial and human cells, so I was out by an order of magnitude - but it was not a recognisable error at the time of writing.

Other times I (or others involved in the book production) just make a mistake. Guess what - writers are human. In my biography of James Clerk Maxwell, Professor Maxwell's Duplicitous Demon, there were two interesting errors in this regard. I simply misread a number for the population of the Lancashire mill town of Bolton (I picked up the number from an adjacent line), meaning that the hardback has me incorrectly saying that Aberdeen in 1856 was only half the size of Bolton. (This was particularly embarrassing as it was picked up as an interesting fact in a Scottish newspaper.) By the time the paperback was published, I was able to correct this to say Aberdeen was around the size of Bolton.

The incorrect equations (bottom right B should be a D)
In the same book, I showed the iconic Maxwell's equations for electromagnetism, in one of the compact forms devised by Oliver Heaviside. I put these in correctly, but equations always have to be reset by the publisher, and I didn't notice that an error had been introduced in the process. This was pointed out to me in a discussion of the book at the Cheltenham Science Festival, where the other contributor was a physicist. I had listened to her mangle a piece of history by attributing a quote to Maxwell that was actually said by someone else, but thought it would cause unnecessary embarrassment to point out the error. She had no such consideration in pointing out that the equation (which was only there to show the beauty of the form) was incorrect.
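For anyone wondering what the correct version looks like, here is the standard macroscopic form of the equations in the compact vector notation Heaviside introduced (the exact notation and typography in the book may well differ slightly):

```latex
% Maxwell's equations in compact vector (Heaviside) form --
% standard macroscopic version; the book's exact notation may differ.
\begin{align}
  \nabla \cdot  \mathbf{D} &= \rho \\
  \nabla \cdot  \mathbf{B} &= 0 \\
  \nabla \times \mathbf{E} &= -\frac{\partial \mathbf{B}}{\partial t} \\
  \nabla \times \mathbf{H} &= \mathbf{J} + \frac{\partial \mathbf{D}}{\partial t}
\end{align}
```

The reset error described above - a B where a D should be, bottom right - would correspond to the final equation's ∂D/∂t term being printed as ∂B/∂t.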

In other cases, the error could be an oversimplification. In writing up their work, scientists are very careful about being as precise as they can, which can result in lacing any statement with provisos. These don't make for great reading, and part of the job of a science writer is to trim down the provisos without becoming too inaccurate. Interestingly, scientists writing popular science tend to oversimplify more than professional science writers - in part because they like to present their own pet theories, or the most widely accepted theory, as if it were fact. I've lost count of the number of physicists who, for example, say that dark matter exists, rather than that it is one option for explaining some anomalies, alongside modified gravity theories and so on.

I have certainly been guilty of this myself, in not always making clear, when reporting something said by a historical figure, that the saying is disputed or may originally have been said by someone else. I recently got pulled up for attributing a comment to Lord Kelvin that definitely didn't occur on the occasion it has often been said to have occurred, and may not have been said by Kelvin at all. Yet a number of respectable sources do attribute it to Kelvin - so I feel no shame in using the quote, but I should have made the doubts about it clear.

So we writers do make mistakes. But do we want to be told about them? On a personal level, the answer is probably 'No.' No one likes being told 'you were wrong' - though a lot can depend on how we are told. An email I recently got about that Kelvin issue had the subject line 'Kelvin, being smug and...' - which felt like a good way to irritate someone. Probably the most important thing about telling an author about an error is timeliness: please make sure the book was published in the last year or so.

If you point out an error in a book that was published a number of years ago, the chances are you are not the first - having a stream of people pointing out the same mistake is not particularly helpful. More to the point, unless the book is a continuing bestseller, it's probably too late to do anything about it. If it's caught early, the error can be corrected in later editions, but once most copies have already been printed - which for many books is in the first year - it's pretty much a fruitless exercise, all pain and no gain. (Admittedly ebooks can always be updated, but publishers rarely bother.)

To reiterate - we authors love hearing from readers. But bear in mind (again) that we are human. Our books are a bit like our children - we know they have their flaws, but don't necessarily appreciate having them pointed out.

Brian Clegg
