Writers are human

[Image: snippet from the book. Caption: More like a similar number of bacteria]
One of my favourite things about being a science writer is getting emails from readers. Some say nice things about a book; others ask questions about science or writing, which I do my best to answer helpfully. Every now and then, though, an email arrives pointing out an error in one of my books - and I have to admit that those are distinctly depressing.

Let's be clear: I can say with reasonable confidence that every book I've ever written has a mistake or two in it. This isn't because I know what all the mistakes are - if I did, I would have corrected them before publication - but because pretty well every book I've ever read, fiction or non-fiction, has mistakes in it. It's just a fact of life. A book typically contains between 60,000 and 120,000 words - the chance of an error slipping through is pretty high.

Many of these are typos: spelling mistakes, missing words, things that should be spotted in a general edit but often slip through. The worst of these can seem glaring when you spot them as a reader, but the reality of proofreading is that they can be very difficult to catch. Once I'm reading, I can easily get absorbed in the content and miss the detail. Although professional proofreaders are excellent, possessing an attention to detail that I could never match, even they miss some.

I have a suspicion that the move of proofs from paper to electronic form has not helped. I don't know if it has been researched in any detail, but there's a widely held belief that we are better at spotting errors on paper than on screen. Time was when publishers sent out paper proofs. I used to be distinctly nervous when a US publisher would send me proofs across the Atlantic that had already been marked up by a proofreader - if they ever got lost in the mail, all that work would have to be done again. I'm afraid some publishers now seem to do hardly any proofreading (not my publishers, by the way), so I do occasionally get sent books littered with typos, but most still do a reasonable job.

However, the errors that are most likely to get my readers excited are factual ones: a historical observation that's wrong, or a scientific 'fact' that doesn't hold true. Sometimes this reflects the gap between the book being written and the reader picking it up. Some of my books written, say, 20 years ago are still available. That's great - I'm not complaining. But science moves on. One example from a more recent title: in The Universe Inside You I mention that there are as many as ten times more bacteria than human cells in your body (see snippet above). This was the accepted figure at the time of writing, though it was based on a very vague guesstimate. The estimate has since been revised to a roughly similar number of bacterial and human cells, so I was out by an order of magnitude - but it was not a recognisable error at the time of writing.

Other times I (or others involved in the book production) just make a mistake. Guess what - writers are human. In my biography of James Clerk Maxwell, Professor Maxwell's Duplicitous Demon, there were two interesting errors in this regard. I simply misread a number for the population of the Lancashire mill town of Bolton (I picked up the number from an adjacent line), meaning that the hardback has me incorrectly saying that Aberdeen in 1856 was only half the size of Bolton. (This was particularly embarrassing as it was picked up as an interesting fact in a Scottish newspaper.) By the time the paperback was published, I was able to correct this to say Aberdeen was around the size of Bolton.

[Image: the incorrect equations as printed (bottom right B should be a D)]
In the same book, I showed the iconic Maxwell's equations for electromagnetism, in one of the compact forms devised by Oliver Heaviside. I put these in correctly, but equations always have to be reset by the publisher, and I didn't notice that an error had been introduced in the process. This was pointed out to me in a discussion of my book at the Cheltenham Science Festival. The other contributor was a physicist. I had listened to her mangle a piece of history by attributing a quote to Maxwell that was actually said by someone else, but thought it would cause unnecessary embarrassment to point out the error. She had no such consideration in pointing out that the equation (which was only there to show the beauty of the form) was incorrect.
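For reference, here is a standard statement of the four equations in Heaviside's compact vector form - the book's exact notation may differ, but going by the caption above, the slip presumably turned the D in the final equation (the Ampère-Maxwell law) into a B:

\[
\begin{aligned}
\nabla \cdot \mathbf{D} &= \rho \\
\nabla \cdot \mathbf{B} &= 0 \\
\nabla \times \mathbf{E} &= -\frac{\partial \mathbf{B}}{\partial t} \\
\nabla \times \mathbf{H} &= \mathbf{J} + \frac{\partial \mathbf{D}}{\partial t}
\end{aligned}
\]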

In other cases, the error is an oversimplification. In writing up their work, scientists are careful to be as precise as they can, which can result in lacing every statement with provisos. These don't make for great reading, and part of the job of a science writer is to trim down the provisos without becoming too inaccurate. Interestingly, scientists writing popular science tend to oversimplify more than professional science writers do - in part because they like to present their own pet theories, or the most widely accepted theory, as if it were fact. I've lost count of the number of physicists who, for example, say that dark matter exists, rather than that it is one option for explaining some anomalies, alongside modified gravity theories and the like.

I have certainly been guilty of this myself, by not always making clear, when reporting something said by a historical figure, that the attribution is disputed or that the words may originally have been said by someone else. I recently got pulled up for attributing a comment to Lord Kelvin that definitely didn't occur on the occasion it has often been said to have occurred, and may not have been said by Kelvin at all. Yet a number of respectable sources do attribute it to Kelvin - so I feel no shame in using the quote, but I should have made the doubts about it clear.

So we writers do make mistakes. But do we want to be told about them? On a personal level, the answer is probably 'No.' No one likes being told 'you were wrong' - though a lot can depend on how we are told. An email I recently received about that Kelvin issue had the subject line 'Kelvin, being smug and...' - which feels like a good way to irritate someone. Probably the most important consideration in telling an author about an error is timeliness: please make sure the book was published in the last year or so.

If you point out an error in a book that was published a number of years ago, the chances are you are not the first - having a stream of people pointing out the same mistake is not particularly helpful. More to the point, unless the book is a continuing bestseller, it's probably too late to do anything about it. If it's caught early, the error can be corrected in later editions, but once most copies have already been printed - which for many books is in the first year - it's pretty much a fruitless exercise, all pain and no gain. (Admittedly ebooks can always be updated, but publishers rarely bother.)

To reiterate - we authors love hearing from readers. But bear in mind (again) that we are human. Our books are a bit like our children - we know they have their flaws, but don't necessarily appreciate having them pointed out.

Brian Clegg
