
The Cryotron Files - Iain Dey and Douglas Buck ****

This is a rip-roaring tale of remarkable technological achievements, Cold War spying and a suspicious death at a very early age that has inevitably fostered conspiracy theories. Dudley Buck, the subject of the biography, made three hugely important contributions to computer science - yet he's still not widely known. I've read many books on the history of computer science, and this is the first time I've ever heard of him.

We start off in fairly familiar territory with Buck's background - it might feel a little dull - but once he's involved in computing, things get a whole lot more interesting. About the only aspect of the early biography that stands out is that Buck had an extremely unpleasant idea of what constitutes a prank, including giving people electric shocks and trying to build a bomb on campus. Though he apparently continued as a practical joker when older, it seems his attempts, while still malicious, became less life-threatening.

In terms of computing technology, Buck was a key figure in the development of the magnetic core memory that was the mainstay of computing in the 60s, was producing lithographic integrated circuits well before the famous names of the microchip world, and devised a ferroelectric memory that was impractical at the time, but has since become a real thing. And all this before dying at the tragically young age of 32. In fact, the ferroelectric memory was in his master's dissertation, and he didn't get a doctorate until shockingly late, having already made huge contributions.

Of itself, his computer engineering is impressive, but what makes the story far more intriguing is Buck's involvement in the shady world of 1950s espionage. He did a considerable amount of work for the NSA and was regularly sent off on missions, at least one to the Soviet Union, the details of which are sometimes still fuzzy - all of which makes him a far more interesting character for a biography. And then there's his death. Buck died of a pulmonary condition immediately after opening a package containing a wide range of chemical substances for use in his experimental work - and not long after a visit from a number of Soviet scientists. At the time of writing this review, relatively soon after the Skripal affair, it's hard not to give at least some weight to the speculation that his death - working as he was on technology that could be used in guided missiles - was not accidental.

There is, sadly, one real disappointment with this book. Authors Iain Dey (a business journalist) and Douglas Buck (Buck's son) used a researcher to dig up historical material. It's a shame they didn't also have a science consultant, because the science and technology part of the book is dire. Luckily, it's almost incidental to the way the story is presented - the book is far more about people and history - but it's a real pity it couldn't have been better.

To give an example, the 'cryotron' in the title of the book (of which more in a moment) was a superconducting device. Early on we are told superconductors are 'chemical elements that only conduct electricity at ultralow temperatures.' Leaving aside that most modern superconductors aren't chemical elements, the specific elements mentioned throughout the book, which we are later told 'naturally blocked an electric current at room temperature', are lead, tantalum and niobium - all reasonably good conductors at room temperature. If the authors think lead blocks an electric current, I hope they don't have cars that use lead-acid batteries.

That's weak basic science, but even the computer science has problems. We're told that 'The acronym RAM soon stuck and is still used today as an indication of a computer's processing speed.' Really? RAM - random access memory - is a measure of a computer's memory capacity, not its processing speed. But the biggest issue is the way the cryotron is handled. This was a potential replacement for valves and transistors (which at the time were typically as much as a centimetre across). The cryotron was a tiny device that could fulfil a similar role. Unfortunately it was a dead end. It was slower than transistors and, crucially, could only work if supercooled with liquid helium. It might have had niche applications, but the need to keep it at a couple of degrees above absolute zero would always prevent it from being mainstream. As it happens, within a few years transistors on microchips had left it way behind.

Dey and Buck spoil the genuinely huge significance of Buck senior's work on magnetic cores, lithography and ferroelectric memory by overplaying the importance of the cryotron. At one point they say Dudley Buck 'had invented a whole new field of physics and electrical engineering.' There was no new physics here. Elsewhere they claim that the cryotron 'evolved into a device called a Josephson junction.' This is ludicrous. It's like saying a 13 amp plug 'evolved into a microchip' because they both carry electricity, involve metal junctions and work at room temperature. A Josephson junction was a totally new concept, derived from basic physics, and to suggest that it had any relation to a cryotron is an insult to Brian Josephson. The authors even go so far as to suggest that a quantum computer 'runs on modified cryotrons.' No, it really doesn't. This is an attempt to over-inflate the importance of one of Buck's ideas that was a failure. It was a good idea, but it happened not to work out. Technology development is like that.

There is no doubt, then, that there are issues with this book, but I come back to the two key aspects that make it a great read. Buck was a genius - what he achieved in computer engineering in a short timescale (especially given the amount of time he spent on other things) was truly remarkable. And his life story, entwined as it was with Cold War politics and espionage, equally makes for a fascinating insight into unsettling times.


Review by Brian Clegg

