
Who Invented the Computer? – Ian Watson

Ian Watson is the author of The Universal Machine and was featured in a recent Four Way Interview.
Saturday, June 23, 2012 was the centenary of the birth of Alan Turing, the troubled genius who invented the modern computer. Why, though, do so few people recognize his name and his great achievements?
In 1936 the English mathematician Alan Turing published a paper, On Computable Numbers, with an Application to the Entscheidungsproblem, which became the foundation of computing. In it Turing presented a theoretical machine that could solve any problem that could be described by instructions encoded on a paper tape. One Turing Machine could calculate square roots, whilst another might solve Sudoku puzzles. Turing then demonstrated that you could construct a single Universal Machine able to simulate any other Turing Machine: one machine solving any problem for which a program could be written – sound familiar? He’d invented the computer.
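To make the idea concrete, here is a minimal sketch in Python (my own illustration, not anything from Watson or Turing) of a Turing machine: a tape, a read/write head, and a table of rules. The rule format and the little example machine are illustrative assumptions.

```python
# A sketch of a Turing machine: an unbounded tape, a read/write head,
# and a table of rules saying what to do in each state for each symbol read.

def run_turing_machine(rules, tape, state="start", halt_state="halt", max_steps=1000):
    """rules maps (state, symbol) -> (next_state, symbol_to_write, head_move)."""
    cells = dict(enumerate(tape))        # sparse tape: position -> symbol
    head = 0
    for _ in range(max_steps):
        if state == halt_state:
            break
        symbol = cells.get(head, "_")    # "_" stands for a blank cell
        state, write, move = rules[(state, symbol)]
        cells[head] = write
        head += 1 if move == "R" else -1
    return "".join(cells[i] for i in sorted(cells))

# An illustrative machine: append a "1" to a unary number, i.e. add one.
add_one = {
    ("start", "1"): ("start", "1", "R"),   # scan right over the existing 1s
    ("start", "_"): ("halt",  "1", "R"),   # write one more 1 and stop
}

print(run_turing_machine(add_one, "111"))  # prints "1111"
```

A Universal Machine is then just a machine whose rule table interprets another machine's rules read from the tape – the program becomes data.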
At the time, “computers” were people who did calculations. As the Allies prepared for WWII they faced a shortage of computers for military calculations, and when men left for the war the shortage got worse, so the US mechanized the problem by building the Harvard Mark I; it could do calculations in seconds that took a person hours. The British, meanwhile, needed mathematicians to crack the Nazis’ Enigma code.
Turing worked at Bletchley Park, perhaps better known as “Station X,” where code-breaking became an industrial process, with 12,000 people working 24/7. Although the Poles had cracked Enigma before the war, the Nazis had since made it more complicated; there were now 10^114 permutations. Turing designed a machine, called the Bombe, that searched through the permutations, and by war’s end the British were reading all Enigma traffic. Historians estimate that Turing’s work shortened the war by as much as two years, and Churchill would later say that Turing had made the single biggest contribution to Allied victory in the war.
As the 1950s progressed, business was quick to adopt computers, and as the technology advanced business computing became an industry in its own right. These computers were all universal machines – you could program them to do anything.
There will positively be no internal alteration [of the computer] to be made even if we wish suddenly to switch from calculating the energy levels of the neon atom to the enumeration of groups of order 720. It may appear somewhat puzzling that this can be done. How can one expect a machine to do all this multitudinous variety of things? The answer is that we should consider the machine to be doing something quite simple, namely carrying out orders given to it in a standard form which it is able to understand. – Alan Turing
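In modern terms, those “orders given in a standard form” are a program. The toy instruction set and the two example programs below are my own illustration, not Turing’s: a single fetch-and-execute loop does completely different jobs depending only on the orders it is handed.

```python
# A toy stored-program machine: one fetch-execute loop whose behaviour is
# determined entirely by the list of orders (the program) given to it.

def execute(program):
    registers = {}                        # named storage cells
    for op, *args in program:             # fetch each order in turn
        if op == "set":                   # set <reg> <value>
            registers[args[0]] = args[1]
        elif op == "add":                 # add <reg> <other>: reg += other
            registers[args[0]] += registers[args[1]]
        elif op == "print":               # print <reg>
            print(registers[args[0]])

# No internal alteration is needed to switch tasks; only the program changes:
execute([("set", "a", 2), ("set", "b", 3), ("add", "a", "b"), ("print", "a")])  # 5
execute([("set", "n", 720), ("print", "n")])                                    # 720
```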
By the 1970s a generation had grown up with “electronic brains;” they wanted their own personal computers. The problem was they had to build them. In 1975 a college dropout called Steve Wozniak built a simple computer around the 6502 microprocessor, which he hooked up to a keyboard and a TV. His friend Steve Jobs called it the Apple I and found a Silicon Valley shop that would buy 100 for $500 each. Apple had its first sale, and Silicon Valley’s start-up culture was born. Another dropout, Bill Gates, realized that PCs needed software and that people would pay for it – Microsoft would sell them programs.
Turing had another vision: one day computers would think. But how would you know a computer was intelligent? He devised the Turing Test: a judge sitting at a computer terminal types questions to two unseen entities, a person and a computer, and must decide which is the human. If the judge gets it wrong, the computer passes the test and is deemed intelligent.
Artificial intelligence (AI) is entering your daily life. Car satnavs and Google search use AI, Apple’s iPhone can understand your voice and respond intelligently, and car manufacturers are developing autonomous cars. Turing’s vision of AI will soon be a reality.
In 1952 Turing was prosecuted for being gay and sentenced to chemical castration. The treatment caused depression, and in 1954 he committed suicide by eating an apple he’d poisoned. Outside academia Turing remained virtually unknown because his WWII work was top secret. Slowly, word of Turing’s genius spread: in 1999 Time Magazine named him one of the “100 Most Important People of the 20th Century,” stating, “The fact remains that everyone who taps at a keyboard, opening a spreadsheet or a word-processing program, is working on an incarnation of a Turing machine,” and in 2009 the British Prime Minister issued a public apology:
…on behalf of the British government, and all those who live freely thanks to Alan’s work, I am very proud to say: we’re sorry. You deserved so much better.
Finally, Alan Turing is getting the recognition he deserves for inventing the computer: his Universal Machine has transformed our world and will profoundly influence our future.

