
Who Invented the Computer? – Ian Watson

Ian Watson is the author of The Universal Machine and was featured in a recent Four Way Interview.
Saturday, June 23, 2012 was the centenary of the birth of Alan Turing, the troubled genius who invented the modern computer. Why, though, do so few people recognize his name and his great achievements?
In 1936 the English mathematician Alan Turing published a paper, On Computable Numbers, with an Application to the Entscheidungsproblem, which became the foundation of computing. In it Turing presented a theoretical machine that could solve any problem that could be described by instructions encoded on a paper tape. One Turing Machine could calculate square roots, whilst another might solve Sudoku puzzles. Turing demonstrated that you could construct a single Universal Machine that could simulate any other Turing Machine. One machine solving any problem for which a program could be written – sound familiar? He’d invented the computer.
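To make the idea concrete, here is a minimal sketch of a Turing Machine simulator written in Python. It is mine, not from Watson’s article, and the little “flip the bits” machine it runs is a made-up illustration; the point is that the simulator itself never changes – only the transition table (the program) it is given does.

    # A minimal sketch of a Turing machine simulator (illustrative only).
    # A machine is a transition table: (state, symbol) -> (new_symbol, move, new_state).

    def run_turing_machine(transitions, tape, state="start", blank="_", max_steps=1000):
        """Simulate a single-tape Turing machine until it reaches the 'halt' state."""
        tape = dict(enumerate(tape))   # sparse tape: position -> symbol
        head = 0
        for _ in range(max_steps):
            if state == "halt":
                break
            symbol = tape.get(head, blank)
            new_symbol, move, state = transitions[(state, symbol)]
            tape[head] = new_symbol
            head += 1 if move == "R" else -1
        # Read the tape back in order of position
        return "".join(tape[i] for i in sorted(tape))

    # Hypothetical example machine: invert a string of 0s and 1s, then halt at the first blank.
    flip_bits = {
        ("start", "0"): ("1", "R", "start"),
        ("start", "1"): ("0", "R", "start"),
        ("start", "_"): ("_", "R", "halt"),
    }

    print(run_turing_machine(flip_bits, "1011"))  # prints 0100_

Hand the same run_turing_machine function a different table – one that adds two numbers, say – and it computes something entirely different. That one fixed piece of machinery executing whatever instructions it is given is exactly the role of Turing’s Universal Machine, as his own words below make clear.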
At the time, “computers” were people who did calculations. As the Allies prepared for WWII they faced a shortage of computers for military calculations. When men left for war the shortage got worse, so the US mechanized the problem by building the Harvard Mark I; it could do calculations in seconds that took a person hours. The British also needed mathematicians to crack the Nazis’ Enigma code.
Turing worked at Bletchley Park, perhaps better known as “Station X,” where code-breaking became an industrial process, with 12,000 people working 24/7. Although the Polish had cracked Enigma before the war, the Nazis had since made it more complicated; there were 10^114 permutations. Turing designed a machine, called the Bombe, that searched through the permutations, and by war’s end the British were reading all Enigma traffic. Historians agree that Turing shortened the war by as much as two years, and Churchill would later say that Turing had made the single biggest contribution to Allied victory in the war.
As the 1950s progressed, business was quick to adopt computers, and as the technology advanced, business computing became an industry. These computers were all universal machines – you could program them to do anything.
There will positively be no internal alteration [of the computer] to be made even if we wish suddenly to switch from calculating the energy levels of the neon atom to the enumeration of groups of order 720. It may appear somewhat puzzling that this can be done. How can one expect a machine to do all this multitudinous variety of things? The answer is that we should consider the machine to be doing something quite simple, namely carrying out orders given to it in a standard form which it is able to understand. – Alan Turing
By the 1970s a generation had grown up with “electronic brains”; they wanted their own personal computers. The problem was they had to build them. In 1975 a college dropout called Steve Wozniak built a simple computer around the MOS 6502 microprocessor, which he hooked up to a keyboard and a TV. His friend Steve Jobs called it the Apple I and found a Silicon Valley shop that would buy 100 for $500 each. Apple had its first sale, and Silicon Valley’s start-up culture was born. Another dropout, Bill Gates, realized that PCs needed software and that people would pay for it – Microsoft would sell them programs.
Turing had another vision: one day computers would think. But how would you know a computer was intelligent? He devised the Turing Test: a judge sitting at a computer terminal types questions to two entities, a person and a computer, and then decides which entity is human. If the judge is wrong, the computer passes the test and is deemed intelligent.
Artificial intelligence (AI) is entering your daily life. Car satnavs and Google search use AI, Apple’s iPhone can understand your voice and respond intelligently, and car manufacturers are developing autonomous cars. Turing’s vision of AI will soon be a reality.
In 1952 Turing was prosecuted for being gay and sentenced to chemical castration. This caused depression, and he committed suicide by eating an apple he’d poisoned. Outside of academia Turing remained virtually unknown because his WWII work was top secret. Slowly, word of Turing’s genius spread. In 1999 Time Magazine named him one of the “100 Most Important People of the 20th Century,” stating: “The fact remains that everyone who taps at a keyboard, opening a spreadsheet or a word-processing program, is working on an incarnation of a Turing machine.” And in 2009 the British Prime Minister issued a public apology:
…on behalf of the British government, and all those who live freely thanks to Alan’s work, I am very proud to say: we’re sorry. You deserved so much better.
Finally, Alan Turing is getting the recognition he deserves for inventing the computer – the Universal Machine that has transformed our world and will profoundly influence our future.
