
Who Invented the Computer? – Ian Watson

Ian Watson is the author of The Universal Machine and featured in a recent Four Way Interview.
Saturday, June 23, 2012 was the centenary of the birth of Alan Turing, the troubled genius who invented the modern computer. Why, though, do so few people recognize his name and his great achievements?
In 1936 the English mathematician Alan Turing published a paper, "On Computable Numbers, with an Application to the Entscheidungsproblem", which became the foundation of computing. In it Turing presented a theoretical machine that could solve any problem that could be described by instructions encoded on a paper tape. One Turing Machine could calculate square roots, whilst another might solve Sudoku puzzles. Turing then demonstrated that you could construct a single Universal Machine that could simulate any Turing Machine: one machine solving any problem for which a program could be written – sound familiar? He'd invented the computer.
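Turing's machine is simple enough to sketch in a few lines of modern code. The simulator below is a minimal, illustrative sketch; the state names and the bit-flipping example machine are invented for this demo, not taken from Turing's paper:

```python
# A minimal Turing machine simulator (illustrative sketch only).
def run_turing_machine(tape, transitions, state="start", halt="halt"):
    tape = dict(enumerate(tape))  # sparse tape: position -> symbol
    pos = 0
    while state != halt:
        symbol = tape.get(pos, "_")          # "_" is the blank symbol
        state, write, move = transitions[(state, symbol)]
        tape[pos] = write
        pos += 1 if move == "R" else -1
    # Read the tape back in order, dropping blanks at the ends
    return "".join(tape[i] for i in sorted(tape)).strip("_")

# Example machine: flip every bit, halting at the first blank.
flip = {
    ("start", "0"): ("start", "1", "R"),
    ("start", "1"): ("start", "0", "R"),
    ("start", "_"): ("halt",  "_", "R"),
}

print(run_turing_machine("1011", flip))  # -> 0100
```

Feed the same simulator a different transition table and it performs a completely different computation – which is exactly the point of the Universal Machine.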
Then, "computers" were people who did calculations. As the Allies prepared for WWII they faced a shortage of computers for military calculations. When men left for war the shortage got worse, so the US mechanized the problem by building the Harvard Mark 1; it could do calculations in seconds that took a person hours. The British, meanwhile, needed mathematicians to crack the Nazis' Enigma code.
Turing worked at Bletchley Park, perhaps better known as "Station X," where code-breaking became an industrial process: 12,000 people working 24/7. Although the Poles had cracked Enigma before the war, the Nazis had since made it more complicated; there were now 10¹¹⁴ permutations. Turing designed a machine, called the Bombe, that searched through the permutations, and by war's end the British were reading all Enigma traffic. Historians agree that Turing shortened the war by as much as two years, and Churchill would later say that Turing had made the single biggest contribution to Allied victory in the war.
As the 1950s progressed business was quick to use computers and as the technology advanced business computing became an industry. These computers were all universal machines – you could program them to do anything.
There will positively be no internal alteration [of the computer] to be made even if we wish suddenly to switch from calculating the energy levels of the neon atom to the enumeration of groups of order 720. It may appear somewhat puzzling that this can be done. How can one expect a machine to do all this multitudinous variety of things? The answer is that we should consider the machine to be doing something quite simple, namely carrying out orders given to it in a standard form which it is able to understand. – Alan Turing
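The "orders given to it in a standard form" that Turing describes can be illustrated with a toy interpreter: one fixed loop of code whose task changes entirely when you swap the program it is fed. The instruction set below is invented purely for illustration:

```python
# A toy "universal" machine: one fixed loop that carries out orders
# in a standard form. Swapping the program changes the task, never
# the machine itself.
def execute(program):
    stack = []
    for op, *args in program:
        if op == "push":
            stack.append(args[0])
        elif op == "add":
            stack.append(stack.pop() + stack.pop())
        elif op == "mul":
            stack.append(stack.pop() * stack.pop())
    return stack.pop()

# The same machine given two different sets of "orders":
sum_prog = [("push", 6), ("push", 7), ("add",)]
mul_prog = [("push", 6), ("push", 7), ("mul",)]
print(execute(sum_prog))  # -> 13
print(execute(mul_prog))  # -> 42
```

No "internal alteration" is made between the two runs – only the program changes, just as Turing promised.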
By the 1970s a generation had been born who grew up with "electronic brains;" they wanted their own personal computers. The problem was they had to build them. In 1975 a college dropout called Steve Wozniak built a simple computer around the MOS 6502 microprocessor, which he hooked up to a keyboard and TV. His friend, Steve Jobs, called it the Apple I, and found a Silicon Valley shop that would buy 50 for $500 each. Apple had its first sale and Silicon Valley's start-up culture was born. Another dropout, Bill Gates, realized that PCs needed software and that people would pay for it – Microsoft would sell them programs.
Turing had another vision: that one day computers would think. But how would you know a computer was intelligent? He devised the Turing Test: a judge sitting at a computer terminal types questions to two unseen entities, a person and a computer, and then decides which entity is human. If the judge guesses wrongly, the computer passes the test and is, by this measure, intelligent.
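The protocol is concrete enough to sketch as code. Everything below – the toy respondents, the judge, the label scheme – is a stand-in invented for illustration, not a serious test implementation:

```python
import random

# A minimal sketch of the Turing Test protocol.
def turing_test(judge, human, machine, questions):
    """The judge reads answers from labels 'A' and 'B' and names the
    human. The machine passes the test if the judge guesses wrong."""
    pair = random.sample([("human", human), ("machine", machine)], 2)
    labels = {"A": pair[0], "B": pair[1]}
    answers = {lbl: [fn(q) for q in questions]
               for lbl, (_, fn) in labels.items()}
    guess = judge(questions, answers)       # judge returns "A" or "B"
    return labels[guess][0] != "human"      # True means the machine passed

# Toy respondents: this machine gives itself away with a telltale style.
human_player = lambda q: "I suppose " + q.lower()
machine_player = lambda q: "COMPUTING: " + q.upper()

# A discerning judge spots the telltale prefix and names the other label.
def discerning_judge(questions, answers):
    return "B" if answers["A"][0].startswith("COMPUTING") else "A"

print(turing_test(discerning_judge, human_player, machine_player,
                  ["Do you dream?"]))  # -> False: this machine never passes
```

A machine passes only when its answers are indistinguishable from a person's – exactly the bar Turing set.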
Artificial intelligence (AI) is entering your daily life. Car satnavs and Google search use AI, Apple's iPhone can understand your voice and respond intelligently, and car manufacturers are developing autonomous cars. Turing's vision of AI will soon be a reality.
In 1952 Turing was prosecuted for being gay and was sentenced to chemical castration. The treatment caused depression, and he committed suicide by eating an apple he'd poisoned. Outside academia Turing remained virtually unknown because his WWII work was top secret. Slowly, word of Turing's genius spread. In 1999 Time magazine named him one of the "100 Most Important People of the 20th Century," stating: "The fact remains that everyone who taps at a keyboard, opening a spreadsheet or a word-processing program, is working on an incarnation of a Turing machine." And in 2009 the British Prime Minister issued a public apology:
…on behalf of the British government, and all those who live freely thanks to Alan’s work, I am very proud to say: we’re sorry. You deserved so much better.
Finally, Alan Turing is getting the recognition he deserves for inventing the computer – the Universal Machine that has transformed our world and will profoundly influence our futures.
