Who Invented the Computer? – Ian Watson

Ian Watson is the author of The Universal Machine and featured in a recent Four Way Interview.
Saturday, June 23, 2012 was the centenary of the birth of Alan Turing, the troubled genius who invented the modern computer. Why, though, do so few people recognize his name and his great achievements?
In 1936 the English mathematician Alan Turing published a paper, On Computable Numbers, with an Application to the Entscheidungsproblem, which became the foundation of computing. In it Turing presented a theoretical machine that could solve any problem that could be described by instructions encoded on a paper tape. One Turing Machine could calculate square roots, whilst another might solve Sudoku puzzles. Turing demonstrated that you could construct a single Universal Machine that could simulate any Turing Machine. One machine solving any problem for which a program could be written – sound familiar? He'd invented the computer.
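To make the idea concrete, here is a minimal sketch of a Turing machine simulator in Python. The encoding, the function name run_turing_machine, and the bit-flipping example machine are all illustrative assumptions for this post, not Turing's original notation:

```python
# A minimal sketch of a Turing machine simulator, assuming a simple
# transition-table encoding; all names here are illustrative.

def run_turing_machine(transitions, tape, state="start", blank="_", max_steps=10_000):
    """transitions maps (state, symbol) -> (next_state, write_symbol, move),
    where move is -1 (left), +1 (right) or 0 (stay). Stops in state 'halt'."""
    cells = dict(enumerate(tape))  # sparse tape: position -> symbol
    head = 0
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = cells.get(head, blank)
        state, cells[head], move = transitions[(state, symbol)]
        head += move
    return "".join(cells[i] for i in sorted(cells))

# A tiny example machine: flip every bit on the tape, then halt.
flip_bits = {
    ("start", "0"): ("start", "1", +1),
    ("start", "1"): ("start", "0", +1),
    ("start", "_"): ("halt", "_", 0),
}
print(run_turing_machine(flip_bits, "1011"))  # prints 0100_
```

Turing's insight was that the simulator itself is just another machine: feed it a different transition table as data and it becomes a different machine, which is exactly what a stored-program computer does.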
Back then, "computers" were people who did calculations. As the Allies prepared for WWII they faced a shortage of computers for military calculations. When men left for war the shortage got worse, so the US mechanized the problem, building the Harvard Mark I; it could do calculations in seconds that took a person hours. The British also needed mathematicians to crack the Nazis' Enigma code.
Turing worked at Bletchley Park, perhaps better known as "Station X," where code-breaking became an industrial process, with 12,000 people working 24/7. Although the Poles had cracked Enigma before the war, the Nazis had since made it more complicated; there were now 10^114 permutations. Turing designed a machine, called the Bombe, that searched through the permutations, and by war's end the British were reading all Enigma traffic. Historians agree that Turing shortened the war by as much as two years, and Churchill would later say that Turing had made the single biggest contribution to Allied victory in the war.
As the 1950s progressed, business was quick to use computers, and as the technology advanced, business computing became an industry. These computers were all universal machines – you could program them to do anything.
There will positively be no internal alteration [of the computer] to be made even if we wish suddenly to switch from calculating the energy levels of the neon atom to the enumeration of groups of order 720. It may appear somewhat puzzling that this can be done. How can one expect a machine to do all this multitudinous variety of things? The answer is that we should consider the machine to be doing something quite simple, namely carrying out orders given to it in a standard form which it is able to understand. – Alan Turing
By the 1970s a generation had grown up with "electronic brains," and they wanted their own personal computers. The problem was they had to build them. In 1975 a college dropout called Steve Wozniak built a simple computer around the MOS 6502 microprocessor, which he hooked up to a keyboard and TV. His friend Steve Jobs called it the Apple I and found a Silicon Valley shop that would buy 50 for $500 each. Apple had its first sale and Silicon Valley's start-up culture was born. Another dropout, Bill Gates, realized that PCs needed software and that people would pay for it – Microsoft would sell them programs.
Turing had another vision: one day computers would think. But how would you know a computer was intelligent? He devised the Turing Test: a judge sitting at a computer terminal types questions to two hidden entities, a person and a computer, and must decide which is human. If the judge gets it wrong, the computer passes the test and can be said to be intelligent.
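As a rough illustration of that protocol (a sketch, not anything Turing specified), here is how a single round might look in Python; the canned-reply "computer" and all the names are illustrative assumptions:

```python
# A rough sketch of the Turing Test protocol described above; the
# canned-reply bot stands in for a real conversational program.

import random

def canned_reply(question):
    # Stand-in for the computer under test.
    return "That's an interesting question - what do you think?"

def turing_test(questions, human_answers):
    # Hide the human and the computer behind randomly assigned channels,
    # so the judge can rely only on the answers, not on position.
    channels = {"A": "human", "B": "computer"}
    if random.random() < 0.5:
        channels = {"A": "computer", "B": "human"}

    for question, human_answer in zip(questions, human_answers):
        print(f"Judge: {question}")
        for channel, who in channels.items():
            answer = human_answer if who == "human" else canned_reply(question)
            print(f"  [{channel}] {answer}")

    guess = input("Judge, which channel is the human (A/B)? ").strip().upper()
    # The computer passes if the judge misidentifies the human.
    return channels.get(guess, "human") != "human"

# Example session:
# passed = turing_test(["What is your favourite colour?"], ["Blue, I suppose."])
```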
Artificial intelligence (AI) is entering your daily life. Car satnavs and Google search use AI, Apple's iPhone can understand your voice and respond intelligently, and car manufacturers are developing autonomous cars. Turing's vision of AI will soon be a reality.
In 1952 Turing was prosecuted for being gay and sentenced to chemical castration. This caused depression, and in 1954 he committed suicide by eating an apple he'd poisoned. Outside academia Turing remained virtually unknown because his WWII work was top secret. Slowly word of Turing's genius spread: in 1999 Time magazine named him one of the "100 Most Important People of the 20th Century," stating, "The fact remains that everyone who taps at a keyboard, opening a spreadsheet or a word-processing program, is working on an incarnation of a Turing machine," and in 2009 the British Prime Minister issued a public apology:
…on behalf of the British government, and all those who live freely thanks to Alan’s work, I am very proud to say: we’re sorry. You deserved so much better.
Finally, Alan Turing is getting the recognition he deserves for inventing the computer – the Universal Machine that has transformed our world and will profoundly influence our futures.
