
Who Invented the Computer? – Ian Watson

Ian Watson is the author of The Universal Machine and featured in a recent Four Way Interview.
Saturday June 23 2012 was the centenary of the birth of Alan Turing, the troubled genius who invented the modern computer. Why, though, do so few people recognize his name and his great achievements?
In 1936 the English mathematician Alan Turing published a paper, “On Computable Numbers, with an Application to the Entscheidungsproblem,” which became the foundation of computing. In it Turing presented a theoretical machine that could solve any problem that could be described by instructions encoded on a paper tape. One Turing Machine could calculate square roots, whilst another might solve Sudoku puzzles. Turing then demonstrated that you could construct a single Universal Machine that could simulate any other Turing Machine: one machine solving any problem for which a program could be written. Sound familiar? He’d invented the computer.
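To make that concrete, here is a minimal sketch of a Turing machine simulator in Python. The rule-table format and the bit-flipping example program are illustrative choices of mine, not Turing’s original notation; the point is that the simulator itself is fixed, and only the table of rules (the program) changes.

```python
# A minimal Turing machine simulator, as a sketch of the idea in Turing's
# 1936 paper. The transition-table format and the example program are
# illustrative choices, not Turing's own notation.

def run(tape, rules, state="start", head=0, blank=" "):
    """Run a Turing machine until it enters the 'halt' state.

    rules maps (state, symbol) -> (new_symbol, move, new_state),
    where move is -1 (left) or +1 (right).
    """
    tape = dict(enumerate(tape))          # sparse tape: position -> symbol
    while state != "halt":
        symbol = tape.get(head, blank)
        new_symbol, move, state = rules[(state, symbol)]
        tape[head] = new_symbol
        head += move
    return "".join(tape[i] for i in sorted(tape))

# Example program: walk right, flipping every bit, halt at the first blank.
flip_bits = {
    ("start", "0"): ("1", +1, "start"),
    ("start", "1"): ("0", +1, "start"),
    ("start", " "): (" ", -1, "halt"),
}

print(run("10110", flip_bits))  # -> "01001 " (trailing blank from the scan)
```

Feed the same `run` function a different rule table and it computes something entirely different, which is precisely the universal-machine insight.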
At the time, “computers” were people who did calculations. As the Allies prepared for WWII they faced a shortage of computers for military calculations. When men left for war the shortage got worse, so the US mechanized the problem by building the Harvard Mark I; it could do calculations in seconds that took a person hours. The British, meanwhile, needed mathematicians to crack the Nazis’ Enigma code.
Turing worked at Bletchley Park, perhaps better known as “Station X,” where code-breaking became an industrial process: 12,000 people working 24/7. Although the Poles had cracked Enigma before the war, the Nazis had since made the machine more complicated; there were now 10^114 possible permutations. Turing designed a machine, called the Bombe, that searched through the permutations, and by war’s end the British were reading all Enigma traffic. Historians agree that Turing’s work shortened the war by as much as two years, and Churchill would later say that Turing had made the single biggest contribution to Allied victory in the war.
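The real Bombe did not try to decrypt every permutation by brute force; it used “cribs” (guessed fragments of plaintext, such as the German word for weather report) to rule out machine settings that led to contradictions. The toy sketch below shows only the shape of that idea on a trivial shift cipher; Enigma’s rotors and plugboard, and the Bombe’s electromechanical contradiction-tracing, were vastly more intricate.

```python
# Toy illustration of crib-based key search. This is NOT the real Bombe
# algorithm: Enigma used rotors and a plugboard, and the Bombe traced
# logical contradictions rather than trying decryptions one by one.

def shift_decrypt(ciphertext, key):
    """Decrypt a Caesar-style shift cipher with the given key."""
    return "".join(chr((ord(c) - ord("A") - key) % 26 + ord("A"))
                   for c in ciphertext)

def find_keys(ciphertext, crib):
    """Return every key under which the guessed plaintext (crib) appears."""
    return [k for k in range(26) if crib in shift_decrypt(ciphertext, k)]

# Encrypt "WETTERBERICHT" (German: weather report, a classic crib source)
# by shifting each letter forward three places.
ciphertext = shift_decrypt("WETTERBERICHT", -3)
print(find_keys(ciphertext, "WETTER"))  # -> [3]
```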
As the 1950s progressed, business was quick to adopt computers, and as the technology advanced, business computing became an industry. These computers were all universal machines: you could program them to do anything.
There will positively be no internal alteration [of the computer] to be made even if we wish suddenly to switch from calculating the energy levels of the neon atom to the enumeration of groups of order 720. It may appear somewhat puzzling that this can be done. How can one expect a machine to do all this multitudinous variety of things? The answer is that we should consider the machine to be doing something quite simple, namely carrying out orders given to it in a standard form which it is able to understand. – Alan Turing
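Turing’s point, that one fixed machine changes its behaviour purely through the “orders” it is fed, is exactly how a modern interpreter works. The three-instruction machine below is invented for illustration, not any historical design; switching from one computation to another requires no internal alteration, only a different program.

```python
# A toy stored-program interpreter: the machine is one fixed loop, and
# only the program (the "orders in a standard form") changes. The
# three-op instruction set here is invented for illustration.

def execute(program):
    acc, pc = 0, 0                      # accumulator and program counter
    while pc < len(program):
        op, arg = program[pc]
        if op == "LOAD":                # acc = arg
            acc = arg
        elif op == "ADD":               # acc = acc + arg
            acc += arg
        elif op == "MUL":               # acc = acc * arg
            acc *= arg
        pc += 1
    return acc

# Two different "programs" on the same unaltered machine:
print(execute([("LOAD", 6), ("MUL", 7)]))             # -> 42
print(execute([("LOAD", 0), ("ADD", 3), ("ADD", 4)])) # -> 7
```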
By the 1970s a generation had grown up with “electronic brains”; they wanted their own personal computers. The problem was they had to build them. In 1975 a college dropout called Steve Wozniak built a simple computer around the MOS 6502 microprocessor, which he hooked up to a keyboard and TV. His friend Steve Jobs called it the Apple I and found a Silicon Valley shop that would buy 50 for $500 each. Apple had its first sale, and Silicon Valley’s start-up culture was born. Another dropout, Bill Gates, realized that PCs needed software and that people would pay for it; Microsoft would sell them programs.
Turing had another vision: that one day computers would think. But how would you know a computer was intelligent? He devised the Turing Test: a judge sitting at a computer terminal types questions to two unseen entities, a person and a computer, and then decides which entity is human. If the judge picks wrongly, the computer passes the test and is deemed intelligent.
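The test is simple enough to state as a protocol in a few lines. Everything in this sketch is invented for illustration (the canned respondents, the questions, the chance-level judge); a real test would put a person and a chat program behind the anonymous terminals.

```python
import random

# A sketch of the Turing Test as a protocol. The two respondents are
# trivial stand-ins; a real test would use a person and a chat program.

def human(question):
    return "I had toast for breakfast, why do you ask?"

def computer(question):
    return "I am unable to answer that question."

def one_round(judge):
    # Hide the two entities behind anonymous terminals A and B.
    a, b = random.sample([human, computer], 2)
    transcript = [(q, a(q), b(q)) for q in ["What did you eat today?",
                                            "What makes you laugh?"]]
    guess = judge(transcript)            # judge names the terminal it thinks is human
    truth = "A" if a is human else "B"
    return guess != truth                # True = judge fooled = computer "passes"

def naive_judge(transcript):
    return random.choice(["A", "B"])     # a judge no better than chance

fooled = sum(one_round(naive_judge) for _ in range(1000))
print(f"Computer passed in {fooled} of 1000 rounds")  # ~500 at chance level
```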
Artificial intelligence (AI) is entering your daily life. Car satnavs and Google search use AI; Apple’s iPhone can understand your voice and respond intelligently; car manufacturers are developing autonomous cars. Turing’s vision of AI will soon be a reality.
In 1952 Turing was prosecuted for being gay and sentenced to chemical castration. The sentence caused a deep depression, and in 1954 he committed suicide by eating an apple he’d poisoned. Outside academia Turing remained virtually unknown because his WWII work was top secret. Slowly word of his genius spread: in 1999 Time magazine named him one of the “100 Most Important People of the 20th Century,” stating, “The fact remains that everyone who taps at a keyboard, opening a spreadsheet or a word-processing program, is working on an incarnation of a Turing machine,” and in 2009 the British Prime Minister issued a public apology:
…on behalf of the British government, and all those who live freely thanks to Alan’s work, I am very proud to say: we’re sorry. You deserved so much better.
Finally Alan Turing is getting the recognition he deserves for inventing the computer: the Universal Machine that has transformed our world and will profoundly influence our future.
