Who Invented the Computer? – Ian Watson

Ian Watson is the author of The Universal Machine and featured in a recent Four Way Interview.
Saturday, June 23, 2012 was the centenary of the birth of Alan Turing, the troubled genius who invented the modern computer. Why, though, do so few people recognize his name and his great achievements?
In 1936 the English mathematician Alan Turing published a paper, On Computable Numbers, with an Application to the Entscheidungsproblem, which became the foundation of computing. In it Turing presented a theoretical machine that could solve any problem that could be described by instructions encoded on a paper tape. One Turing Machine could calculate square roots, whilst another might solve Sudoku puzzles. Turing then demonstrated that you could construct a single Universal Machine able to simulate any other Turing Machine. One machine solving any problem for which a program could be written – sound familiar? He’d invented the computer.
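To make the idea concrete, here is a minimal sketch of a Turing machine simulator in Python. It is not from Watson’s article, and the example “program” (a machine that flips a string of bits) is an invented illustration, but the loop shows the essential mechanism: read a symbol, look up an instruction, write, move, change state.

# Illustrative sketch of a single-tape Turing machine simulator.
# The transition table maps (state, symbol) -> (new state, symbol to write, move);
# the machine halts when no rule applies.
def run_turing_machine(program, tape, state="start", blank="_", max_steps=10_000):
    cells = dict(enumerate(tape))      # sparse tape: position -> symbol
    head = 0
    for _ in range(max_steps):
        symbol = cells.get(head, blank)
        if (state, symbol) not in program:
            break                      # no rule for this situation: halt
        state, write, move = program[(state, symbol)]
        cells[head] = write
        head += 1 if move == "R" else -1
    positions = sorted(cells)
    return "".join(cells[i] for i in range(positions[0], positions[-1] + 1))

# An invented example program: move right, flipping every bit, until a blank is hit.
flip_bits = {
    ("start", "0"): ("start", "1", "R"),
    ("start", "1"): ("start", "0", "R"),
}
print(run_turing_machine(flip_bits, "10110"))   # prints 01001

A Universal Machine is then simply another program for this same loop, one whose tape holds an encoded description of some other machine together with that machine’s input – which is exactly what a modern computer does when it runs software.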
At the time, “computers” were people who did calculations. As the Allies prepared for WWII they faced a shortage of computers for military calculations. When men left for war the shortage got worse, so the US mechanized the problem by building the Harvard Mark I; it could do calculations in seconds that took a person hours. The British also needed mathematicians to crack the Nazis’ Enigma code.
Turing worked at Bletchley Park, perhaps better known as “Station X,” where code-breaking became an industrial process: 12,000 people working 24/7. Although the Polish had cracked Enigma before the war, the Nazis had since made the machine more complicated; there were around 10¹¹⁴ permutations. Turing designed a machine, called the Bombe, that searched through the permutations, and by war’s end the British were reading all Enigma traffic. Historians agree that Turing shortened the war by as much as two years, and Churchill would later say that Turing had made the single biggest contribution to Allied victory in the war.
As the 1950s progressed, business was quick to use computers, and as the technology advanced, business computing became an industry. These computers were all universal machines – you could program them to do anything.
There will positively be no internal alteration [of the computer] to be made even if we wish suddenly to switch from calculating the energy levels of the neon atom to the enumeration of groups of order 720. It may appear somewhat puzzling that this can be done. How can one expect a machine to do all this multitudinous variety of things? The answer is that we should consider the machine to be doing something quite simple, namely carrying out orders given to it in a standard form which it is able to understand. – Alan Turing
By the 1970s a generation had grown up with “electronic brains,” and they wanted their own personal computers. The problem was they had to build them. In 1975 a college dropout called Steve Wozniak built a simple computer around the 6502 microprocessor, which he hooked up to a keyboard and TV. His friend Steve Jobs called it the Apple I, and found a Silicon Valley shop that would buy 50 of them for $500 each. Apple had its first sale and Silicon Valley’s start-up culture was born. Another dropout, Bill Gates, realized that PCs needed software and that people would pay for it – Microsoft would sell them programs.
Turing had another vision: one day computers would think. But how would you know a computer was intelligent? He devised the Turing Test: a judge sitting at a computer terminal types questions to two unseen entities, a person and a computer, and decides which is the human. If the judge mistakes the computer for the human, the computer passes the test and can be considered intelligent.
Artificial intelligence (AI) is entering your daily life. Car satnavs and Google search use AI, Apple’s iPhone can understand your voice and respond intelligently, and car manufacturers are developing autonomous cars. Turing’s vision of AI will soon be a reality.
In 1952 Turing was prosecuted for being gay and was sentenced to chemical castration. This caused depression and he committed suicide by eating an apple he’d poisoned. Outside of academia Turing remained virtually unknown because his WWII work was top secret. Slowly word of Turing’s genius spread. In 1999 Time Magazine named him one of the “100 Most Important People of the 20th Century,” stating: “The fact remains that everyone who taps at a keyboard, opening a spreadsheet or a word-processing program, is working on an incarnation of a Turing machine.” And in 2009 the British Prime Minister issued a public apology:
…on behalf of the British government, and all those who live freely thanks to Alan’s work, I am very proud to say: we’re sorry. You deserved so much better.
Finally, Alan Turing is getting the recognition he deserves for inventing the computer – the Universal Machine that has transformed our world and will profoundly influence our futures.
