Mega inanity

Since the lead-up to the Turing centennial in 2012, celebrating the birth of one of the great meta-mathematicians of the twentieth century, Alan Mathison Turing, I have observed with increasing horror the escalating hagiographic accounts of Turing’s undoubted historical achievements and the resulting perversion of the histories of twentieth-century science, mathematics and technology, and in particular the history of computing.

This abhorrence on my part is not based on a mere nodding acquaintance with Turing’s name but on a deep and long-time engagement with the man and his work. I served my apprenticeship as a historian of science over many years in a research project on the history of formal or mathematical logic. Formal logic is one of the so-called formal sciences, the others being mathematics and informatics (or computer science). I have spent my whole life studying the history of mathematics with a special interest in the history of computing, both in its abstract form and in its technological realisation in all sorts of calculating aids and machines. I also devoted a substantial part of my formal study of philosophy to the study of the philosophy of mathematics and the logical, meta-logical and meta-mathematical problems that this discipline, some would say unfortunately, generates. The histories of all of these intellectual streams flow together in the first half of the twentieth century in the work of such people as Leopold Löwenheim, Thoralf Skolem, Emil Post, Alfred Tarski, Kurt Gödel, Alonzo Church and Alan Turing amongst others. These people created a new discipline known as meta-mathematics whilst carrying out a programme delineated by David Hilbert.

Attempts to provide a solid foundation for mathematics using set theory and logic had run into serious problems with paradoxes. Hilbert thought the solution lay in developing each mathematical discipline as a strict axiomatic system and then proving that each axiomatic system possessed a set of required characteristics, thus ensuring the solidity and reliability of a given system. This concept of proving results about whole axiomatic systems is the meta- of meta-mathematics. The properties that Hilbert required for his axiomatic systems were consistency, which means the system should be shown to be free of contradictions; completeness, meaning that all of the theorems that belong to a particular discipline are deducible from its axiom system; and finally decidability, meaning that for any well-formed statement within the system it should be possible to produce an algorithmic process to decide whether the statement is true within the axiomatic system or not. An algorithm is like a cookery recipe: if you follow the steps correctly you will produce the right result.
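
To give a concrete, if anachronistic, sense of what an “algorithmic process” is, here is a minimal sketch in Python (my own illustrative choice, not anything taken from Hilbert’s programme) of one of the oldest recipes in mathematics, Euclid’s algorithm for the greatest common divisor: a finite list of simple steps that, followed mechanically, always terminates with the correct answer.

    # Euclid's algorithm: a finite recipe of simple steps that is guaranteed
    # to terminate with the right answer -- the kind of "algorithmic process"
    # Hilbert's decidability requirement asks for.
    def gcd(a: int, b: int) -> int:
        while b != 0:          # repeat one simple step...
            a, b = b, a % b    # ...replace (a, b) with (b, a mod b)
        return a               # ...until the remainder is zero

    print(gcd(1071, 462))  # prints 21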

The meta-mathematicians listed above showed by very ingenious methods that none of Hilbert’s aims could be fulfilled, bringing the dream of a secure foundation for mathematics crashing to the ground. Turing’s solution to the problem of decidability is an ingenious thought experiment, for which he is justifiably regarded as one of the meta-mathematical gods of the twentieth century. It was this work that led to him being employed as a code breaker at Bletchley Park during WWII, and eventually to the fame and disaster of the rest of his too-short life.

Unfortunately the attempts to restore Turing’s reputation since the centenary of his birth in 2012 have led to some terrible misrepresentations of his work and its consequences. I thought we had reached a low point in the ebb and flow of the centenary celebrations, but the release of “The Imitation Game”, the Alan Turing biopic, has produced a new series of false and inaccurate statements in the reviews. I was pleasantly surprised to see several reviews that attempt to correct some of the worst historical errors in the film. You can read a collection of reviews of the film in the most recent edition of the weekly histories of science, technology and medicine links list Whewell’s Gazette. Not having seen the film yet I can’t comment on it directly, but I was stunned when I read the following paragraph from the ABC News review of the film written by Alyssa Newcomb. It’s so bad you can only file it under: you can’t make this shit up.

The “Turing Machine” was the first modern computer to logically process information, running on interchangeable software and essentially laying the groundwork for every computing device we have today — from laptops to smartphones.

Before I analyse this train wreck of a historical statement I would just like to emphasise that this is not the Little Piddlington School Gazette, whose enthusiastic but slightly slapdash twelve-year-old film critic got his facts a little mixed up, but a review that appeared on the website of a major American media company, and as such it is totally unacceptable however you view it.

The first compound statement contains a double whammy of mega-inane falsehood, and I had real problems deciding where to begin, finally plumping for the “first modern computer to logically process information, running on interchangeable software”. Alan Turing had nothing to do with the first such machine, the honour going to Konrad Zuse’s Z3, which Zuse completed in 1941. The first such machine in whose design and construction Alan Turing was involved was the ACE, produced at the National Physical Laboratory, in London, in 1949. In the intervening years Atanasoff and Berry, Tommy Flowers, Howard Aiken, as well as Eckert and Mauchly, had all designed and constructed computers of various types and abilities. To credit Turing with sole responsibility for our digital computer age is not only historically inaccurate but also highly insulting to all the others who made substantial and important contributions to the evolution of the computer. Many, many more than I’ve named here.

We now turn to the second error contained in this wonderfully inane opening statement and return to the subject of meta-mathematics. The “Turing Machine” is not a computer at all; it is Alan Turing’s truly ingenious thought-experiment solution to Hilbert’s decidability problem. Turing imagined a very simple machine that consists of a scanning-reading head and an infinite tape that runs under the scanning head. The head can read instructions on the tape and execute them, moving the tape right or left or doing nothing. The problem then reduces to the question of which sets of instructions on the tape eventually come to a stop (decidable) and which lead to an infinite loop (undecidable). Turing developed this idea into a machine capable of computing any computable function (a universal Turing machine) and thus created a theoretical model for all computers. This is of course a long way from a practical, real mechanical realisation, i.e. a computer, but it does provide a theoretical measure with which to describe the capabilities of a mechanical computing device. A computer that is the equivalent of a universal Turing machine is called Turing complete. For example, Zuse’s Z3 was Turing complete, whereas Colossus, the computer designed and constructed by Tommy Flowers for decoding work at Bletchley Park, was not.
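
To make the thought experiment a little more concrete, here is a minimal sketch of such a machine in Python (the encoding and names are my own illustration, not Turing’s notation). The example transition table merely appends a 1 to a unary number, but the same little driver loop will run any table you give it, which is the sense in which one very simple mechanism suffices for every computable function. Note too that the driver can only give up after a fixed number of steps; it cannot in general decide whether a machine that has not yet halted ever will.

    # A minimal single-tape Turing machine simulator (illustrative sketch only).
    # A machine is a transition table: (state, symbol) -> (write, move, new_state).
    def run(transitions, tape, state="start", max_steps=1000):
        cells = dict(enumerate(tape))   # sparse tape; blank cells read as "_"
        head = 0
        for _ in range(max_steps):
            if state == "halt":
                return "".join(cells[i] for i in sorted(cells))
            symbol = cells.get(head, "_")
            write, move, state = transitions[(state, symbol)]
            cells[head] = write
            head += {"R": 1, "L": -1, "N": 0}[move]
        return None  # gave up: may loop forever, may just need more steps

    # Example table: append one "1" to a unary number, i.e. compute n + 1.
    successor = {
        ("start", "1"): ("1", "R", "start"),  # move right over the existing 1s
        ("start", "_"): ("1", "N", "halt"),   # write one more 1, then halt
    }

    print(run(successor, "111"))  # prints 1111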

Turing’s work played and continues to play an important role in the theory of computation, but historically it had very little effect on the development of real computers. Attributing the digital computer age to Turing and his work is not just historically wrong but is, as I already stated above, highly insulting to all of those who really did bring about that age. Turing is a fascinating, brilliant and, because of what happened to him as a result of the persecution of homosexuals, tragic figure in the histories of mathematics, logic and computing in the twentieth century, but attributing achievements to him that he didn’t make does not honour his memory, which certainly should be honoured, but ridicules it.

I should, in fairness to the author of the film review that I took as the motivation for this post, say that she seems to be channelling misinformation from the film distributors, as I’ve read very similar stupid claims in other previews and reviews of the film.

Filed under History of Computing, History of Logic, History of Mathematics, Myths of Science

16 responses to “Mega inanity”

  1. Jeb

    “the attempts to restore Turing’s reputation”
    I suspect that makes him a high-value narrative target, after listening to a ranting scientist (who, from weary experience and a touch of personal bias, I suspect may have been a physicist) on the repeal of Turing’s conviction, which appeared to present science as holding the shield of truth, justice, social equality and the protection of innocents, while wielding the usual weary weapon to smite the ignorant.

    It reminded me rather strongly of the way some skeptics deal with the subject of women and witchcraft.

    A tall-tale teller’s dream: it ties up a lot of time in a simple and repetitive narrative format and allows total focus on the action parts, where the brave scientist saves humanity from evil demons, time and time again.

    Who needs Star Wars 7 when we have this.

    • abscd

      Do you understand what you said here?

      • Jeb

        Standard ploy used in nakedly political arguments. Mom’s apple pie, cute puppy dogs, war veterans, persecuted gay men, who are then contrasted with unsympathetic figures, for example businessmen, Jews, the state, the church, etc.

        Slots nicely into a wider story in which science overcomes a series of trials in which it faces persecution from a multi-headed creature of unreason.

        Gives it a monopoly on skepticism and the empirical enterprise (what’s the point in being a skeptic if everyone else is?).

  2. jimhexis

    Over and beyond their ideological utility and emotional appeal, the narrative cliches have another huge advantage over serious history. The public mind just doesn’t have much closet space. It’s hard for professional historians or even people like me who just read a lot to keep in mind how little even supposedly educated people know about the past, particularly the scientific past, or how little patience they have for accounts that go beyond edifying anecdotes or scandalous tales. As you point out, it’s simply not true that Turing was responsible for the computer; but like all of the other Father of X stories that irritate you so much, it’s a mighty economical proposition. Which is one uncomplicated reason the public goes on believing such things and the media and even teachers and textbook writers go on repeating ’em, even when they know better.

  3. Walter Hehl

    There are two personalities I admire in computer history: Alan Turing and Konrad Zuse. My criterion is that the more you learn about them, the more you are astonished by their foresight and independent thoughts.
    Do read Alan Turing’s easy-to-read paper from 1950, “Computing Machinery and Intelligence” – still valid today for any discussion with enemies or deniers of AI. Of course it is pure vision, but really great. 1950!
    Have a look at the workings of Zuse’s Z3 machine running with the noise of a 3 Hz clock frequency (videos are available on YouTube). He created several concepts of a digital computer and got them running himself! Independent of, and against, his surrounding local Zeitgeist (he did not even get a German patent because his design was rated as “not sufficiently innovative”). And he also had philosophical ideas, such as the “thinking space”, that everything should be (or is) digital.
    Alan as mathematician-theoretician and Konrad as engineer-practitioner make a great couple. Their values and contributions are relatively easy to see; it is much more difficult for the third personality in the context of computer-history PR: Ada Lovelace.
    But Ada was, in her own words, mainly a “poetic scientist”. I like her for phrases like this: “(computers) weave thoughts like a Jacquard loom weaves flowers and leaves”. And for her energy as a rich and noble laywoman to engage with mathematics without an academic background.

    • Roger McDermott

      What is an enemy or a denier of AI?

      • Walter Hehl

        Although most people believe that the brain is a kind of computer, many (including some philosophers) think that human thinking is something genuinely different: e.g. only humans can “really” understand or “really” be conscious. This attitude is questionable; see this quote from Wikipedia:
        ‘Douglas Hofstadter expresses the AI effect concisely by quoting Tesler’s Theorem: “AI is whatever hasn’t been done yet.”’

        The definition of AI (and of its deniers and opponents) shifts with time – what has been achieved is forgotten (and trivialized), and what has not is “impossible in principle” for some [whom I call deniers]… It would be interesting to see where you see the limits of AI!
        Btw, isn’t it somehow more astonishing that the brain can think with “meat” and “slime” than that a nice computer can with clean silicon circuits and sharp instructions?

  4. Pingback: Whewell’s Gazette: Vol. #24 | Whewell's Ghost

  5. Phillip Helbig

    Your last paragraph is missing a verb.


  6. araybold

    Turing’s achievements are difficult to summarize because of the abstract nature of his machine, the problem it addressed, and its relationship to actual computers. I am sure you are aware that your own summary, presented here, contains simplifications that a suitably pedantic person could take issue with.

    The quoted sentence, which is, indeed, truly inane, contains the faintest outlines of the concept of Turing equivalence, but the author clearly has no idea what that is, and almost certainly does not know of it at all. I imagine this is the end of a chain of simplifications, each stripping away more facts until none remain.

    If it weren’t for Turing’s undeniably important war work, it is unlikely that the public would be aware of him (I don’t expect to see a general-release movie featuring Post or Church). The fact that this work involved computing machines (though not Turing-equivalent ones) further muddies the waters.

  7. What about the US Navy’s Torpedo Data Computer? An original model, at least, came out before Zuse, even if not fully electronic.

    • Not really a computer in the sense of the quote. There are also earlier special computing devices such as Vannevar Bush’s differential analyzer from 1927, which inspired the efforts of Atanasoff and Berry, Howard Aiken, and Eckert and Mauchly.

  8. Pingback: Computer History: Links And Resources (18) | Angel "Java" Lopez on Blog
