Since the lead-up to the Turing centenary in 2012, celebrating the birth of one of the great meta-mathematicians of the twentieth century, Alan Mathison Turing, I have observed with increasing horror the escalating hagiographic accounts of Turing’s undoubted historical achievements and the resulting distortion of the histories of twentieth-century science, mathematics and technology, in particular the history of computing.

This abhorrence on my part is not based on a mere nodding acquaintance with Turing’s name but on a deep and long-standing engagement with the man and his work. I served my apprenticeship as a historian of science over many years in a research project on the history of formal, or mathematical, logic. Formal logic is one of the so-called formal sciences, the others being mathematics and informatics (or computer science). I have spent my whole life studying the history of mathematics, with a special interest in the history of computing both in its abstract form and in its technological realisation in all sorts of calculating aids and machines. I also devoted a substantial part of my formal study of philosophy to the philosophy of mathematics and the logical, meta-logical and meta-mathematical problems that this discipline, some would say unfortunately, generates. The histories of all of these intellectual streams flow together in the first half of the twentieth century in the work of such people as Leopold Löwenheim, Thoralf Skolem, Emil Post, Alfred Tarski, Kurt Gödel, Alonzo Church and Alan Turing, amongst others. These people created a new discipline known as meta-mathematics whilst carrying out a programme delineated by David Hilbert.

Attempts to provide a solid foundation for mathematics using set theory and logic had run into serious problems with paradoxes. Hilbert thought the solution lay in developing each mathematical discipline as a strict axiomatic system and then proving that each axiomatic system possessed a set of required characteristics, thus ensuring the solidity and reliability of a given system. This concept of proving theorems about complete axiomatic systems is the meta- of meta-mathematics. The properties that Hilbert required of his axiomatic systems were consistency, meaning the system should be shown to be free of contradictions; completeness, meaning that all of the theorems that belong to a particular discipline are deducible from its axiom system; and finally decidability, meaning that for any well-formed statement within the system it should be possible to produce an algorithmic process that decides whether the statement is true within the axiomatic system or not. An algorithm is like a cookery recipe: if you follow the steps correctly, you will produce the right result.
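To make the recipe analogy concrete, here is a minimal sketch of a classic algorithm, Euclid’s method for the greatest common divisor. The example is mine, not one drawn from Hilbert’s programme; it simply shows what “a fixed sequence of steps guaranteed to produce the right result” looks like in practice.

```python
# Euclid's algorithm: a textbook example of an algorithmic "recipe".
# Repeat one simple step until nothing is left to do; both termination
# and correctness are guaranteed by the mathematics behind the steps.
def gcd(a, b):
    while b != 0:
        a, b = b, a % b  # replace the pair (a, b) by (b, a mod b)
    return a

print(gcd(252, 105))  # → 21
```

Decidability asks for exactly this kind of guaranteed, mechanical procedure, but applied to the truth of arbitrary well-formed statements rather than to a single arithmetical task.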

The meta-mathematicians listed above showed by very ingenious methods that none of Hilbert’s aims could be fulfilled, bringing the dream of a secure foundation for mathematics crashing to the ground. Turing’s solution to the problem of decidability is an ingenious thought experiment, for which he is justifiably regarded as one of the meta-mathematical gods of the twentieth century. It was this work that led to his being employed as a code breaker at Bletchley Park during WWII and eventually to the fame and disaster of the rest of his too-short life.

Unfortunately the attempts to restore Turing’s reputation since the centenary of his birth in 2012 have led to some terrible misrepresentations of his work and its consequences. I thought we had reached a low point in the ebb and flow of the centenary celebrations, but the release of “The Imitation Game”, the Alan Turing biopic, has produced a new series of false and inaccurate statements in the reviews. I was pleasantly surprised to see several reviews that attempt to correct some of the worst historical errors in the film. You can read a collection of reviews of the film in the most recent edition of the weekly histories of science, technology and medicine links list, Whewell’s Gazette. Not having seen the film yet, I can’t comment on it, but I was stunned when I read the following paragraph from the abc NEWS review of the film written by Alyssa Newcomb. It’s so bad you can only file it under “you can’t make this shit up”.

Before I analyse this train wreck of a historical statement, I would just like to emphasise that this is not the Little Piddlington School Gazette, whose enthusiastic but slightly slapdash twelve-year-old film critic got his facts a little mixed up, but a review that appeared on the website of a major American media company, and as such it is totally unacceptable however you view it.

The first compound statement contains a double whammy of mega-inane falsehood, and I had real problems deciding where to begin, finally plumping for the claim of the “first modern computer to logically process information, running on interchangeable software”. Alan Turing had nothing to do with the first such machine, that honour going to Konrad Zuse’s Z3, which Zuse completed in 1941. The first such machine in whose design and construction Alan Turing was involved was the Pilot ACE, produced at the National Physical Laboratory in London, which first ran in 1950. In the intervening years Atanasoff and Berry, Tommy Flowers, Howard Aiken, as well as Eckert and Mauchly, had all designed and constructed computers of various types and abilities. To credit Turing with sole responsibility for our digital computer age is not only historically inaccurate but also highly insulting to all the others who made substantial and important contributions to the evolution of the computer. Many, many more than I’ve named here.

We now turn to the second error contained in this wonderfully inane opening statement and return to the subject of meta-mathematics. The “Turing Machine” is not a computer at all; it is Alan Turing’s truly ingenious thought-experiment solution to Hilbert’s decidability problem. Turing imagined a very simple machine consisting of a scanning/reading head and an infinite tape that runs under it. The head can read the symbols on the tape, write new ones, and move the tape left or right or leave it in place, all according to a fixed table of instructions. The question of decidability then reduces to the question of which sets of instructions eventually come to a halt (decidable) and which run on forever (undecidable). Turing developed this idea into a machine capable of computing any computable function (a universal Turing machine) and thus created a theoretical model for all computers. This is of course a long way from a practical, real mechanical realisation, i.e. a computer, but it does provide a theoretical measure with which to describe the capabilities of a mechanical computing device. A computer that is the equivalent of a universal Turing machine is called Turing complete. For example, Zuse’s Z3 was Turing complete, whereas Colossus, the computer designed and constructed by Tommy Flowers for decoding work at Bletchley Park, was not.
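The thought experiment described above can be sketched in a few lines of code. What follows is my own toy illustration, not anything taken from Turing’s paper: a simulator for a one-tape machine, with a made-up rule set that adds one to a number written in unary. Note how the simulator can only give up after a fixed number of steps; deciding in general whether an arbitrary machine would ever halt is exactly what Turing proved impossible.

```python
def run_turing_machine(rules, tape, state="start", max_steps=1000):
    """Simulate a one-tape Turing machine (a toy sketch, not Turing's notation).

    rules maps (state, symbol) -> (new_symbol, move, new_state),
    where move is -1 (left), +1 (right) or 0 (stay put).
    Returns the final tape contents if the machine halts (no rule applies),
    or None if it runs past max_steps -- we cannot in general decide
    whether it would ever halt.
    """
    cells = dict(enumerate(tape))  # sparse tape; unvisited cells are blank "_"
    head = 0
    for _ in range(max_steps):
        symbol = cells.get(head, "_")
        if (state, symbol) not in rules:  # no applicable rule: the machine halts
            used = sorted(cells)
            return "".join(cells.get(i, "_") for i in range(used[0], used[-1] + 1))
        new_symbol, move, state = rules[(state, symbol)]
        cells[head] = new_symbol
        head += move
    return None  # gave up: did not halt within max_steps

# Toy rule set: scan right over a block of 1s, then append one more 1
# (unary "add one"). The state "done" has no rules, so the machine halts there.
increment_rules = {
    ("start", "1"): ("1", +1, "start"),
    ("start", "_"): ("1", 0, "done"),
}

print(run_turing_machine(increment_rules, "111"))  # → 1111
```

A rule set such as `{("start", "1"): ("1", 0, "start")}` loops forever on input “1”, and the simulator duly returns None: it has not decided that the machine never halts, only that it had not halted yet, which is the crux of the undecidability result.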

Turing’s work played and continues to play an important role in the theory of computation, but historically it had very little effect on the development of real computers. Attributing the digital computer age to Turing and his work is not just historically wrong but, as I already stated above, highly insulting to all of those who really did bring about that age. Turing is a fascinating, brilliant and, because of what he suffered through the persecution of homosexuals, tragic figure in the histories of mathematics, logic and computing in the twentieth century, but attributing achievements to him that he didn’t make does not honour his memory, which certainly should be honoured, but ridicules it.

I should add, in fairness to the author of the film review that provided the motivation for this post, that she seems to be channelling misinformation from the film’s distributors, as I’ve read very similar false claims in other previews and reviews of the film.