Category Archives: History of Computing

A double bicentennial – George contra Ada – Reality contra Perception

The end of this year sees a double English bicentennial in the history of computing. On 2 November we celebrate the two hundredth anniversary of the birth of the mathematician and logician George Boole, then on 10 December the two hundredth anniversary of the birth of ‘science writer’ Augusta Ada King, Countess of Lovelace. It is an interesting exercise to take a brief look at how these two bicentennials are being perceived in the public sphere.

As I have pointed out in several earlier posts, Ada was a member of the minor aristocracy who, although she never knew her father, had a wealthy, well-connected mother. She had access to the highest social and intellectual circles of early Victorian London. Despite being mentored and tutored by the best that London had to offer, she failed totally in mastering more than elementary mathematics. So, as I have also pointed out more than once, to call her a mathematician is a very poor quality joke. Her only ‘scientific’ contribution was to translate a memoir on Babbage’s Analytical Engine from French into English, to which she appended a series of new notes. There is very substantial internal and external evidence that these notes in fact stem from Babbage and not Ada and that she only gave them linguistic form. What we have here is basically a journalistic interview and not a piece of original work. It is a historical fact that she did not write the first computer programme, as is still repeated ad nauseam every time her name is mentioned.

However the acolytes of the Cult of the Holy Saint Ada are banging the advertising drum for her bicentennial on a level comparable to that accorded to Einstein for the centenary of the General Theory of Relativity. On social media ‘Finding Ada’ are obviously planning massive celebrations, which they have already indicated, although their exact nature has yet to be revealed. More worrying is the publication of the graphic novel The Thrilling Adventures of Lovelace and Babbage: The (Mostly) True Story of the First Computer (note who gets first billing!) by the animator and cartoonist Sydney Padua. The Analytical Engine was of course not the first computer; that honour goes to Babbage’s Difference Engine. More importantly, Padua’s novel is not even remotely ‘mostly’ true but largely fictional. This wouldn’t matter that much if said book had not received major media attention, attention that compounded the error by conveniently forgetting the ‘mostly’. The biggest lie in the work of fiction is the claim that Ada was somehow directly involved in the conception and construction of the Analytical Engine. In reality she had absolutely nothing to do with either its conception or its construction.

This deliberate misconception has been compounded by an attempt, widely disseminated on social media, to get support for a Lovelace and Babbage Analytical Engine Lego set. The promoter of this enterprise has written in his blurb:

Ada Lovelace (1815-1852) is widely credited as the first computer scientist and Charles Babbage (1791-1871) is best remembered for originating the concept of a programmable computer. Together they collaborated on Babbage’s early mechanical general-purpose computer, the Analytical Engine.

Widely credited by whom? If anybody is the first computer scientist in this setup then it’s Babbage. Others such as Leibniz speculated on what we now call computer science long before Ada was born, so I think that is another piece of hype that we can commit to the trashcan. Much more important is the fact that they did not collaborate on the Analytical Engine; that was solely Babbage’s baby. This factually false hype is compounded in the following tweet from 21 July, which linked to the Lego promotion:

Historical lego [sic] of Ada Lovelace’s conception of the first programmable computer

To give some perspective to the whole issue it is instructive to ask about what in German is called the ‘Wirkungsgeschichte’, best translated as historical impact, of Babbage’s efforts to promote and build his computers, including the now notorious Menabrea memoir, irrespective of who actually formulated the added notes. The impact of all of Babbage’s computer endeavours on the history of the computer is almost nothing. I say almost because, due to Turing, the notes did play a minor role in the early phases of the post World War II artificial intelligence debate. However one could get the impression from the efforts of the Ada Lovelace fan club, strongly supported by the media, that this was a highly significant contribution to the history of computing that deserves to be massively celebrated on the Lovelace bicentennial.

Let us now turn our attention to the subject of our other bicentennial celebration, George Boole. Born into a working class family in Lincoln, Boole had little formal education. However his father was a self-educated man with a thirst for knowledge, who instilled the same characteristics in his son. With some assistance he taught himself Latin and Greek, and later French, German and Italian in order to be able to read advanced continental mathematics. His father went bankrupt when Boole was 16 and he became the breadwinner for the family, taking a post as schoolmaster in a small private school. When he was 19 he set up his own small school. Using the library of the local Mechanics Institute he taught himself mathematics. In the 1840s he began to publish original mathematical research in the Cambridge Mathematical Journal with the support of Duncan Gregory, a great-great-grandson of Newton’s contemporary James Gregory. Boole went on to become one of the leading British mathematicians of the nineteenth century and, despite his total lack of formal qualifications, he was appointed Professor of Mathematics at the newly founded Queen’s College, Cork, in 1849.

Although Boole is a fascinating figure in the history of mathematics, it is Boole the logician who interests us here. In 1847 Boole published the first version of his logical algebra in the form of a largish pamphlet, Mathematical Analysis of Logic. This was followed in 1854 by an expanded version of his ideas in his An Investigation of the Laws of Thought, on which are founded the Mathematical Theories of Logic and Probability. These publications contain the core of Boolean algebra (the final form of Boolean algebra was actually produced by Stanley Jevons), only the second non-standard algebra ever to be developed. The first non-standard algebra was Hamilton’s quaternions. For non-mathematical readers, standard algebra is the stuff we all learned (and loved!) at school. Boolean algebra was Boole’s greatest contribution to the histories of mathematics, logic and science.
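
For readers who want to see what makes Boole’s algebra ‘non-standard’, here is a minimal sketch in Python, my own illustration rather than anything taken from Boole’s texts. Boole’s ‘index law’ x² = x holds in school algebra only for the numbers 0 and 1; in his two-valued algebra it, and the corresponding law for ‘or’, hold for every value, as the little brute-force check below confirms.

```python
# A toy check, over the two truth values 0 and 1, of laws that hold in Boolean
# algebra but fail in the ordinary algebra taught at school (illustration only).

values = [0, 1]

def b_and(x, y):  # Boole's 'multiplication'
    return x & y

def b_or(x, y):   # inclusive 'or', the form the algebra finally took with Jevons
    return x | y

for x in values:
    assert b_and(x, x) == x   # the index law x·x = x: true for every Boolean value,
                              # whereas in school algebra only 0 and 1 satisfy it
    assert b_or(x, x) == x    # idempotence of 'or': x + x = x, nonsense in school algebra

print("Idempotence holds for both truth values – a genuinely non-standard algebra.")
```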

When it first appeared Boole’s logic was largely ignored as an irrelevance, but as the nineteenth century progressed it was taken up and developed by others, most notably by the German mathematician Ernst Schröder, and provided the tool for much early work in mathematical logic. Around 1930 it was superseded in this area by the mathematical logic of Whitehead and Russell’s Principia Mathematica. Boole’s algebraic logic seemed destined for the novelty scrap heap of history until a brilliant young American mathematician wrote his master’s thesis.

Claude Shannon (1916–2001) was a postgraduate student of electrical engineering under Vannevar Bush at MIT, working on Bush’s electro-mechanical computer, the differential analyzer. Having learnt Boolean algebra as an undergraduate, Shannon realised that it could be used for the systematic and logical design of electrical switching circuits. In 1937 he published a paper drawn from his master’s thesis, A Symbolic Analysis of Relay and Switching Circuits. Shannon’s switching algebra, which is applied Boolean algebra, would go on to supply the basis of the hardware design of all modern computers. When people began to write programs for the computers designed with Shannon’s switching algebra it was only natural that they would use Boole’s two-valued (1/0, true/false, on/off) algebra to write those programs. Almost all modern computers are, both in their hardware and their software, applied Boolean algebra. One can argue, as I have actually done somewhat tongue in cheek in a lecture, that George Boole is the ‘father’ of the modern computer. (Somewhat tongue in cheek, as I don’t actually like the term ‘father of’). The modern computer has of course many fathers and mothers.
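
To give a flavour of Shannon’s insight, here is a small sketch in Python, my own toy example and not Shannon’s notation: switches wired in series behave like Boolean ‘and’, switches wired in parallel like Boolean ‘or’, so simplifying a relay circuit becomes an exercise in Boolean algebra.

```python
from itertools import product

# Toy model of a relay circuit: True = switch closed / current flows.
def series(a, b):    # current flows only if both switches are closed – Boolean AND
    return a and b

def parallel(a, b):  # current flows if either switch is closed – Boolean OR
    return a or b

# Circuit 1: a switch x in parallel with (x in series with y).
# Circuit 2: just the switch x on its own.
# The Boolean absorption law x + x·y = x says they behave identically.
def circuit1(x, y):
    return parallel(x, series(x, y))

def circuit2(x, y):
    return x

for x, y in product([False, True], repeat=2):
    assert circuit1(x, y) == circuit2(x, y)

print("Both circuits agree for every switch setting, so a relay can be saved.")
```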

In George Boole, as opposed to Babbage and Lovelace, we have a man whose work made a massive real contribution to the history of the computer, and although both the Universities of Cork and Lincoln are planning major celebrations for his bicentennial, they have been, up till now, largely ignored by the media, with the exception of the Irish newspapers, who are happy to claim Boole, an Englishman, as one of their own.

The press seems to have decided that a ‘disadvantaged’ (she never was, as opposed to Boole) female ‘scientist’, who just happens to be Byron’s daughter, is more newsworthy in the history of the computer than a male mathematician, even if she contributed almost nothing and he contributed very much.


Filed under History of Computing, History of Mathematics, Ladies of Science, Myths of Science

Creating a holy cow.

Whenever I think that the deification of Ada Lovelace can’t get any more ridiculous somebody comes along and ups the ante. The latest idiocy was posted on Twitter by the comedian Stephen Fry (of whom I’m a big fan!). Mr Fry tweeted:

Ada Lovelace & Alan Turing for the next £20 note! Nominate here [link removed] Heroic pioneers in the face of prejudice. [my emphasis]

My comments will only concern Augusta Ada King, Countess of Lovelace, although the comment I have highlighted also has issues when applied to Alan Turing.

Heroic pioneers in the face of prejudice. Let us briefly examine the prejudice that the Countess of Lovelace, née Byron, suffered. Born into the English aristocracy, she unfortunately lost her “mad, bad and dangerous to know” father at the tender age of one month. However her mother’s family were extremely wealthy, the main reason the destitute Byron had married her, and so Ada lacked for nothing throughout her childhood. It should also be pointed out that her mother enjoyed a very high social status, despite her disastrous marriage.

She was, as a young woman, tutored and mentored by the elite of the scientific community in Victorian London, including Charles Babbage, Augustus De Morgan, Sir Charles Wheatstone and Mary Somerville, all of whom helped and encouraged her in her scientific studies. She married the wealthy Baron William King, who was soon elevated to Earl of Lovelace and who also supported her scientific endeavours without any restrictions. Somehow I fail to see to what the term prejudice could possibly be referring. Rich, pampered and supported by the very elite of London’s scientific community doesn’t sound like prejudice to me.

It was Wheatstone who suggested that she translate the Menabrea memoir on the Analytical Engine in emulation of her mentor Mary Somerville’s translation of Laplace, a far greater and much more complex work. So there is no suggestion of the pioneer here. Somerville herself was just one of several women, albeit the greatest, who wrote works popularizing the mathematical sciences in England in the first half of the nineteenth century. So Ada was in no way a pioneer but rather following the crowd.

It might be argued that her notes to the memoir qualify her as a pioneer; however, I remain firmly convinced that the notes were very much a Babbage–Lovelace co-production, with Babbage providing the content and Lovelace the turns of phrase. At best she was a scientific journalist or communicator. The pioneer was Babbage. There is strong evidence to support this interpretation, which gets swept under the carpet by the acolytes of the Cult of the Holy Saint Ada.

I shall be writing a longer post on one central aspect of the cult’s mythologizing later in the summer, so stay tuned.


Filed under History of Computing, Myths of Science

The worst history of technology headline of the year?

The Guardian website produced a couple of articles to announce the publication of Sydney Padua’s graphic novel, The Thrilling Adventures of Lovelace and Babbage: The (Mostly) True Story of the First Computer. I strongly suspect that, despite Padua’s qualifying ‘mostly’ in her subtitle, what we will be presented with here bears very little relation to the historical facts. However, since I have not actually read the book, it is not the subject of this brief post; the Guardian article is. This article is crowned with the following headline:

Ada Lovelace and Charles Babbage designed a computer in the 1840s.

A cartoonist finishes the project.

Can you spot the major howler in the very brief first sentence? Who designed a computer? Charles Babbage designed a computer. Ada Lovelace wrote a puff piece about that computer, which was in all probability largely ghost-written by Babbage. Just in case you should think that this was an inadvertent slip of a subeditor’s thumb on his computer keyboard, the claim is repeated even more emphatically in the title of an illustration to the article.

200 years after Ada Lovelace’s birth, the Analytical Engine she designed with Charles Babbage is finally built, thanks to the imagination of Sydney Padua. Illustration: The Observer

In case you should still think that the writer of the piece could or should be excused of all blame, embarrassed by the hyperbolic flights of fancy of that technology-history-ignorant subeditor, we find the following in the main body of the article.

Brought up to shun the lure of poetry and revel instead in numbers, Lovelace teamed up with mathematician Charles Babbage who had grand plans for an adding machine, named the Difference Engine, and a computer called the Analytical Engine, for which Lovelace wrote the programs.

Where to begin? First off, both the Difference Engine and the Analytical Engine are computers, the former a special-purpose computer and the latter a general-purpose one. Babbage would have been deeply offended at having his mighty Difference Engine denigrated to a mere adding machine, although all computers are by name adding machines, ‘computer’ coming, as it does, from the Latin computare, which means to reckon/compute/calculate, sum/count (up). As a brief aside, when the word computer was coined in the 17th century it referred to a person employed to do calculations. Second, and in this context most important, Lovelace did not write the programs for the Analytical Engine. The aforementioned puff piece from her pen contained one, note the singular, specimen program for the Analytical Engine, which she might possibly have written, although it seems more probable that Babbage wrote it. All the other programs for the Analytical Engine, and there were others, were written by, you’ve guessed it, Charles Babbage.

The deification of Ada Lovelace marches on apace, with the honest historian of the computer barely able to keep up with the waves of mythology that pour out of the unsavvy media almost every day, it seems.


Filed under History of Computing, Myths of Science

Mega inanity

Since the lead-up to the Turing centennial in 2012, celebrating the birth of one of the great meta-mathematicians of the twentieth century, Alan Mathison Turing, I have observed with increasing horror the escalating hagiographic accounts of Turing’s undoubted historical achievements and the resulting perversion of the histories of twentieth-century science, mathematics and technology, and in particular the history of computing.

This abhorrence on my part is not based on a mere nodding acquaintance with Turing’s name but on a deep and long-time engagement with the man and his work. I served my apprenticeship as a historian of science over many years in a research project on the history of formal or mathematical logic. Formal logic is one of the so-called formal sciences, the others being mathematics and informatics (or computer science). I have spent my whole life studying the history of mathematics with a special interest in the history of computing, both in its abstract form and in its technological realisation in all sorts of calculating aids and machines. I also devoted a substantial part of my formal study of philosophy to the study of the philosophy of mathematics and the logical, meta-logical and meta-mathematical problems that this discipline, some would say unfortunately, generates. The histories of all of these intellectual streams flow together in the first half of the twentieth century in the work of such people as Leopold Löwenheim, Thoralf Skolem, Emil Post, Alfred Tarski, Kurt Gödel, Alonzo Church and Alan Turing amongst others. These people created a new discipline known as meta-mathematics whilst carrying out a programme delineated by David Hilbert.

Attempts to provide a solid foundation for mathematics using set theory and logic had run into serious problems with paradoxes. Hilbert thought the solution lay in developing each mathematical discipline as a strict axiomatic system and then proving that each axiomatic system possessed a set of required characteristics, thus ensuring the solidity and reliability of a given system. This concept of proving theorems about entire axiomatic systems is the ‘meta’ of meta-mathematics. The properties that Hilbert required for his axiomatic systems were consistency, which means the system should be shown to be free of contradictions; completeness, meaning that all of the theorems that belong to a particular discipline are deducible from its axiom system; and finally decidability, meaning that for any well-formed statement within the system it should be possible to produce an algorithmic process to decide if the statement is true within the axiomatic system or not. An algorithm is like a cookery recipe: if you follow the steps correctly you will produce the right result.
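
To make ‘decidability’ a little more concrete, here is a minimal sketch in Python of a genuine decision procedure, my own illustration and nothing to do with Hilbert’s texts: propositional logic, unlike the richer systems Hilbert had in mind, is decidable, simply by checking a formula under every possible assignment of truth values.

```python
from itertools import product

def implies(a, b):
    return (not a) or b

def is_tautology(formula, number_of_variables):
    # Brute-force decision procedure: try every assignment of truth values.
    # It always terminates with a yes/no answer – that is what 'decidable' means.
    return all(formula(*values)
               for values in product([False, True], repeat=number_of_variables))

# Example: Peirce's law ((p -> q) -> p) -> p
peirce = lambda p, q: implies(implies(implies(p, q), p), p)
print(is_tautology(peirce, 2))                       # True
print(is_tautology(lambda p, q: implies(p, q), 2))   # False – not a tautology
```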

The meta-mathematicians listed above showed by very ingenious methods that none of Hilbert’s aims could be fulfilled, bringing the dream of a secure foundation for mathematics crashing to the ground. Turing’s solution to the problem of decidability is an ingenious thought experiment, for which he is justifiably regarded as one of the meta-mathematical gods of the twentieth century. It was this work that led to him being employed as a code breaker at Bletchley Park during WW II and eventually to the fame and disaster of the rest of his too-short life.

Unfortunately the attempts to restore Turing’s reputation since the centenary of his birth in 2012 have led to some terrible misrepresentations of his work and its consequences. I thought we had reached a low point in the ebb and flow of the centenary celebrations but the release of “The Imitation Game”, the Alan Turing biopic, has produced a new series of false and inaccurate statements in the reviews. I was pleasantly surprised to see several reviews which attempt to correct some of the worst historical errors in the film. You can read a collection of reviews of the film in the most recent edition of the weekly histories of science, technology and medicine links list Whewell’s Gazette. Not having seen the film yet I can’t comment, but I was stunned when I read the following paragraph from the ABC News review of the film written by Alyssa Newcomb. It’s so bad you can only file it under: you can’t make this shit up.

The “Turing Machine” was the first modern computer to logically process information, running on interchangeable software and essentially laying the groundwork for every computing device we have today — from laptops to smartphones.

Before I analyse this train wreck of a historical statement I would just like to emphasise that this is not the Little Piddlington School Gazette, whose enthusiastic but slightly slapdash twelve-year-old film critic got his facts a little mixed up, but a review that appeared on the website of a major American media company, and as such it is totally unacceptable however you view it.

The first compound statement contains a double whammy of mega-inane falsehood and I had real problems deciding where to begin, finally plumping for the “first modern computer to logically process information, running on interchangeable software”. Alan Turing had nothing to do with the first such machine, the honour going to Konrad Zuse’s Z3, which Zuse completed in 1941. The first such machine in whose design and construction Alan Turing was involved was the ACE, produced at the National Physical Laboratory in London in 1949. In the intervening years Atanasoff and Berry, Tommy Flowers, Howard Aiken, as well as Eckert and Mauchly, had all designed and constructed computers of various types and abilities. To credit Turing with the sole responsibility for our digital computer age is not only historically inaccurate but also highly insulting to all the others who made substantial and important contributions to the evolution of the computer. Many, many more than I’ve named here.

We now turn to the second error contained in this wonderfully inane opening statement and return to the subject of meta-mathematics. The “Turing Machine” is not a computer at all; it’s Alan Turing’s truly ingenious thought-experiment solution to Hilbert’s decidability problem. Turing imagined a very simple machine that consists of a scanning-reading head and an infinite tape that runs under the scanning head. The head can read instructions on the tape and execute them, moving the tape right or left or doing nothing. The question then reduces to which sets of instructions on the tape eventually come to a stop (decidable) and which lead to an infinite loop (undecidable). Turing developed this idea into a machine capable of computing any computable function (a universal Turing Machine) and thus created a theoretical model for all computers. This is of course a long way from a practical, real mechanical realisation, i.e. a computer, but it does provide a theoretical measure with which to describe the capabilities of a mechanical computing device. A computer that is the equivalent of a Universal Turing Machine is called Turing complete. For example, Zuse’s Z3 was Turing complete whereas Colossus, the computer designed and constructed by Tommy Flowers for decoding work at Bletchley Park, was not.
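
For the curious, the whole idea fits in a few lines of Python. The sketch below is my own toy illustration, not anything taken from Turing’s paper: a table of instructions, a read/write head and a tape that moves left or right, with an example rule table invented for the purpose that appends a 1 to a block of 1s and then halts.

```python
# A toy Turing machine: (state, symbol) -> (symbol to write, move, next state).
# The example rule table is invented purely for illustration: it appends one
# more 1 to a block of 1s (unary increment) and then halts.

def run_turing_machine(rules, tape, state="start", head=0, max_steps=1000):
    cells = dict(enumerate(tape))          # sparse tape; blank cells read as '_'
    for _ in range(max_steps):
        if state == "halt":                # this machine comes to a stop...
            return "".join(cells[i] for i in sorted(cells))
        symbol = cells.get(head, "_")
        write, move, state = rules[(state, symbol)]
        cells[head] = write
        head += {"R": 1, "L": -1, "N": 0}[move]
    return None                            # ...others never do, and that, in general, is undecidable

increment_rules = {
    ("start", "1"): ("1", "R", "start"),   # run right over the existing 1s
    ("start", "_"): ("1", "N", "halt"),    # write one more 1 and stop
}

print(run_turing_machine(increment_rules, "111"))  # -> '1111'
```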

Turing’s work played and continues to play an important role in the theory of computation but historically had very little effect on the development of real computers. Attributing the digital computer age to Turing and his work is not just historically wrong but is, as I already stated above, highly insulting to all of those who really did bring about that age. Turing is a fascinating, brilliant and, because of the persecution he suffered as a homosexual, tragic figure in the histories of mathematics, logic and computing in the twentieth century, but attributing achievements to him that he didn’t make does not honour his memory, which certainly should be honoured, but ridicules it.

I should, in fairness to the author of the film review that I took as motivation for this post, say that she seems to be channelling misinformation from the film distributors, as I’ve read very similar stupid claims in other previews and reviews of the film.


Filed under History of Computing, History of Logic, History of Mathematics, Myths of Science

Oh please!

The latest move in the canonisation of Alan Turing is an opera, or whatever, written by the Pet Shop Boys, which is being heavily promoted by a PR campaign launched yesterday. As part of this press onslaught this magazine cover appeared on my Twitter stream today.


For the record, as a fan and one-time student of meta-mathematics I was aware of, and to some extent in awe of, Alan Turing long before most of the people now trying to elevate him into Olympus even knew he existed. He was without a shadow of a doubt one of the most brilliant logicians of the twentieth century, and he and others of his ilk, such as Leopold Löwenheim, Thoralf Skolem, Emil Post, Kurt Gödel, Alonzo Church etc., who laid the theoretical foundations for much of the computer age, all deserve to be much better known than they are. However, the attempts to adulate Turing’s memory have become grotesque. The Gay Man Who Saved the World is hyperbolic, hagiographic bullshit!

Turing made significant contributions to the work of Bletchley Park in breaking various German codes during the Second World War. He was one of nine thousand people who worked there. He did not work in isolation; he led a team that cracked one version of the Enigma Code. To what extent the work of Bletchley Park contributed to the eventual Allied victory is probably almost impossible to assess or quantify.

Alan Turing made significant contributions to the theories of meta-mathematics and an equally significant contribution to the British war effort. He did not, as is frequently claimed by his claque, invent the computer and he most certainly did not “save the world”. Can we please return to sanity in our assessment of our scientific heroes?


Filed under History of Computing, History of Mathematics, Myths of Science

Sliding to mathematical fame.

William Oughtred, born on 5th March 1575, whom Newton regarded, along with Christopher Wren and John Wallis, as one of the three best seventeenth-century English mathematicians, was the epitome of the so-called English School of Mathematics. The English School of Mathematics is a loose historical grouping of English mathematicians stretching over several generations in the sixteenth and seventeenth centuries who propagated and supported the spread of mathematics, mostly in the vernacular, through teaching and writing at a time when the established educational institutions, schools and universities, offered little in the way of mathematical tuition. These men taught each other, learnt from each other, corresponded with each other, advertised each other in their works, borrowed from each other and occasionally stole from each other, building an English-language mathematical community that stretched from Robert Recorde (c. 1512 – 1558), who is regarded as its founder, to Isaac Newton at the close of the seventeenth century, who can be regarded as a quasi-member. Oughtred, who died in 1660, spanned the middle of this period and can be considered to be one of its most influential members.

Oughtred was born at Eton College, where his father Benjamin was a writing master and registrar, and was baptised there on 5th March 1575, which is reputedly also his birthdate. He was educated at Eton College and at King’s College, Cambridge, where he graduated BA in 1596 and MA in 1600. It was at Cambridge that he says he first developed his interest in mathematics, having been taught arithmetic by his father. Whilst still at Cambridge he also started what was to become his vocation, teaching others mathematics. He was ordained priest in 1603 and appointed vicar of Shalford in Surrey. In 1610 he was appointed rector of nearby Albury, where he remained for the rest of his life. In 1606 he married Christgift Caryll, who bore him twelve or possibly thirteen children; accounts differ. All in all Oughtred lived the life of a simple country parson and would have remained unknown to history had it not been for his love of mathematics.

William Oughtred by Wenceslas Hollar 1646


Oughtred’s first claim to fame as a mathematician was as a pedagogue. He worked as a private tutor and also wrote and published one of the most influential algebra textbooks of the century, his Clavis Mathematicae, first published in Latin in 1631. This was a very compact introduction to symbolic algebra and was one of the first such books to be written almost exclusively in symbols, several of which Oughtred was the first to use and which are still in use today. Further Latin editions appeared in 1648, 1652, 1667 and 1698, with an English translation appearing in 1647 under the title The Key to Mathematics.

The later editions were produced by a group of Oxford mathematicians that included Christopher Wren, Seth Ward and John Wallis. Seth Ward lived and studied with Oughtred for six months and Wallis, Wren and Jonas Moore all regarded themselves as disciples, although whether they studied directly with Oughtred is not known. Wallis probably didn’t but claimed to have taught himself maths using the Clavis.

Title page Clavis Mathematicae 5th ed 1698  Ed John Wallis


The Latin editions of the Clavis were read throughout Europe and Oughtred enjoyed a very widespread and very high reputation as a mathematician.

Although he always preached the importance of theory before application, Oughtred also enjoyed a very high reputation as an inventor of mathematical instruments, and it is for his invention of the slide rule that he is best remembered today. The international society for slide rule collectors is known as the Oughtred Society. I realise that in this age of the computer, the tablet, the smart phone and the pocket calculator there is a strong chance that somebody reading this won’t have the faintest idea what a slide rule is. I’m not going to explain, although I will outline the historical route to the invention of the slide rule, but will refer those interested to this website.

The Scottish mathematician John Napier and the Swiss clock and instrument maker Jobst Bürgi both invented logarithms independently of each other at the beginning of the seventeenth century, although Napier published first in 1614. The basic idea had been floating around for some time and could be found in the work of the Frenchman Nicolas Chuquet in the fifteenth century and the German Michael Stifel in the sixteenth. In other words it was an invention waiting to happen. Napier’s logarithms were base ‘e’, now called natural logarithms (that’s the ln key on your pocket calculator), and the English mathematician Henry Briggs (1561 – 1630), Gresham Professor of Geometry, thought it would be cool to have logarithms base 10 (that’s the log key on your pocket calculator), which he published in 1620. Edmund Gunter (1581 – 1626), Gresham Professor of Astronomy, who was very interested in cartography and navigation, produced a logarithmic scale on a ruler, known, not surprisingly, as the Gunter Scale or Rule, which could be read off using a pair of dividers to enable navigators to make rapid calculations on sea charts.

Briggs introduced his good friend Oughtred to Gunter (remember that bit above about teaching and learning from each other), and it was Oughtred who came up with the idea of placing two Gunter Scales next to each other to facilitate calculation by sliding the one scale up and down against the other, and thus the slide rule was born. Oughtred first published his invention in a pamphlet entitled The Circles of Proportion and the Horizontal Instrument in 1631, which actually describes an improved circular slide rule with the scales now on circular discs rotating about a central pin. This publication led to a very nasty dispute with Richard Delamain, a former pupil of Oughtred’s, who claimed that he, and not his former teacher, had invented the slide rule. This led to one of those splendid pamphlet priority wars with both antagonists pouring invective over each other by the bucket load. Oughtred won the day both in his own time and in the opinion of the historians and is universally acknowledged as the inventor of the slide rule, which became the trusty companion of all applied mathematicians, engineers and physicists down the centuries. Even when I was at secondary school in the 1960s you would never see a physicist without his trusty slide rule.
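
For anyone who has never handled one, the principle behind the Gunter scale and Oughtred’s slide rule is simply that log(a·b) = log a + log b, so multiplying two numbers reduces to adding two lengths on logarithmic scales. A minimal sketch in Python, my own illustration:

```python
import math

def slide_rule_multiply(a, b):
    # The 'length' of a number along a base-10 logarithmic scale.
    length_a = math.log10(a)
    length_b = math.log10(b)
    # Sliding one scale against the other adds the two lengths;
    # reading off the answer converts the combined length back into a number.
    return 10 ** (length_a + length_b)

print(slide_rule_multiply(2, 8))      # ~16.0
print(slide_rule_multiply(3.5, 1.2))  # ~4.2, to the couple of digits a real rule gave
```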

It still seems strange to me that more than a whole generation has grown up with no idea what a slide rule is or what it could be used for and that Oughtred’s main claim to fame is slowly but surely sliding into the abyss of forgetfulness.


Filed under History of Computing, History of Mathematics, Renaissance Science

5 Brilliant Mathematicians – 4 Crappy Commentaries

I still tend to call myself a historian of mathematics although my historical interests have long since expanded to include a much wider field of science and technology; in fact I have recently been considering just calling myself a historian to avoid being pushed into a ghetto by those who don’t take the history of science seriously. Whatever, I have never lost my initial love for the history of mathematics and will automatically follow any link offering some of the same. So it was that I arrived on the Mother Nature Network and a blog post titled 5 brilliant mathematicians and their impact on the modern world. The author, Shea Gunther, had actually chosen 5 brilliant mathematicians in Isaac Newton, Carl Gauss, John von Neumann, Alan Turing and Benoit Mandelbrot, and had even managed to avoid the temptation of calling them ‘the greatest’ or something similar. However a closer examination of his commentaries on his chosen subjects reveals some pretty dodgy, not to say downright crappy, claims, which I shall now correct in my usual restrained style.

He starts off fairly well on Newton with the following:

There aren’t many subjects that Newton didn’t have a huge impact in — he was one of the inventors of calculus, built the first reflecting telescope and helped establish the field of classical mechanics with his seminal work, “Philosophiæ Naturalis Principia Mathematica.” He was the first to decompose white light into its constituent colors and gave us, the three laws of motion, now known as Newton’s laws.

But then blows it completely with his closing paragraph:

We would live in a very different world had Sir Isaac Newton not been born. Other scientists would probably have worked out most of his ideas eventually, but there is no telling how long it would have taken and how far behind we might have fallen from our current technological trajectory.

This is the type of hagiographical claim made by fans of great scientists who have no real idea of the context in which their hero worked. Let’s examine step by step each of the achievements of Newton listed here and see if the claim made in this final paragraph actually holds up.

Ignoring the problems inherent in the claim that Newton invented calculus, which I’ve discussed here, the author acknowledges that Newton was only co-inventor together with Leibniz, and although Newton almost certainly developed his system first, it was Leibniz who published first and it was his system that spread throughout Europe and eventually the world; so no changes here if Isaac had not been born.

Newton did indeed construct the first functioning reflecting telescope but, as I explained here, his was by no means the first design. It would also be fifty years before John Hadley succeeded in repeating Newton’s feat and finally making the commercial production of reflecting telescopes viable. However Hadley also succeeded in making working models of James Gregory’s reflecting telescope, which actually predated Newton’s, and it was the Gregorian that, principally in the hands of James Short, became the dominant model in the eighteenth century. Although to be fair one should mention that William Herschel made his discoveries with Newtonians. Once again our author’s claim fails to hold water.

Sticking with optics for the moment, it is a little known and even less acknowledged fact that the Bohemian physicus and mathematician Jan Marek Marci (1595 – 1667) actually decomposed white light into its constituent colours before Newton. Remaining for a time with optics, James Gregory, Francesco Maria Grimaldi, Christiaan Huygens and Robert Hooke were all on a level with Newton, although none of them wrote such an influential book on the subject as Newton’s Opticks. Now this was not all positive. Due to the influence won through the Principia, the Opticks became all-dominant, preventing the introduction of the wave theory of light developed by Huygens and Hooke and even slowing down its acceptance in the nineteenth century when proposed by Fresnel and Young. If Newton hadn’t been born optics might even have developed and advanced more quickly than it did.

This just leaves the field of classical mechanics, Newton’s real scientific monument. Now, as I’ve pointed out several times before, the three laws of motion were all borrowed by Newton from others and the inverse square law of gravity was general public property in the second half of the seventeenth century. Newton’s true genius lay in his mathematical combination of the various elements to create a whole. Now the question is how quickly this synthesis might have come about had Newton never lived. Both Huygens and Leibniz had made substantial contributions to mechanics contemporaneously with Newton, and the succeeding generation of French and Swiss-German mathematicians created a synthesis of Newton’s, Leibniz’s and Huygens’ work, and it is this synthesis that we know as the field of classical mechanics. Without Newton’s undoubtedly massive contribution this synthesis might have taken a little longer to come into being but I don’t think the delay would have radically changed the world in which we live.

Like those of almost all great scientists, Newton’s discoveries were of their time and he was only a fraction ahead of, and sometimes even behind, his rivals. His non-existence would probably not have had that much impact on the development of history.

Moving on to Gauss we will have other problems. Our author again makes a good start:

Isaac Newton is a hard act to follow, but if anyone can pull it off, it’s Carl Gauss. If Newton is considered the greatest scientist of all time, Gauss could easily be called the greatest mathematician ever.

Very hyperbolic and hagiographic, but if anybody could be called the greatest mathematician ever then Gauss would be a serious candidate. However in the next paragraph we go off the rails. The paragraph starts OK:

Carl Friedrich Gauss was born to a poor family in Germany in 1777 and quickly showed himself to be a brilliant mathematician. He published “Arithmetical Investigations,” a foundational textbook that laid out the tenets of number theory (the study of whole numbers).

So far so good but then our author demonstrates his lack of knowledge of the subject on a grand scale:

Without number theory, you could kiss computers goodbye. Computers operate, on a the most basic level, using just two digits — 1 and 0

Here we have gone over to the binary number system, with which Gauss’ book on number theory has nothing to do whatsoever. In modern European mathematics the binary number system was first investigated in depth by Gottfried Leibniz in 1679, more than one hundred years before Gauss wrote his Disquisitiones Arithmeticae, which, as already stated, has nothing on the subject. The use of the binary number system in computing is an application of the two-valued symbolic logic of George Boole, the 1 and 0 standing for true and false in programming and on and off in circuit design. All of which has nothing to do with Gauss. Gauss made so many epochal contributions to mathematics, physics, cartography, surveying and god knows what else, so why credit him with something he didn’t do?

Moving on to John von Neumann we again have a case of credit being given where credit is not due, but to be fair to our author, this time he is probably not to blame for this misattribution. Our author ends his von Neumann description as follows:

Before his death in 1957, von Neumann made important discoveries in set theory, geometry, quantum mechanics, game theory, statistics, computer science and was a vital member of the Manhattan Project.

This paragraph is fine and if Shea Gunther had chosen to feature von Neumann’s invention of game theory or three-valued quantum logic I would have said fine, praised the writer for his knowledge and moved on without comment. However instead our author dishes up one of the biggest myths in the history of the computer.

he went on to design the architecture underlying nearly every single computer built on the planet today. Right now, whatever device or computer that you are reading this on, be it phone or computer, is cycling through a series of basic steps billions of times over each second; steps that allow it to do things like render Internet articles and play videos and music, steps that were first thought up by John von Neumann.

Now any standard computer is called a von Neumann machine in terms of its architecture because of a paper that von Neumann published in 1945, First Draft of a Report on the EDVAC. This paper described the architecture of the EDVAC, one of the earliest stored-program computers, but von Neumann was not responsible for the design; the team led by Eckert and Mauchly was. Von Neumann had merely described and analysed the architecture. His publication caused massive problems for the design team because, the information now being in the public realm, they were no longer able to patent their innovations. Also, von Neumann’s name as author on the report meant that people, including our author, falsely believed that he had designed the EDVAC. Of historical interest is the fact that Charles Babbage’s Analytical Engine in the nineteenth century already possessed von Neumann architecture!
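
For readers unsure what ‘von Neumann architecture’ actually amounts to in practice, program and data share a single memory and a processor endlessly fetches, decodes and executes instructions from it. Here is a deliberately tiny sketch in Python; the three-instruction machine is invented purely for illustration and has nothing to do with the real EDVAC.

```python
# A toy stored-program machine: instructions and data live in the same memory
# and the processor loops through fetch – decode – execute.

def run(memory):
    acc, pc = 0, 0                     # accumulator and program counter
    while True:
        op, arg = memory[pc]           # fetch the instruction the program counter points at
        pc += 1
        if op == "LOAD":               # decode and execute
            acc = memory[arg]
        elif op == "ADD":
            acc += memory[arg]
        elif op == "HALT":
            return acc

# One memory holds both the program (cells 0–2) and the data (cells 3–4).
memory = {
    0: ("LOAD", 3),
    1: ("ADD", 4),
    2: ("HALT", None),
    3: 2,
    4: 40,
}
print(run(memory))  # -> 42
```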

Unsurprisingly we walk straight into another couple of history of the computer myths when we turn to Alan Turing. We start with the Enigma story:

During World War II, Turing bent his brain to the problem of breaking Nazi crypto-code and was the one to finally unravel messages protected by the infamous Enigma machine.

There were various versions of the Enigma machine and various codes used by different branches of the German armed forces. The Polish Cipher Bureau were the first to break an Enigma code, in 1932. Various other forms of the Enigma codes were broken by various teams at Bletchley Park without Turing. Turing was responsible for cracking the German Naval Enigma. The statement above denies credit to the Polish Cipher Bureau and the other 9,000 workers at Bletchley Park for their contributions to decoding Enigma.

Besides helping to stop Nazi Germany from achieving world domination, Alan Turing was instrumental in the development of the modern day computer. His design for a so-called “Turing machine” remains central to how computers operate today.

I’ve lost count of how many times I’ve seen variations on the claim in the above paragraph in the last eighteen months or so, all equally incorrect. What such comments demonstrate is that their authors actually have no idea what a Turing machine is or how it relates to computer design.

In 1936 Alan Turing, a mathematician, published a paper entitled On Computable Numbers, with an Application to the Entscheidungsproblem. This was in fact one of four contemporaneous solutions offered to a problem in meta-mathematics first broached by David Hilbert, the Entscheidungsproblem. The other solutions, which needn’t concern us here, apart from the fact that Post’s solution is strongly similar to Turing’s, were from Kurt Gödel, Alonzo Church and Emil Post. Entscheidung is the German for decision, and the Entscheidungsproblem asks whether, for a given axiomatic system, it is possible with the help of an algorithm to decide if a given statement in that axiom system is true or false. The straightforward answer that all four men arrived at by different strategies is that it isn’t. There will always be undecidable statements within any sufficiently complex axiomatic system.

Turing’s solution to the Entscheidungsproblem is simple, elegant and ingenious. He hypothesised a very simple machine that was capable of reading a potentially infinite tape and following instructions encoded on that tape, instructions that moved the tape either right or left or simply stopped the whole process. Through this analogy Turing was able to show that within an axiomatic system some problems would never be entscheidbar, or in English, decidable. What Turing’s work does is, on a very abstract level, to delineate the maximum computability of any automated calculating system. Only much later, in the 1950s, after the invention of electronic computers, a process in which Turing also played a role, did it occur to people to describe the computational abilities of real computers with the expression ‘Turing machine’. A Turing machine is not a design for a computer; it is a term used to describe the capabilities of a computer.
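
The flavour of the undecidability result itself can also be sketched in code. What follows is my own illustration of the classic diagonal argument, in the spirit of Turing’s 1936 paper rather than a quotation of it: assume a universal ‘does this program halt?’ decider existed and you can build a program that defeats it.

```python
# The decider below is HYPOTHETICAL – the whole point of the argument is that
# no such function can be written. Running this file only defines the functions.

def halts(program, argument):
    """Pretend universal decider: True if program(argument) would halt."""
    raise NotImplementedError("No such decider can exist – that is the theorem.")

def troublemaker(program):
    # Do the opposite of whatever the decider predicts about a program fed to itself.
    if halts(program, program):
        while True:          # predicted to halt? then loop forever
            pass
    return "halted"          # predicted to loop forever? then halt immediately

# Now ask what halts(troublemaker, troublemaker) could possibly return:
# True  -> troublemaker(troublemaker) loops forever, so the answer was wrong;
# False -> troublemaker(troublemaker) halts at once, so the answer was wrong.
# Either way the assumed decider is contradicted, so it cannot exist.
```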

To be quite open and honest I don’t know enough about Benoit Mandelbrot and fractals to be able to say whether our author at least got that one right, so I’m going to cut him some slack and assume that he did. If he didn’t, I hope somebody who knows more about the subject than I do will provide the necessary corrections in the comments.

All of the errors listed above could have been easily avoided if the author of the article had cared in any way about historical accuracy and truth. However, as is all too often the case in the history of science, or in this case mathematics, people are prepared to dish up a collection of half-baked myths, misconceptions and, not to put too fine a point on it, crap, and think they are performing some sort of public service in doing so. Sometimes I despair.



Filed under History of Computing, History of Logic, History of Mathematics, History of Optics, History of Physics, History of science, Myths of Science, Newton