The end of this year sees a double English bicentennial in the history of computing. On 2 November we celebrate the two hundredth anniversary of the birth of the mathematician and logician George Boole, then on 10 December the two hundredth anniversary of the birth of ‘science writer’ Augusta Ada King, Countess of Lovelace. It is an interesting exercise to take a brief look at how these two bicentennials are being perceived in the public sphere.

As I have pointed out in several earlier posts, Ada was a member of the minor aristocracy who, although she never knew her father, had a wealthy, well-connected mother. She had access to the highest social and intellectual circles of early Victorian London. Despite being mentored and tutored by the best that London had to offer, she failed totally to master more than elementary mathematics. So, as I have also pointed out more than once, to call her a mathematician is a very poor quality joke. Her only ‘scientific’ contribution was to translate a memoir on Babbage’s Analytical Engine from French into English, to which a series of new notes is appended. There is very substantial internal and external evidence that these notes in fact stem from Babbage and not from Ada, and that she only gave them linguistic form. What we have here is basically a journalistic interview and not a piece of original work. It is a historical fact that she did not write the first computer programme, as is still repeated ad nauseam every time her name is mentioned.

However, the acolytes of the Cult of the Holy Saint Ada are banging the advertising drum for her bicentennial on a level comparable to that accorded to Einstein for the centenary of the General Theory of Relativity. On social media, ‘Finding Ada’ has indicated that massive celebrations are planned, although their exact nature has yet to be revealed. More worrying is the publication of the graphic novel *The Thrilling Adventures of Lovelace and Babbage: The (Mostly) True Story of the First Computer* (note who gets first billing!) by the animator and cartoonist Sydney Padua. The Analytical Engine was of course not the first computer; that honour goes to Babbage’s Difference Engine. More importantly, Padua’s novel is not even remotely ‘mostly’ true but largely fictional. This wouldn’t matter that much if said book had not received major media attention, attention that compounded the error by conveniently forgetting the ‘(Mostly)’. The biggest lie in this work of fiction is the claim that Ada was somehow directly involved in the conception and construction of the Analytical Engine. In reality she had absolutely nothing to do with either its conception or its construction.

This deliberate misconception has been compounded by an attempt, widely disseminated on social media, to gain support for a Lovelace and Babbage Analytical Engine Lego set. The promoter of this enterprise has written in his blurb:

*Ada Lovelace (1815-1852) is widely credited as the first computer scientist and Charles Babbage (1791-1871) is best remembered for originating the concept of a programmable computer. Together they collaborated on Babbage’s early mechanical general-purpose computer, the Analytical Engine.*

Widely credited by whom? If anybody is the first computer scientist in this setup, then it’s Babbage. Others, such as Leibniz, speculated on what we now call computer science long before Ada was born, so I think that is another piece of hype that we can commit to the trashcan. Much more important is the fact that they did not collaborate on the Analytical Engine; that was solely Babbage’s baby. This factually false hype is compounded in the following tweet from 21 July, which linked to the Lego promotion:

*Historical lego* [sic]* of Ada Lovelace’s conception of the first programmable computer*

To give some perspective to the whole issue, it is instructive to ask about what in German is called the ‘Wirkungsgeschichte’, best translated as historical impact, of Babbage’s efforts to promote and build his computers, including the by now notorious Menabrea memoir, irrespective of who actually formulated the added notes. The impact of all of Babbage’s computer endeavours on the history of the computer is almost nothing. I say almost because, due to Turing, the notes did play a minor role in the early phases of the post-World War II artificial intelligence debate. However, one could get the impression from the efforts of the Ada Lovelace fan club, strongly supported by the media, that this was a highly significant contribution to the history of computing that deserves to be massively celebrated on the Lovelace bicentennial.

Let us now turn our attention to the subject of our other bicentennial celebration, George Boole. Born into a working-class family in Lincoln, Boole had little formal education. However, his father was a self-educated man with a thirst for knowledge, who instilled the same characteristics in his son. With some assistance Boole taught himself Latin and Greek, and later French, German and Italian in order to be able to read advanced continental mathematics. When Boole was sixteen his father went bankrupt and he became the breadwinner for the family, taking a post as schoolmaster in a small private school. When he was nineteen he set up his own small school. Using the library of the local Mechanics’ Institute he taught himself mathematics. In the 1840s he began to publish original mathematical research in the Cambridge Mathematical Journal with the support of Duncan Gregory, a great-great-grandson of Newton’s contemporary James Gregory. Boole went on to become one of the leading British mathematicians of the nineteenth century and, despite his total lack of formal qualifications, he was appointed Professor of Mathematics at the newly founded Queen’s College, Cork in 1849.

Although Boole is a fascinating figure in the history of mathematics, it is Boole the logician who interests us here. In 1847 Boole published the first version of his logical algebra in the form of a largish pamphlet, *The Mathematical Analysis of Logic*. This was followed in 1854 by an expanded version of his ideas, *An Investigation of the Laws of Thought, on which are founded the Mathematical Theories of Logic and Probabilities*. These publications contain the core of Boolean algebra (the final form of Boolean algebra was actually produced by Stanley Jevons), only the second non-standard algebra ever to be developed; the first non-standard algebra was Hamilton’s quaternions. For non-mathematical readers: standard algebra is the stuff we all learned (and loved!) at school. Boolean algebra was Boole’s greatest contribution to the histories of mathematics, logic and science.
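For readers curious why Boolean algebra counts as ‘non-standard’, a brief sketch may help (my own illustration, not from Boole’s texts): over the two values 0 and 1, with OR playing the role of addition and AND the role of multiplication, some familiar school-algebra rules fail and some strange new ones hold.

```python
# Boolean algebra over {0, 1}: OR as 'addition', AND as 'multiplication'.
# (Illustrative names are my own; this is a sketch, not Boole's notation.)

OR = lambda x, y: x | y    # Boolean 'addition'
AND = lambda x, y: x & y   # Boolean 'multiplication'

for x in (0, 1):
    # Idempotence: x + x = x, where school algebra gives 2x.
    assert OR(x, x) == x
    assert AND(x, x) == x
    for y in (0, 1):
        for z in (0, 1):
            # Each operation distributes over the other -- ordinary
            # addition never distributes over multiplication.
            assert AND(x, OR(y, z)) == OR(AND(x, y), AND(x, z))
            assert OR(x, AND(y, z)) == AND(OR(x, y), OR(x, z))

print("Boolean laws hold over {0, 1}")
```

The second distributive law is the tell-tale sign that we have left school algebra behind: in ordinary arithmetic x + yz is not (x + y)(x + z).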

When it first appeared, Boole’s logic was largely ignored as an irrelevance, but as the nineteenth century progressed it was taken up and developed by others, most notably by the German mathematician Ernst Schröder, and provided the tool for much early work in mathematical logic. Around 1930 it was superseded in this area by the mathematical logic of Whitehead and Russell’s *Principia Mathematica*. Boole’s algebraic logic seemed destined for the novelty scrap heap of history until a brilliant young American mathematician wrote his master’s thesis.

Claude Shannon (1916–2001) was a postgraduate student of electrical engineering under Vannevar Bush at MIT, working on Bush’s electro-mechanical computer, the differential analyzer. Having learnt Boolean algebra as an undergraduate, Shannon realised that it could be used for the systematic and logical design of electrical switching circuits. In 1937 he published a paper drawn from his master’s thesis, *A Symbolic Analysis of Relay and Switching Circuits*. Shannon’s switching algebra, which is applied Boolean algebra, would go on to supply the basis of the hardware design of all modern computers. When people began to write programs for the computers designed with Shannon’s switching algebra, it was only natural that they would use Boole’s two-valued (1/0, true/false, on/off) algebra to write those programs. Almost all modern computers are, both in their hardware and their software, applied Boolean algebra. One can argue, as I have actually done somewhat tongue in cheek in a lecture, that George Boole is the ‘father’ of the modern computer. (Somewhat tongue in cheek, as I don’t actually like the term ‘father of’.) The modern computer has of course many fathers and mothers.
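Shannon’s insight can be made concrete with a small sketch (my own illustration, not drawn from his paper): a half-adder, the elementary circuit that adds two one-bit values, is nothing more than a pair of Boolean expressions.

```python
# A half-adder expressed purely in Boole's two-valued algebra:
# the 'sum' bit is an exclusive-or, the 'carry' bit an AND.
# (A sketch of the general idea, not Shannon's own notation.)

def half_adder(a: bool, b: bool) -> tuple[bool, bool]:
    """Add two one-bit values; return (sum_bit, carry_bit)."""
    sum_bit = (a and not b) or (not a and b)  # exclusive or
    carry_bit = a and b                       # carry only when both are on
    return sum_bit, carry_bit

# The full truth table of the circuit:
for a in (False, True):
    for b in (False, True):
        s, c = half_adder(a, b)
        print(int(a), int(b), "->", int(s), int(c))
```

In relay terms, each `and`/`or`/`not` corresponds to switches in series, in parallel, or inverted; chaining such circuits together is, in essence, how arithmetic units in modern hardware are built.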

In George Boole, as opposed to Babbage and Lovelace, we have a man whose work made a massive real contribution to the history of the computer, and yet, although both the Universities of Cork and Lincoln are planning major celebrations for his bicentennial, they have up till now been largely ignored by the media, with the exception of the Irish newspapers, who are happy to claim Boole, an Englishman, as one of their own.

The press seems to have decided that a ‘disadvantaged’ (which she never was, as opposed to Boole) female ‘scientist’, who just happens to be Byron’s daughter, is more newsworthy in the history of the computer than a male mathematician, even if she contributed almost nothing and he contributed very much.