Category Archives: History of Computing

Christmas Trilogy 2015 Part 3: Roll out the barrel.

The village master taught his little school

The village all declared how much he knew,

‘Twas certain he could write, and cipher too;

Lands he could measure, terms and tides presage,

And e’en the story ran that he could gauge

Oliver Goldsmith – The Deserted Village

As I have commented on a number of occasions in the past, although most people, if they have heard of him at all, only know Johannes Kepler as the creator of his eponymous three laws of planetary motion, he in fact published more than eighty books and pamphlets in his life covering a very wide range of scientific and mathematical subjects. One of those publications, which often brings a smile to the faces of those not aware of its mathematical significance, is his Nova stereometria doliorum vinariorum (which translates as The New Art of Measuring the Contents of Wine Barrels), published in 1615. A whole book devoted to determining the volume of wine barrels! Surely not a suitable subject for a man who determined the laws of the cosmos and helped lay the foundations of modern optics; had the good Johannes taken to drink in the face of his personal problems?

Title page of Kepler’s 1615 Nova stereometria doliorum vinariorum (image used by permission of the Carnegie Mellon University Libraries)

Because he is now regarded as one of the earliest ‘modern’ mathematicians, people tend to forget that Kepler lived not in the age of the mathematician but in that of the mathematical practitioner. This means that as district mathematician in Graz, and later in Linz, Kepler would have been expected to carry out a large range of practical mathematical tasks including surveying, cartography, dialling (that is the design and construction of sundials), writing astrological prognostica, almanacs and calendars, and gauging, amongst others. We know that Kepler carried out a lot of these tasks but as far as I know he was never employed as a gauger, that is, a man responsible for measuring and/or calculating the volume of barrels and their contents.

Nowadays, with the wooden barrel degraded to the role of garden ornament in the forecourts of kitschy country pubs, it is hard for people to imagine that for more than half a millennium the art of gauging and the profession of the gauger were a widespread and important part of the political and business life of Europe. Wooden barrels first made their appearance during the Iron Age, that is sometime during the first millennium BCE, iron making it possible to produce tools with which craftsmen could work and shape the hard woods used to make barrels. It seems that we owe the invention of the barrel to the Celtic peoples of Northern Europe, who were making wooden barrels at least as early as five hundred BCE, although wooden buckets go back much earlier, the earliest known one being from Egypt, 2690 BCE. The early wooden buckets were carved from single blocks of wood, unlike barrels, which are made from staves assembled and held together with hoops of saplings, rope or iron.

Source: Wood, Whiskey and Wine: A History of Barrels by Henry H. Work

The ancient Greeks and Romans used large clay vessels called amphorae to transport goods, in particular liquids such as wine and oil.

Roman Amphorae
Source: Wikimedia Commons

However, by about two hundred to three hundred CE the Romans, to whom we owe our written knowledge (supported by archaeological finds) of the Celtic origins of barrel making, were transporting wine in barrels. Wooden barrels appear to be a uniquely European invention, appearing in other parts of the world only when introduced by Europeans.

By the Middle Ages wooden barrels had become ubiquitous throughout Europe, used for transporting and storing a bewildering range of both dry and wet goods, including books and corpses, the latter conserved in alcohol. With the vast increase in trade, both national and international, came the problem of taxes and customs duties at borders or at town gates. Wine, beer and spirits were taxed according to volume, and the tax officials were faced with the problem of determining the volumes of the diverse barrels that poured daily across borders or through town gates: enter the gauger and the gauging rod.

Gauger with gauging rod

The simplest method of determining the volume of liquid contained in a barrel would be to pour out the contents into a measuring vessel. This was of course not a viable choice for tax or customs officials, so something else had to be done. Because of its shape, determining the volume of a barrel-shaped container is not a simple geometrical exercise like determining the volume of a cylinder, sphere or cube, so the mathematicians had to find another way. The solution was the gauging rod. This is a rod marked with a scale that was inserted diagonally into the barrel through the bung hole; by reading off the number on the scale the gauger could arrive at a good approximation of the volume of fluid in the barrel and then calculate the tax or customs duty due. From some time in the High Middle Ages through to the nineteenth century gaugers with their gauging rods and gauging slide rules were a standard part of the European trade landscape.
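
The principle behind the rod can be made concrete with a toy calculation. The sketch below is my own illustration, not the historical gaugers’ actual rule: it assumes the cask is a plain cylinder whose height is a fixed multiple k of its diameter, and that the rod is read from the bung hole at mid-height to the far rim of one head.

```python
from math import pi, sqrt

def volume_from_diagonal(d, k=1.5):
    """Toy gauger's estimate: model the cask as a cylinder of diameter D and
    height h = k*D. The rod runs from the bung hole (mid-height, on the side)
    to the far rim of one head, so d**2 = (h/2)**2 + D**2. Solve for D, then V."""
    D = d / sqrt(1 + (k / 2) ** 2)
    h = k * D
    return pi * (D / 2) ** 2 * h

# A rod reading of 1.0 m on a cask of these proportions gives roughly 0.6 m^3.
print(round(volume_from_diagonal(1.0), 3))
```

Because, for casks of fixed proportions, the estimated volume grows with the cube of the rod reading, a rod could be marked directly in volume units for barrels of standard shape, which is exactly what made the method so quick and so approximate.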

A gauging slide rule

The mathematical literature on the art of gauging, particularly from the Early Modern Period, is vast. As a small side note, Antonie van Leeuwenhoek, the famous seventeenth-century microscopist, also worked for a time as a gauger for the City of Delft.

A Cooper, Jan Luyken

However, after this brief excursion into the history of barrels and barrel gauging it is time to turn attention back to Kepler and his Nova stereometria doliorum vinariorum. In 1613, now living in Linz, Kepler purchased some barrels to lay in a supply of wine for his family. The wine dealer filled the casks and proceeded to measure the volume they contained using a gauging rod. Kepler, a notoriously exacting mathematician, was horrified by the inaccuracy of this method of measurement and set about immediately to see if he could produce a better mathematical method of determining the volume of barrels. Returning to the Eudoxian/Archimedean method of exhaustion that he had utilized to determine his second law of planetary motion, he presented the volume of the barrel as the sum of a potentially infinite series of slices through the barrel. In modern terminology he used integral calculus to determine the volume. Never content to do half a job, Kepler extended his mathematical investigations to determining the volumes of a wide range of three-dimensional containers and his efforts developed into a substantial book. Because he lacked the necessary notions of limits and convergence when summing infinite series, Kepler’s efforts lack mathematical rigour, as had his determination of his second law, a fact that Kepler was more than aware of. However, as with his second law, he was prepared to sacrifice rigour for a practical functioning solution and to leave it to posterity to clean up the mess.
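
In modern terms Kepler’s slicing amounts to a crude numerical integration of a solid of revolution. The following sketch is purely illustrative, not a reconstruction of Kepler’s actual constructions (the parabolic profile is my assumption): the barrel is cut into thin discs and their volumes summed, just as Kepler summed his slices.

```python
from math import pi

def barrel_volume(r_mid, r_end, height, slices=10000):
    """Approximate the volume of a barrel-shaped solid of revolution by
    summing thin cylindrical slices, in the spirit of Kepler's method.
    The profile bulges from r_end at the heads to r_mid at the middle,
    modelled here, purely for illustration, as a parabola."""
    dz = height / slices
    volume = 0.0
    for i in range(slices):
        z = (i + 0.5) * dz - height / 2      # distance of the slice from the middle
        r = r_mid - (r_mid - r_end) * (2 * z / height) ** 2
        volume += pi * r * r * dz            # volume of one thin disc
    return volume

# A cask 1.0 m high, 0.4 m radius at the bulge and 0.3 m at the heads:
print(round(barrel_volume(0.4, 0.3, 1.0), 3))  # roughly 0.425 m^3
```

The more slices you take, the closer the sum gets to the exact volume; what Kepler lacked, as noted above, was a rigorous account of that limiting process.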

Having devoted so much time and effort to the task Kepler decided to publish his studies and immediately ran into new problems. There was at the time no printer/publisher in Linz, so Kepler was forced to send his manuscript to Markus Welser, a rich trader and science patron from Augsburg, who had initiated the sunspot dispute between Galileo and Christoph Scheiner, to get his book published there. Unfortunately none of the printer/publishers in Augsburg were prepared to take on the risk of publishing the book and when Welser died in 1614 Kepler had to retrieve his manuscript and make other arrangements. In 1615 he fetched the printer Johannes Plank from Erfurt to Linz and paid him to print the book at his own cost. Unfortunately it proved to be anything but a best seller, leaving Kepler with a loss on his efforts. In order to make his new discoveries available to a wider audience Kepler produced a very much simplified German edition in the same year under the title Ausszug auss der Vralten Messkunst Archimedis (Excerpts from the ancient art of mensuration by Archimedes). This book is important in the history of mathematics for providing the first German translations of numerous Greek and Latin mathematical terms. Plank remained in Linz and became Kepler’s house publisher during his time there.

Ausszug auss der Vralten Messkunst Archimedis title page

Although not one of his most successful works, Kepler’s Nova stereometria doliorum is historically important for two different reasons. It was the first book to present a systematic study of the volumes of barrels based on geometrical principles and it also plays an important role in the history of infinitesimal calculus.

 


Filed under History of Computing, History of Mathematics

Christmas Trilogy 2015 Part 2: Understanding the Analytical Engine.

The Acolytes of the Holy Church of Saint Ada still persist in calling her a brilliant mathematician and the ‘first computer programmer’ despite the fact that both claims are provably wrong. In fact they have now moved into the realm of denialists, similar to evolution or climate denialists, in that they accuse people like myself, who point to the historical facts, of being male chauvinists who are trying to deny women their rights in the history of science! However the acolytes have gone a step further in the adulation of Lady King in that they now claim that she understood the Analytical Engine better than Babbage! Confronted by this patently ridiculous claim I’m not sure whether to laugh or cry. Babbage conceived, designed and attempted to construct parts of the Analytical Engine, whereas Ada Lovelace merely wrote an essay about it based on her exchanges with Babbage on the subject; to suggest that she understood the machine better than its sole creator borders on the insane. I cannot be certain who first set this bizarre claim in the world, as nearly all of those who repeat it give neither justification nor source for their utterances, but the most often quoted in this context is James Essinger and his biography of Ada, which appears to enjoy several different titles[1].

Trial model of a part of the Analytical Engine, built by Babbage, as displayed at the Science Museum (London).
Source: Wikimedia Commons

Before going into detail it should be pointed out that Essinger’s book, which is popular rather than academic and thus lacks sources for many of his claims, suffers from two fundamental flaws. Like much pro-Ada writing it doesn’t delve deep enough into the life and work of Charles Babbage. This type of writing tends to treat Babbage as an extra in the film of Ada’s life, whereas in reality, in relation to the Analytical Engine, it is Ada who is a minor character in Babbage’s life. Also Essinger writes about the translation of the Menabrea essay on the Analytical Engine as if the appended notes were exclusively the product of Ada’s brain, whereas it is an established fact from the correspondence that they were very much a co-production between Babbage and Lovelace, based on many exchanges both in personal conversations and in that correspondence. This means that in basing any argument on any idea contained in those notes the writer has the job of determining which of the two would be the more probable source of that idea, and not simply blindly attributing it to Ada. As we shall see, Essinger’s failure to do this leads to a major flaw in his central argument that Ada understood the Analytical Engine better than Babbage.

Essinger’s approach is two pronged. On the one side he claims that Babbage didn’t understand the future potential of the machine that he, and he alone, conceived and created (on paper at least), and on the other he proposes, on the basis of his interpretation of Note A of the essay, that Ada, whom he assumes to be the originator of the thoughts this note contains, had a vision of the Analytical Engine equivalent to modern computer science. As we shall see, Essinger is mistaken on both counts.

Whilst offering absolutely no source for his claim, Essinger states time and again throughout his book that Babbage only ever conceived of the Analytical Engine as a device for doing mathematics, a super number cruncher so to speak. If Essinger had taken the trouble to elucidate the origins of Babbage’s inspiration for the Analytical Engine he would know that he is seriously mistaken in this view, although in one sense he is right in thinking that Babbage concentrated on the mathematical aspects of the Engine, but for reasons that Essinger doesn’t consider anywhere in his book.

Babbage lived in the middle of the Industrial Revolution and was fascinated by mechanisation and automation throughout his entire life. During the 1820s Babbage travelled throughout the British Isles visiting all sorts of industrial plant to study and analyse their uses of mechanisation and automation. In 1827 his wife, Georgiana, died and Babbage, who had married for love against the opposition of his father, was grief stricken. Leaving Britain to escape the scene of his sorrow, Babbage, by now a rich man having inherited his father’s fortune, spent many months touring the continent carrying out the same survey of the industrial advances in mechanisation and automation wherever his wanderings took him. It was on this journey that he first learnt of the automated Jacquard loom that would supply him with the idea of programming the Analytical Engine with punch cards. Returning to Britain, Babbage turned all those years of research into a book, On the Economy of Machinery and Manufactures, published in 1832, that is a year before he met Ada Lovelace for the first time and ten years before the Menabrea essay was written. The book was a massive success, going through six editions in quick succession and influencing the work of Karl Marx and John Stuart Mill amongst others. It would be safe to say that in 1832 Babbage knew more about mechanisation and automation than almost anybody else on the entire planet: what it was capable of doing and which activities could be mechanised and/or automated. It was in this situation that Babbage decided to transfer his main interest from the Difference Engine to developing the concept of the Analytical Engine, conceived from the very beginning as a general-purpose computer capable of carrying out everything that could be accomplished by such a machine, far more than just a super number cruncher.


What is true, however, is that Babbage did concentrate in his plans and drafts, and the Analytical Engine never got past the plans and drafts phase, on the mathematical aspects of the machine. This however does not mean that Babbage considered it purely as a mathematical machine. I am writing this post on a modern state of the art computer. I also use the same device to exchange views with my history of science peers on Twitter and Facebook and to post my outpourings, such as this one, on my Internet blog. I can telephone people all over the world, with visual contact if I choose, using Skype. At the touch, or two, of a keyboard key I have access to dictionaries, encyclopaedias and all sorts of other reference tools, and through various means I can exchange documents, photographs, sound files and videos with anybody who owns a similar device. I can listen to and watch all sorts of music recordings and videos and, with easily accessible software, even turn my computer into an unbelievably flexible musical instrument. Finally, when I’m done for the day, I can settle back and watch television on my large, high-resolution monitor screen. This is only a fraction of the tasks that my computer is capable of carrying out, but they all have one thing in common: they can only be accomplished if they are capable of being coded into an astoundingly banal logical language consisting only of ‘0s’ and ‘1s’. Of course between the activities I carry out on my monitor screen and the electrical circuits that are only capable of reading those ‘0s’ and ‘1s’ there are layer upon layer of so-called sub-routines and sub-sub-routines and sub-sub-sub…, you get the idea, each translating an upper layer into a simpler logical form until we get all the way down to those ubiquitous ‘0s’ and ‘1s’. The language in which those ‘0s’ and ‘1s’ exist is a mathematical language, known as Boolean algebra, and so in the final analysis my super smart ultra modern computer is nothing but a super number cruncher, and one working with only two numbers at that.
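
As a trivial modern illustration of that reduction, and my example has of course nothing to do with Babbage’s actual hardware, here is how an ordinary piece of text and an ordinary logical operation look once everything has been pushed down to the level of ‘0s’ and ‘1s’:

```python
# Any data a modern computer handles ends up as strings of 0s and 1s ...
text = "Ada"
print([format(b, "08b") for b in text.encode("ascii")])
# ['01000001', '01100100', '01100001']

# ... and the operations on those digits are the operations of Boolean algebra.
a, b = 1, 0
print(a & b, a | b, a ^ b)  # AND, OR, XOR on single bits: prints 0 1 1
```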

Babbage, a brilliant mathematician, was well aware that he could only programme his Engine to carry out tasks that could be reduced, over a series of steps, to a mathematical language, and this is the reason he concentrated on the mathematical aspects of his machine; but this by no means meant that he conceived of it as only carrying out mathematical tasks, as we will see when addressing Essinger’s second prong.

Essinger quotes the following passage from Note A of the Menabrea translation:

In studying the action of the Analytical Engine, we find that the peculiar and independent nature of the considerations which in all mathematical analysis belong to operations, as distinguished from the objects operated upon and from the results of the operations performed upon those objects, is very strikingly defined and separated.

It is well to draw attention to this point, not only because its full appreciation is essential to the attainment of any very just and adequate general comprehension of the powers and mode of action of the Analytical Engine, but also because it is one which is perhaps too little kept in view in the study of mathematical science in general. It is, however, impossible to confound it with other considerations, either when we trace the manner in which that engine attains its results, or when we prepare the data for its attainment of those results. It were much to be desired, that when mathematical processes pass through the human brain instead of through the medium of inanimate mechanism, it were equally a necessity of things that the reasonings connected with operations should hold the same just place as a clear and well-defined branch of the subject of analysis, a fundamental but yet independent ingredient in the science, which they must do in studying the engine. The confusion, the difficulties, the contradictions which, in consequence of a want of accurate distinctions in this particular, have up to even a recent period encumbered mathematics in all those branches involving the consideration of negative and impossible quantities, will at once occur to the reader who is at all versed in this science, and would alone suffice to justify dwelling somewhat on the point, in connexion with any subject so peculiarly fitted to give forcible illustration of it as the Analytical Engine.

Attributing its contents to Ada he makes the following comment, “What Ada is emphasising here is the clear distinction between data and data processing: a distinction we tend to take for granted today, but which – like so much of her thinking about computers – was in her own day not only revolutionary but truly visionary”. What is being described here was indeed new in Ada’s day but was a well-known development in mathematics, known at the time as the Calculus of Operations, a branch of mathematics developed in the first half of the nineteenth century, which distinguishes the operations themselves from the objects operated upon, and in which Babbage worked and to which he made contributions. If the ideas contained in this passage are indeed visionary then the vision is Babbage’s being channelled by Ada and not originating with her. The words might be Ada’s but the thoughts they express are clearly Babbage’s.
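
For readers unfamiliar with it, the calculus of operations treated the symbols of operation themselves as objects of algebraic manipulation, quite separate from the quantities they act upon. A standard textbook example of the style, given here in modern notation rather than as a quotation from Babbage or his circle, is the symbolic form of Taylor’s theorem, in which the differentiation operator D is handled as if it were an algebraic quantity:

```latex
\[
f(x+h) = f(x) + hDf(x) + \frac{h^{2}}{2!}D^{2}f(x) + \cdots = e^{hD}f(x)
\]
```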

Essinger now quotes the next part of the Note:

It may be desirable to explain, that by the word operation, we mean any process which alters the mutual relation of two or more things, be this relation of what kind it may. This is the most general definition, and would include all subjects in the universe. In abstract mathematics, of course operations alter those particular relations which are involved in the considerations of number and space, and the results of operations are those peculiar results which correspond to the nature of the subjects of operation. But the science of operations, as derived from mathematics more especially, is a science of itself, and has its own abstract truth and value; just as logic has its own peculiar truth and value, independently of the subjects to which we may apply its reasonings and processes.

Essinger now reaches maximum bullshit level, “Ada is seeking to do nothing less than invent the science of computing and separate it from the science of mathematics. What she calls ‘the science of operations’ is indeed in effect computing”. As I have already explained, what she calls the ‘science of operations’ is in fact the Calculus of Operations, a new but well-developed branch of mathematics of which Babbage was fully cognisant. If anybody is inventing the science of computing it is once again Babbage and not Ada.

Essinger now takes up the case further along in Note A:

The distinctive characteristic of the Analytical Engine, […]is the introduction into it of the principle which Jacquard devised for regulating, by means of punched cards, the most complicated patterns in the fabrication of brocaded stuffs… […]The bounds of arithmetic [emphasis in original] were however outstepped the moment the idea of applying the cards had occurred; and the Analytical Engine does not occupy common ground with mere “calculating machines.” It holds a position wholly its own; and the considerations it suggests are most interesting in their nature. In enabling mechanism to combine together general [emphasis in original] symbols in successions of unlimited variety and extent, a uniting link is established between the operations of matter and the abstract mental processes of the most abstract [emphasis in original] branch of mathematical science. [Ellipsis in quote by Essinger]

Essinger introduces this quote with the following: “In a terse passage she explains (perhaps better than Babbage ever could, who as designer saw many trees but perhaps no longer the forest itself) the essential relationship between the Analytical Engine and the Jacquard loom and how it is different from the earlier invention”. After the quote he then writes: “In perhaps one of the most visionary sentences written during the nineteenth century [he sure doesn’t hold back on the hyperbole], she lays out what these cards shall be capable of doing by way of programming the machine”.

First off, if you put back the bits Essinger removed from this passage it is anything but terse, in fact it’s rather verbose. Is Essinger really trying to tell us that Babbage was not aware of what he was doing when he conceived of programming his Engine with punch cards? Unfortunately for Essinger, Babbage himself tells us that this is not the case. Writing in his notebook on 10 July 1836, that is six years before the original French version of the Menabrea essay was published, he has the following to say:

This day I had for the first time a general but very indistinct conception of the possibility of making the engine work out algebraic developments – I mean without any reference to the value of the letters. My notion is that as the cards (Jacquards) of the calc. engine direct a series of operations and the recommence with the first…[2]

Here we have in Babbage’s own words the germ of the idea contained in the Ada quote, an idea that would naturally mature over the intervening seven years before Ada wrote her piece, so I have no problems whatsoever in again attributing the thoughts contained here to Babbage.

I’m not going to go on analysing Essinger’s Ada hagiography: for almost all of the things that he attributes to Ada it is not difficult to find their origins in Babbage’s work, thus reinforcing the claim I made in an earlier post that Ada is being used here as Babbage’s mouthpiece. Not so much the originator as the parrot. I will however close with one last quote from Note A and Essinger’s comment on it, to demonstrate that his grasp of the history of science in the nineteenth century is apparently almost non-existent. Without really introducing it, Essinger quotes the following sentence:

Those who view mathematical science, not merely as a vast body of abstract and immutable truths, whose intrinsic beauty, symmetry and logical completeness, when regarded in their connexion together as a whole, entitle them to a prominent place in the interest of all profound and logical minds, but as possessing a yet deeper interest for the human race, when it is remembered that this science constitutes the language through which alone we can adequately express the great facts of the natural world, and those unceasing changes of mutual relationship which, visibly or invisibly, consciously or unconsciously to our immediate physical perceptions, are interminably going on in the agencies of the creation we live amidst: those who thus think on mathematical truth as the instrument through which the weak mind of man can most effectually read his Creator’s works, will regard with especial interest all that can tend to facilitate the translation of its principles into explicit practical forms.

Essinger wonderingly comments on this sentence, “This 158-word sentence is very likely one of the longest sentences in the history of science, but it is also one of the most intriguing. Ada succeeds in this one sentence in linking mathematics, science, religion and philosophy.” Any competent historian of science would immediately recognise this as a rather flowery expression of the basic tenets of natural theology, a philosophy that flourished in the first half of the nineteenth century. This statement could have been made by a very large number of natural philosophers, starting with Isaac Newton and going up to and beyond William Whewell and Charles Babbage, for example in the dispute that I outlined on this day last year. What this example clearly illustrates is that Essinger is in no way a real historian who researches and understands his sources but one who thinks he can read the text of Note A and interpret it on the basis of his lack of knowledge rather than on his possession of it.

[1] The copy I read was James Essinger, A Female Genius: How Ada Lovelace, Lord Byron’s Daughter, Started the Computer Age, London, 2015

[2] Babbage notebook quote taken from Dorothy Stein, Ada: A Life and a Legacy, MIT Press, Cambridge, Massachusetts & London, 1985, p. 102


Filed under History of Computing, Myths of Science

A double bicentennial – George contra Ada – Reality contra Perception

The end of this year sees a double English bicentennial in the history of computing. On 2 November we celebrate the two hundredth anniversary of the birth of the mathematician and logician George Boole, then on 10 December the two hundredth anniversary of the birth of ‘science writer’ Augusta Ada King, Countess of Lovelace. It is an interesting exercise to take a brief look at how these two bicentennials are being perceived in the public sphere.

As I have pointed out in several earlier posts, Ada was a member of the minor aristocracy, who, although she never knew her father, had a wealthy, well-connected mother. She had access to the highest social and intellectual circles of early Victorian London. Despite being mentored and tutored by the best that London had to offer she failed totally in mastering more than elementary mathematics. So, as I have also pointed out more than once, to call her a mathematician is a very poor quality joke. Her only ‘scientific’ contribution was to translate a memoire on Babbage’s Analytical Engine from French into English, to which are appended a series of new notes. There is very substantial internal and external evidence that these notes in fact stem from Babbage and not Ada and that she only gave them linguistic form. What we have here is basically a journalistic interview and not a piece of original work. It is a historical fact that she did not write the first computer programme, as is still repeated ad nauseam every time her name is mentioned.

However, the acolytes of the Cult of the Holy Saint Ada are banging the advertising drum for her bicentennial on a level comparable to that accorded to Einstein for the centenary of the General Theory of Relativity. On social media ‘Finding Ada’ are obviously planning massive celebrations, which they have already announced, although their exact nature has yet to be revealed. More worrying is the publication of the graphic novel The Thrilling Adventures of Lovelace and Babbage: The (Mostly) True Story of the First Computer (note who gets first billing!) by the animator and cartoonist Sydney Padua. The Analytical Engine was of course not the first computer; that honour goes to Babbage’s Difference Engine. More importantly, Padua’s novel is not even remotely ‘mostly’ true but largely fictional. This wouldn’t matter that much if said book had not received major media attention, attention that compounded the error by conveniently forgetting the ‘mostly’. The biggest lie in the work of fiction is the claim that Ada was somehow directly involved in the conception and construction of the Analytical Engine. In reality she had absolutely nothing to do with either its conception or its construction.

This deliberate misconception has been compounded by an attempt, widely disseminated on social media, to get support for a Lovelace and Babbage Analytical Engine Lego set. The promoter of this enterprise has written in his blurb:

Ada Lovelace (1815-1852) is widely credited as the first computer scientist and Charles Babbage (1791-1871) is best remembered for originating the concept of a programmable computer. Together they collaborated on Babbage’s early mechanical general-purpose computer, the Analytical Engine.

Widely credited by whom? If anybody is the first computer scientist in this set-up then it’s Babbage. Others, such as Leibniz, speculated on what we now call computer science long before Ada was born, so I think that is another piece of hype that we can commit to the trashcan. Much more important is the fact that they did not collaborate on the Analytical Engine; that was solely Babbage’s baby. This factually false hype is compounded in the following tweet from 21 July, which linked to the Lego promotion:

Historical lego [sic] of Ada Lovelace’s conception of the first programmable computer

To give some perspective to the whole issue it is instructive to ask about what in German is called the ‘Wirkungsgeschichte’, best translated as historical impact, of Babbage’s efforts to promote and build his computers, including the by now notorious Menabrea memoire, irrespective of who actually formulated the added notes. The impact of all of Babbage’s computer endeavours on the history of the computer is almost nothing. I say almost because, due to Turing, the notes did play a minor role in the early phases of the post World War II artificial intelligence debate. However one could get the impression from the efforts of the Ada Lovelace fan club, strongly supported by the media, that this was a highly significant contribution to the history of computing that deserves to be massively celebrated on the Lovelace bicentennial.

Let us now turn our attention to the subject of our other bicentennial celebration, George Boole. Born into a working class family in Lincoln, Boole had little formal education. However his father was a self-educated man with a thirst for knowledge, who instilled the same characteristics in his son. With some assistance he taught himself Latin and Greek and later French, German and Italian in order to be able to read advanced continental mathematics. His father went bankrupt when he was 16 and he became the breadwinner for the family, taking a post as schoolmaster in a small private school. When he was 19 he set up his own small school. Using the library of the local Mechanics’ Institute he taught himself mathematics. In the 1840s he began to publish original mathematical research in the Cambridge Mathematical Journal with the support of Duncan Gregory, a great-great-grandson of Newton’s contemporary James Gregory. Boole went on to become one of the leading British mathematicians of the nineteenth century and despite his total lack of formal qualifications he was appointed Professor of Mathematics at the newly founded Queen’s College, Cork in 1849.

Although Boole is a fascinating figure in the history of mathematics generally, it is Boole the logician who interests us here. In 1847 Boole published the first version of his logical algebra in the form of a largish pamphlet, The Mathematical Analysis of Logic. This was followed in 1854 by an expanded version of his ideas in his An Investigation of the Laws of Thought, on which are founded the Mathematical Theories of Logic and Probability. These publications contain the core of Boolean algebra, although its final form was actually produced by Stanley Jevons; it was only the second non-standard algebra ever to be developed, the first being Hamilton’s quaternions. For non-mathematical readers, standard algebra is the stuff we all learned (and loved!) at school. Boolean algebra was Boole’s greatest contribution to the histories of mathematics, logic and science.
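
To give a flavour of what ‘non-standard’ means here, these are some of the laws of Boolean algebra, written in modern notation rather than Boole’s own; none of them holds in the ordinary algebra of school arithmetic:

```latex
\begin{gather*}
x \land x = x, \qquad x \lor x = x \qquad \text{(idempotence)}\\
x \lor (x \land y) = x \qquad \text{(absorption)}\\
\lnot(x \land y) = \lnot x \lor \lnot y \qquad \text{(De Morgan)}
\end{gather*}
```

Boole himself expressed the first of these as his ‘index law’, x² = x, an equation whose only numerical solutions are 0 and 1, which is where the two-valued character of the algebra comes from.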

When it first appeared Boole’s logic was largely ignored as an irrelevance but as the nineteenth century progressed it was taken up and developed by others, most notably by the German mathematician Ernst Schröder, and provided the tool for much early work in mathematical logic. Around 1930 it was superseded in this area by the mathematical logic of Whitehead and Russell’s Principia Mathematica. Boole’s algebraic logic seemed destined for the novelty scrap heap of history until a brilliant young American mathematician wrote his master’s thesis.

Claude Shannon (1916–2001) was a postgrad student of electrical engineering under Vannevar Bush at MIT, working on Bush’s electro-mechanical computer, the differential analyzer. Having learnt Boolean algebra as an undergraduate, Shannon realised that it could be used for the systematic and logical design of electrical switching circuits. In 1937 he completed his master’s thesis on the subject, published the following year as A Symbolic Analysis of Relay and Switching Circuits. Shannon’s switching algebra, which is applied Boolean algebra, would go on to supply the basis of the hardware design of all modern computers. When people began to write programs for the computers designed with Shannon’s switching algebra it was only natural that they would use Boole’s two-valued (1/0, true/false, on/off) algebra to write those programs. Almost all modern computers are, in both their hardware and their software, applied Boolean algebra. One can argue, as I have actually done somewhat tongue in cheek in a lecture, that George Boole is the ‘father’ of the modern computer. (Somewhat tongue in cheek, as I don’t actually like the term ‘father of’). The modern computer has of course many fathers and mothers.
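
Shannon’s insight can be illustrated with a minimal, and of course anachronistic, sketch; he was designing relay circuits, not writing software, but the logic is the same. The functions below, my own illustration rather than anything from Shannon’s paper, build binary addition out of nothing but the Boolean operations AND, OR and XOR:

```python
def half_adder(a, b):
    """Add two bits; return (sum_bit, carry_bit) using only XOR and AND."""
    return a ^ b, a & b

def full_adder(a, b, carry_in):
    """Add two bits plus an incoming carry, again using only Boolean operations."""
    s1, c1 = half_adder(a, b)
    s2, c2 = half_adder(s1, carry_in)
    return s2, c1 | c2

def add4(x, y):
    """Chain four full adders to add two 4-bit numbers, bit by bit."""
    result, carry = 0, 0
    for i in range(4):
        bit, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        result |= bit << i
    return result | (carry << 4)

print(bin(add4(0b0101, 0b0011)))  # 5 + 3: prints 0b1000
```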

In George Boole, as opposed to Babbage and Lovelace, we have a man whose work made a massive real contribution to the history of the computer, and although both the Universities of Cork and Lincoln are planning major celebrations for his bicentennial they have been, up till now, largely ignored by the media, with the exception of the Irish newspapers, who are happy to claim Boole, an Englishman, as one of their own.

The press seems to have decided that a ‘disadvantaged’ (she never was, as opposed to Boole) female ‘scientist’, who just happens to be Byron’s daughter, is more newsworthy in the history of the computer than a male mathematician, even if she contributed almost nothing and he contributed very much.


Filed under History of Computing, History of Mathematics, Ladies of Science, Myths of Science

Creating a holy cow.

Whenever I think that the deification of Ada Lovelace can’t get any more ridiculous somebody comes along and ups the ante. The latest idiocy was posted on Twitter by the comedian Stephen Fry (of whom I’m a big fan!). Mr Fry tweeted:

Ada Lovelace & Alan Turing for the next £20 note! Nominate here [link removed] Heroic pioneers in the face of prejudice. [my emphasis]

My comments will only concern Augusta Ada King, Countess of Lovelace, although the comment I have highlighted also has issues when applied to Alan Turing.

Heroic pioneers in the face of prejudice. Let us briefly examine the prejudice that the Countess of Lovelace, née Byron, suffered. Born into the English aristocracy, she unfortunately lost her “mad, bad and dangerous to know” father at the tender age of one month. However her mother’s family were extremely wealthy, the main reason Byron, who was destitute, had married her, and so Ada lacked for nothing throughout her childhood. It should also be pointed out that her mother enjoyed a very high social status, despite her disastrous marriage.

She was, as a young woman, tutored and mentored by the elite of the scientific community in Victorian London, including Charles Babbage, Augustus De Morgan, Sir Charles Wheatstone and Mary Somerville, all of whom helped and encouraged her in her scientific studies. She married the wealthy Baron William King, who was soon elevated to Earl of Lovelace and who also supported her scientific endeavours without any restrictions. Somehow I fail to see to what the term prejudice could possibly be referring. Rich, pampered and supported by the very elite of London’s scientific community doesn’t sound like prejudice to me.

It was Wheatstone who suggested that she translate the Menabrea memoire on the Analytical Engine in emulation of her mentor Mary Somerville’s translation of Laplace, a far greater and much more complex work. So there is no suggestion of the pioneer here. Somerville herself was just one of several women, albeit the greatest, who wrote works popularizing the mathematical sciences in England in the first half of the nineteenth century. So Ada was in no way a pioneer but rather following the crowd.

It might be argued that her annotations to the memoire qualify her as a pioneer; however, I remain firmly convinced that the notes were very much a Babbage-Lovelace co-production, with Babbage providing the content and Lovelace the turns of phrase. At best she was a scientific journalist or communicator. The pioneer was Babbage. There is strong evidence to support this interpretation, which gets swept under the carpet by the acolytes of the Cult of the Holy Saint Ada.

I shall be writing a longer post on one central aspect of the cult’s mythologizing later in the summer, so stay tuned.


Filed under History of Computing, Myths of Science

The worst history of technology headline of the year?

The Guardian website produced a couple of articles to announce the publication of Sydney Padua’s graphic novel, The Thrilling Adventures of Lovelace and Babbage: The (Mostly) True Story of the First Computer. I strongly suspect that, despite Padua’s qualifying ‘mostly’ in her subtitle, what we will be presented with here bears very little relation to the historical facts. However, as I have not actually read the book, it is not the subject of this brief post; that is rather the Guardian article. This article is crowned with the following headline:

Ada Lovelace and Charles Babbage designed a computer in the 1840s.

A cartoonist finishes the project.

Can you spot the major howler in the very brief first sentence? Who designed a computer? Charles Babbage designed a computer. Ada Lovelace wrote a puff piece about that computer, which was in all probability largely ghost-written by Babbage. Just in case you should think that this was an inadvertent slip of a subeditor’s thumb on his computer keyboard, the claim is repeated even more emphatically in the title of an illustration to the article.

200 years after Ada Lovelace’s birth, the Analytical Engine she designed with Charles Babbage is finally built, thanks to the imagination of Sydney Padua. Illustration: The Observer

In case you should still think that the writer of the piece could or should be excused of all blame, a victim of the hyperbolic flights of fancy of that technology-history-ignorant subeditor, we find the following in the main body of the article.

Brought up to shun the lure of poetry and revel instead in numbers, Lovelace teamed up with mathematician Charles Babbage who had grand plans for an adding machine, named the Difference Engine, and a computer called the Analytical Engine, for which Lovelace wrote the programs.

Where to begin? First off, both the Difference Engine and the Analytical Engine are computers, the former a special purpose computer and the latter a general purpose one. Babbage would have been deeply offended to have his mighty Difference Engine denigrated to a mere adding machine, although all computers are by name adding machines, ‘computer’ coming, as it does, from the Latin computare, which means to reckon/compute/calculate, sum/count (up). As a brief aside, when the word computer was coined in the seventeenth century it referred to a person employed to do calculations. Second, and in this context most important, Lovelace did not write the programs for the Analytical Engine. The aforementioned puff piece from her pen contained one, note the singular, specimen program for the Analytical Engine, which she might possibly have written, although it seems more probable that Babbage wrote it. All the other programs for the Analytical Engine, and there were others, were written by, you’ve guessed it, Charles Babbage.

The deification of Ada Lovelace marches on apace, with the honest historian of the computer barely able to keep up with the waves of mythology that pour out of the unsavvy media almost every day, it seems.


Filed under History of Computing, Myths of Science

Mega inanity

Since the lead-up to the Turing centennial in 2012, celebrating the birth of one of the great meta-mathematicians of the twentieth century, Alan Mathison Turing, I have observed with increasing horror the escalating hagiographic accounts of Turing’s undoubted historical achievements and the resulting perversion of the histories of twentieth-century science, mathematics and technology, and in particular the history of computing.

This abhorrence on my part is not based on a mere nodding acquaintance with Turing’s name but on a deep and long-time engagement with the man and his work. I served my apprenticeship as a historian of science over many years in a research project on the history of formal or mathematical logic. Formal logic is one of the so-called formal sciences, the others being mathematics and informatics (or computer science). I have spent my whole life studying the history of mathematics with a special interest in the history of computing, both in its abstract form and in its technological realisation in all sorts of calculating aids and machines. I also devoted a substantial part of my formal study of philosophy to the philosophy of mathematics and the logical, meta-logical and meta-mathematical problems that this discipline, some would say unfortunately, generates. All of these intellectual streams flow together in the first half of the twentieth century in the work of such people as Leopold Löwenheim, Thoralf Skolem, Emil Post, Alfred Tarski, Kurt Gödel, Alonzo Church and Alan Turing amongst others. These people created a new discipline known as meta-mathematics whilst carrying out a programme delineated by David Hilbert.

Attempts to provide a solid foundation for mathematics using set theory and logic had run into serious problems with paradoxes. Hilbert thought the solution lay in developing each mathematical discipline as a strict axiomatic system and then proving that each axiomatic system possessed a set of required characteristics, thus ensuring the solidity and reliability of a given system. This concept of proving theorems about entire axiomatic systems is the meta- of meta-mathematics. The properties that Hilbert required of his axiomatic systems were consistency, which means the system should be shown to be free of contradictions; completeness, meaning that all of the theorems that belong to a particular discipline are deducible from its axiom system; and finally decidability, meaning that for any well-formed statement within the system it should be possible to produce an algorithmic process to decide whether the statement is true within the axiomatic system or not. An algorithm is like a cookery recipe: if you follow the steps correctly you will produce the right result.
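
To make the cookery-recipe analogy concrete, here is the textbook example of an algorithm in exactly that sense, Euclid’s procedure for the greatest common divisor; the example is mine and has nothing to do with Hilbert’s own programme:

```python
def gcd(a, b):
    """Euclid's algorithm: a finite, mechanical recipe that always terminates
    with the greatest common divisor of two positive integers."""
    while b != 0:
        a, b = b, a % b
    return a

print(gcd(1071, 462))  # prints 21
```

The decidability question was whether a mechanical recipe of this kind could be found that would decide any well-formed statement of an axiomatic system whatsoever.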

The meta-mathematicians listed above showed by very ingenious methods that none of Hilbert’s aims could be fulfilled, bringing the dream of a secure foundation for mathematics crashing to the ground. Turing’s solution to the problem of decidability is an ingenious thought experiment, for which he is justifiably regarded as one of the meta-mathematical gods of the twentieth century. It was this work that led to him being employed as a code breaker at Bletchley Park during WW II and eventually to the fame and disaster of the rest of his too short life.

Unfortunately the attempts to restore Turing’s reputation since the centenary of his birth in 2012 have led to some terrible misrepresentations of his work and its consequences. I thought we had reached a low point in the ebb and flow of the centenary celebrations but the release of “The Imitation Game”, the Alan Turing biopic, has produced a new series of false and inaccurate statements in the reviews. I was pleased to see several reviews which attempt to correct some of the worst historical errors in the film. You can read a collection of reviews of the film in the most recent edition of the weekly histories of science, technology and medicine links list Whewell’s Gazette. Not having seen the film yet I can’t comment on it myself, but I was stunned when I read the following paragraph from the ABC News review of the film written by Alyssa Newcomb. It’s so bad you can only file it under: you can’t make this shit up.

The “Turing Machine” was the first modern computer to logically process information, running on interchangeable software and essentially laying the groundwork for every computing device we have today — from laptops to smartphones.

Before I analyse this train wreck of a historical statement I would just like to emphasise that this is not the Little Piddlington School Gazette, whose enthusiastic but slightly slapdash twelve-year-old film critic got his facts a little mixed up, but a review that appeared on the website of a major American media company, and as such it is totally unacceptable however you view it.

The first compound statement contains a double whammy of mega-inane falsehood and I had real problems deciding where to begin, finally plumping for the “first modern computer to logically process information, running on interchangeable software”. Alan Turing had nothing to do with the first such machine, that honour going to Konrad Zuse’s Z3, which Zuse completed in 1941. The first such machine in whose design and construction Alan Turing was involved was the ACE produced at the National Physical Laboratory, in London, in 1949. In the intervening years Atanasoff and Berry, Tommy Flowers, Howard Aiken, as well as Eckert and Mauchly, had all designed and constructed computers of various types and abilities. To credit Turing with the sole responsibility for our digital computer age is not only historically inaccurate but also highly insulting to all the others who made substantial and important contributions to the evolution of the computer. Many, many more than I’ve named here.

We now turn to the second error contained in this wonderfully inane opening statement and return to the subject of meta-mathematics. The “Turing Machine” is not a computer at all; it is Alan Turing’s truly ingenious thought-experiment solution to Hilbert’s decidability problem. Turing imagined a very simple machine that consists of a scanning/reading head and an infinite tape that runs under the scanning head. The head can read the instructions on the tape and execute them, moving the tape right or left or doing nothing. The question then reduces to which sets of instructions on the tape eventually come to a stop (decidable) and which lead to an infinite loop (undecidable). Turing developed this idea into a machine capable of computing any computable function (a universal Turing Machine) and thus created a theoretical model for all computers. This is of course a long way from a practical, real mechanical realisation, i.e. a computer, but it does provide a theoretical measure with which to describe the capabilities of a mechanical computing device. A computer that is the equivalent of a universal Turing Machine is called Turing complete. For example, Zuse’s Z3 was Turing complete whereas Colossus, the computer designed and constructed by Tommy Flowers for decoding work at Bletchley Park, was not.
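
For readers who would like to see the idea in action, here is a minimal sketch of such a machine in Python, my own illustration rather than Turing’s notation: a table of instructions drives a read/write head along a tape, and the machine either halts or runs forever.

```python
def run(tape, transitions, state="scan", blank="_", max_steps=1000):
    """Simulate a tiny Turing machine. 'transitions' maps (state, read symbol)
    to (symbol to write, head move 'L'/'R', next state); the machine halts
    when no rule applies. For simplicity the tape only grows to the right."""
    tape, pos = list(tape), 0
    for _ in range(max_steps):
        symbol = tape[pos] if 0 <= pos < len(tape) else blank
        if (state, symbol) not in transitions:
            return "".join(tape)              # no rule applies: the machine halts
        write, move, state = transitions[(state, symbol)]
        if 0 <= pos < len(tape):
            tape[pos] = write
        else:
            tape.append(write)
        pos += 1 if move == "R" else -1
    raise RuntimeError("no halt within the step limit")

# Example machine: scan a binary string left to right, flip every bit, halt on a blank.
flipper = {
    ("scan", "0"): ("1", "R", "scan"),
    ("scan", "1"): ("0", "R", "scan"),
}

print(run("1011", flipper))  # prints 0100
```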

Turing’s work played and continues to play an important role in the theory of computation but historically it had very little effect on the development of real computers. Attributing the digital computer age to Turing and his work is not just historically wrong but is, as I already stated above, highly insulting to all of those who really did bring about that age. Turing is a fascinating, brilliant and, because of what he suffered under the persecution of homosexuals, tragic figure in the histories of mathematics, logic and computing in the twentieth century, but attributing achievements to him that he didn’t make does not honour his memory, which certainly should be honoured, but ridicules it.

I should, in fairness to the author of the film review that I took as the motivation for this post, say that she seems to be channelling misinformation from the film distributors, as I’ve read very similar stupid claims in other previews and reviews of the film.


Filed under History of Computing, History of Logic, History of Mathematics, Myths of Science

Oh please!

The latest move in the canonisation of Alan Turing is an opera, or whatever, written by the Pet Shop Boys, which is being heavily promoted by a PR campaign launched yesterday. As part of this press onslaught this magazine cover appeared on my Twitter stream today.


For the record, as a fan and one-time student of meta-mathematics I was aware of, and to some extent in awe of, Alan Turing long before most of the people now trying to elevate him into Olympus even knew he existed. He was without a shadow of a doubt one of the most brilliant logicians of the twentieth century and he, along with others of his ilk, such as Leopold Löwenheim, Thoralf Skolem, Emil Post, Kurt Gödel, Alonzo Church etc. etc., who laid the theoretical foundations for much of the computer age, all deserve to be much better known than they are; however, the attempts to adulate Turing’s memory have become grotesque. The Gay Man Who Saved the World is hyperbolic, hagiographic bullshit!

Turing made significant contributions to the work of Bletchley Park in breaking various German codes during the Second World War. He was one of nine thousand people who worked there. He did not work in isolation; he led a team that cracked one version of the Enigma Code. To what extent the work of Bletchley Park contributed to the eventual Allied victory is probably almost impossible to assess or quantify.

Alan Turing made significant contributions to the theory of meta-mathematics and an equally significant contribution to the British war effort. He did not, as is frequently claimed by the claque, invent the computer and he most certainly did not “save the world”. Can we please return to sanity in our assessment of our scientific heroes?


Filed under History of Computing, History of Mathematics, Myths of Science