Category Archives: Myths of Science

I expect better of you Beinecke

The Beinecke Rare Book & Manuscript Library is the rare book library and literary archive of the Yale University Library. Yesterday their Twitter account posted a tweet entitled Galileo Siderius Nunc, which linked to a blog post from July 11, 2022, by Raymond Clemens, Curator, Early Books & Manuscripts.

It featured one of Galileo’s famous washes of the Moon from his Sidereus Nuncius (1610) followed by a short text.

Above: Detail, p. 18. Galileo, Siderevs nvncivs, QB41 G33 1610, copy 2.

Our mini-exhibits end with the vitrine holding several copies of Galileo’s first printed images of the moon, the first ever made with the benefit of the telescope. For the first time, most Europeans were shown the dark side of the moon. Galileo’s sketches also emphasize its barren and rocky nature—well known to us today, but something of a revelation in the sixteenth century, when most people thought of the moon as another planet, thus generating its own light. Galileo was the first person to accurately depict the moons of Jupiter (which he called “Medicean stars,” after his patron, the Florentine Medici family). A photograph at the back of the vitrine was taken in 1968, before humans landed on the moon. It shows Earth as seen from the moon—the first time we saw our own planet from another astronomical body. This rough black and white image eerily resembles Galileo’s lunar landscape.

It is a mere 152 words long, not much room for errors, one might think, but one would be wrong.

We start with the heading. The title of Galileo’s book is Sidereus Nuncius and there one really shouldn’t shorten Nuncius to Nunc, as this actually changes the meaning from message or messenger to now! Also, it is Sidereus not Siderius!

Addendum: A reader on Twitter, more observant than I, has pointed out, correctly, that 1609 and 1610 are in the seventeenth century and not the sixteenth century as stated by Clemens.

In the first line Clemens writes: Galileo’s first printed images of the moon, the first ever made with the benefit of the telescope. I shall be generous and assume that with this ambiguous phrase he means the first ever printed images made with the benefit of the telescope. If, however, he meant the first ever images made with the benefit of the telescope, then he would be wrong, as that honour goes to Thomas Harriot.

The real hammer comes in the next sentence, where he writes:

For the first time, most Europeans were shown the dark side of the moon.

The first time I read this, I did a double take: could a curator of the Beinecke really have written something that mind-bogglingly stupid? By definition the dark side of the moon is the side of the moon that can never be seen from the earth. The first images of it were made, not by Galileo in 1609, after all how could he, but by the Soviet Luna 3 space probe in 1959, 350 years later.

The problems don’t end here, he writes:

Galileo’s sketches also emphasize its barren and rocky nature—well known to us today, but something of a revelation in the sixteenth century, when most people thought of the moon as another planet, thus generating its own light.

In the geocentric system the moon was indeed regarded as one of the seven planets, but in the heliocentric system, which Galileo promoted, it had become a satellite of the earth and was no longer considered a planet. There was a long and complicated discussion throughout the history of astronomy as to whether the planets generated their own light or not. However, within Western astronomy there was a fairly clear consensus that the moon reflected sunlight rather than generating its own light. A brief sketch of the history of this knowledge starts with Anaxagoras (d. 428 BCE). The great Islamic polymath Ibn al-Haytham (965–1039) clearly argued that the moon reflected sunlight. In the century before Galileo, Leonardo (1452–1519) in his moon studies clearly stated that the moon was illuminated by reflected sunlight. However, he never published.

Maybe Clemens is confusing this with the first recognition of the true cause of earthshine, the faint light reflected from the earth that makes the whole moon visible during the first crescent, a recognition that is often falsely attributed to Galileo. However, here the laurels go to Leonardo, who, as always, didn’t publish. The first published correct account was given by Michael Mästlin (1550–1631).

 Clemens’ next statement appears to me to be simply bizarre:

Galileo was the first person to accurately depict the moons of Jupiter (which he called “Medicean stars,” after his patron, the Florentine Medici family).

Galileo was the first to discover the moons of Jupiter, just one day ahead of Simon Marius, but to state that he accurately depicted them is somewhat more than an exaggeration. For Galileo and Marius, the moons of Jupiter were small points of light in the sky, the positions of which they recorded as ink dots on a piece of paper. To call this an accurate depiction is a joke.

Somehow, I expect a higher standard of public information from the Beinecke Library, one of the world’s leading rare book depositories. 

Addendum 18:30 CEST: The post on the Beinecke blog that this post refers to has now been heavily edited. Everything I criticised has been either removed or corrected but without acknowledgement anywhere!

Renaissance Mathematicus 1 Beinecke Library 0!

6 Comments

Filed under History of Astronomy, Myths of Science

13

Today the Renaissance Mathematicus officially became a teenager, although I think it’s been one since it first emerged into the digital world thirteen years ago, snotty-nosed, stroppy, belligerent, argumentative, anti-authority, whilst at the same time oscillating between bursting with energy and sloth-like behaviour. Did I mention self-opinionated and convinced it knows better than everybody else?

Thirteen is, in the Germanic languages, the first number with a compound name, three plus ten, eleven and twelve having single names. It is the sixth prime number and the second two-digit prime, forming a twin prime with eleven, the first two-digit prime.

In some countries, including the UK and the USA, thirteen is considered an unlucky number, with people going as far as not having a thirteenth floor in a building or a room 13 in a hotel. This superstition has been given the wonderful name Triskaidekaphobia, from the Ancient Greek treiskaídeka for thirteen and phóbos meaning fear. There are various attempts to explain the historical origins of this phobia, but none of them can actually be substantiated. Friday 13th is considered particularly unlucky in these cultures and has the equally splendid name paraskevidekatriaphobia, from the Greek Paraskevi for Friday, treiskaídeka for thirteen, and phóbos meaning fear. In the Gregorian calendar, Friday 13th occurs at least once every year and can occur up to three times. Although there is evidence of both Friday and thirteen being considered unlucky, the earliest reference to Friday 13th as unlucky is in the nineteenth century. Once again, the origin of the superstition is a matter of speculation.

One common occurrence of the number thirteen in the English language is the baker’s dozen. Whereas a dozen is a group of twelve, a baker’s dozen is a group of thirteen. The term dates back to the fifteenth century and refers to the habit of bakers selling their wares in units of thirteen rather than the twelve the law required. As bakers could be fined for selling their wares underweight, it is thought that they included an extra item to avoid the risk of a fine.

As usual the Renaissance Mathematicus blog anniversary is an occasion for reflection, looking inward and questioning, a period of introspection. Why do I do this at all? What is my motivation? What do I hope to achieve? 

I’ve actually been thinking about these questions for some time now. I am a self-confessed music junkie and have spent a large part of my life working as a very small cog in the music business, as a stagehand, club live sound man, jazz club manager, and chief cook and bottle washer. I also possess an obscenely large album collection, which I relativise by pointing out that other music junkies I know have much larger collections. One of my favourite rock guitarists is Robert Fripp, the genius behind King Crimson. Fripp is very philosophical for a rock musician and one of his sayings is, “don’t become a professional musician unless you can’t do anything else.” This statement is of course ambiguous. It could mean that you are physically or mentally incapable of doing anything else, or on the other hand that you are so obsessed that nothing else comes into question.

I prefer the second interpretation and it neatly sums up my relationship to history in general and the history of science in particular. I have been addicted to history for as long as I can remember, history in general, history of mathematics, history of science, history of food… Whatever else I’ve done in my life, I’ve always studied history, simply because. However, as I have revealed in the past, I am an AD(H)Dler and this means I tend to get easily distracted in my studies, research, and readings. Oh look, there’s another aspect I could follow up over there, and isn’t this fact interesting, maybe I could find out something about that! This means I have a strong tendency never to get anything finished, because there are always twenty other different pathways I want to go down first. Forcing myself to write a weekly blog post helps me to stay focused, to concentrate, and to get at least one thing finished. When I’m not writing blog posts my mind still wanders off in twenty different directions at once, but that’s OK; that’s how I come up with new topics for blog posts.

All of the above basically covers the first two of my questions, the why and the motivation, and there isn’t really any other explanation. This still leaves the third question open: what do I hope to achieve? I don’t really have a general answer to this. I don’t actually think I want to achieve anything in particular. Initially, as I have said in the past, I wanted to teach myself to write, and I think I fulfilled that aim some time ago. I wrote my The emergence of modern astronomy – a complex mosaic series to prove to myself that, if I wrote in slices, I could write a book. Another aim that I think I successfully fulfilled. I might even get around to turning it into a proper book manuscript and trying to find a publisher this summer! The Renaissance Science series was just: you’ve written one long series, what could you write a second one about?

On the whole I try not to think about potential readers but to write just for myself. This is a safety mechanism to stop me putting myself under any sort of pressure: will I fulfil my readers’ expectations? Of course, I’m happy that people do read my scribblings and some of them even appear to enjoy them. Truth be told, the actual number of people who regularly read this blog scares me somewhat, in particular the successful professional historians of science, who I know do so. Imposter syndrome, what moi? As I have been known to say on occasion, even my imposter syndrome has imposter syndrome. One very concrete thing that I have aimed to achieve with my scribblings since the day I started this blog is to try and clear away at least some of the myths that plague the popular perception of the history of science. It’s a Sisyphean task, but it helps to keep me motivated and focused.

Having mentioned my readers, I will close this anniversary post by saying I’m grateful for every person who takes the time to read my weekly outpourings, and I hope they gain something for the time taken. I’m also grateful to all those who take the time to provide feedback through comments. I thank all of you, both readers and commenters, and hope you stay on board for the next twelve months.

7 Comments

Filed under Autobiographical, Myths of Science

Rants, Rage, Rudeness, and Respect

A man that I’ve never come across before, Brett Hall, has taken me to task in what he terms a newsletter on YouTube for being rude to Neil deGrasse Tyson. Before somebody drew my attention to his comments, I had absolutely no idea who or what Brett Hall was. It appears he is an Australian, who, it seems, studied about seventeen degrees, I might be exaggerating somewhat, I lost count somewhere down the line in his litany of all the wonderful things he had studied. Anyway, if I understand him correctly, he now regards himself as a science communicator and has a podcast where he explicates and propagates the philosophies of Karl Popper and David Deutsch. He also has a blog and, apparently, has recently added a newsletter, in the first edition of which he chose to criticise me.

I am well acquainted with the works of Karl Raimund Popper, he being one of my first two introductions to the philosophies of mathematics and science, the other being Stephan Körner. I read my first philosophy of science books by both of them in the same week many, many moons ago. I read a large amount of Popper’s oeuvre and a decade later studied him at university. Popper led me to Imre Lakatos, the biggest influence on my personal intellectual development.

I must admit, because I gave up trying to keep up with all the developments in modern physics quite some time ago, that until about two weeks ago I had never heard of David Deutsch. So that you don’t have to go look, he’s a big name in quantum physics and especially in the theory of quantum computing. Purely by chance, the German news magazine, Der Spiegel, had a long interview with him a couple of weeks ago about his views on epistemology and what he sees as the correct approach to the future and development of scientific thinking. Mr Hall will probably come down on me like a ton of bricks for saying this but, for me, it came across as fairly vacuous, a lot of waffle and pie in the sky. But I’m probably just too stupid to understand the great maestro!  

But back to Mr Hall and good old Neil deGrasse Tyson. Mr Hall bemoaned what he saw as increasing rudeness in debate in the Internet age, a common and widely spread trope, and cited my latest diatribe against NdGT as an example, misquoting the title of my piece, claiming that I had said that Tyson “knows nothing”, whereas I in fact wrote “knows nothing about nothing”, a wordplay on Tyson’s topic, the history of zero. There is a substantial difference between the two statements. He then went on to quote correctly that I accused Tyson of “spouting crap.” Strangely, Mr Hall calls me a science historian, whereas the correct term is historian of science. There is a whole debate within the discipline as to why it’s the latter and not the former. Even more bizarrely, he states that he is not going to name me and then provides a link to the post on my blog that of course contains my name! I have no problems in being named, I’m old enough and ugly enough to defend myself against all comers.

Mr Hall goes on to explain that he also does not always agree with the theories of NdGT, but that there is no reason not to treat him with respect when stating your disagreement. I have no objection to this statement; however, it misses the point entirely. NdGT is not stating a theory in astrophysics, which is, or rather was, his academic discipline. If he had been, I almost certainly would not have commented in any way whatsoever, as I’m not an astrophysicist and so not qualified to pass judgement. No, NdGT was doing something entirely different. On a commercial podcast, for which, given his popularity, he is almost certainly extremely well paid, he was mouthing off extemporaneously about the history of mathematics, a topic about which he very obviously knows very little. He was, as I put it, and there really is no polite way to express it, spouting crap, with all the assurance and authority that his prominent public persona gives him. He was literally lying to his listeners, who, I assume, mostly not knowing better, believe the pearls of wisdom that drip from his lips. That is a serious abuse of his status and of his listeners and deserves no respect whatsoever.

I would also point out that he is a serial offender and regularly delivers totally ignorant speeches about the history of science and/or mathematics. For example, he regularly repeats, with emphasis, that Newton invented calculus in a couple of weeks, on a dare, which, not to put too fine a point on it, is total codswallop. Newton developed his contribution to the evolution of calculus over several years, having first read, studied, and digested the work of Descartes, Fermat, Wallis, and Barrow. One can point these things out to NdGT, but he simply ignores them and carries on blithely spreading the same tired old falsehoods. He long ago wilfully squandered any right to be treated with respect when talking about the history of science and/or mathematics.

Returning to Brett Hall’s basic thesis that academics have jettisoned common decency, politeness, and good manners in the computer age as a result of social media: he expounds on this for the whole of his newsletter, claiming that this behaviour from academics puts young people off entering academia to study the sciences. Like NdGT, Mr Hall appears to have very little knowledge of the history of science. Academics/scholars/scientists, or whatever you want to call them, have been slagging each other off, both publicly and privately, since the first Egyptians put brush to papyrus and the first Babylonians wedge to clay.

Just to take the era in which I claim the most expertise, the emergence of modern astronomy in the Early Modern Period: the two Imperial Mathematici, Tycho Brahe and Nicolaus Reimers Baer, laid into each other in a way that makes the HISTSCI_HULK look like a cuddly kitten. Half a generation later the next generation, Kepler and Longomontanus, attacked each other with slightly fewer expletives, but just as much virulence. Galileo laid into anybody and everybody he perceived as his enemies, and there were many, with invective that would cause a drunken sailor to blush. Moving to the other end of the seventeenth century: Isaac Newton, Lucasian Professor, treated John Flamsteed, Astronomer Royal, like a doormat. In turn, Flamsteed refused to even utter the name of Edmond Halley, the Savilian Professor of Geometry. Newton and Robert Hooke, curator of experiments at the Royal Society, abused each other like a couple of fishwives. Hooke had blazing public rows with virtually every notable scientist in Europe. You get the picture?

In case Mr Hall should argue that modern academics weren’t like that before the advent of the Internet, I could entertain him for hours with anecdotes about the invectives that leading academic archaeologists launched at each other in the early 1970s. One stated that an excavation report by another was about as useful as a mid-Victorian museum guide. The offended party then opened legal proceedings for libel but withdrew them when the offender expressed joy at the prospect of being able to prove his statement under oath in a court of law. I could go on but…

Let us return to myself and my alter ego the HISTSCI_HULK, why do I launch my notorious rants? 

One of my favourite musicians, Robert Fripp, says that one shouldn’t become a professional musician unless one can’t do anything else. This statement is, to say the least, ambiguous. It could mean you lack the ability to do something else, or that the compulsion to create music is so great that nothing else comes into question. I have always assumed he intended the second meaning, and this is exactly why I’m a historian of science. The fascination with numbers, number systems, and their origins started very early, when I was about five years old at most, and has simply grown ever since. I can’t explain rationally why I’m fascinated, intrigued, even obsessed by the history of science, I simply am. I have a compulsion to investigate, discover, and learn about the history of science so great that nothing else comes into question.

On a personal level I have always been taught, more by example than anything else, that if one is going to do something, then one should learn to do it properly and then do so. I am by nature a pedant, and I don’t regard pedantry as bad, and a perfectionist. Over the years I have had the good fortune to meet and learn from several excellent teachers, who have helped me to channel that pedantry and perfectionism into my studies and not to accept anything but the best possible.

The history of science is very much a niche discipline within the academic hierarchy and has to battle constantly to justify its existence. There have been and are many excellent historians of science, many of whose books line the walls of my humble abode and nourish my unquenchable thirst for a depth of understanding in the history of science. As I have documented elsewhere, I have a multiple addictive personality and my greatest addiction is without doubt the history of science.

The commercial world of books and television is not interested in the complex and difficult web that is the real history of science, but pop history of science sells well, so they commission not historians of science but scientists to produce pop books and television programmes about the history of science. I mean, after all, they are scientists, so they must know about the history of their discipline. The results are all too often a disaster. There are exceptions, my friend Matthew Cobb is a professional scientist, who also writes excellent history of science books, several of which adorn my bookshelves. However, the majority of popular history of science books and television programmes are badly researched, shallow perpetuators of myths and inaccuracies–in the Middle Ages the Church opposed science and people believed the world was flat, Newton had an Annus mirabilis and created calculus, and modern optics, physics, and astronomy all in one year during the plague, Galileo was persecuted by the Church because he proved that the Earth goes around the Sun, which contradicted the Bible, Ada Lovelace created computer science, and, and, and… A classic example was the original Cosmos television programme from Carl Sagan, in which his presentation of the history of astronomy and cosmology was a total and utter cluster fuck, which influenced his tens of millions of viewers in a very bad way. Whenever I say this on the Internet, I get screamed at by Sagan groupies.

Because I love and live for the discipline, the abuse that it suffers at the hands of these popularisers hurts my soul and sends me into a rage, causing the HISTSCI_HULK to emerge and go on a rampage. One of the reasons that I do this is that established historians of science are very reluctant to subject these perversions of their discipline to public review. Somehow, they seem to think it is beneath them to engage and point out that the product in question is so much bovine manure. Nobody pays me to be a historian of science, I have no position, no status, and no academic reputation to lose, so I weigh in with all guns blazing and say what I really think. I have a message for Mr Hall and anybody else who feels offended by my approach: nobody says you have to read it!

19 Comments

Filed under Autobiographical, Myths of Science

NIL deGrasse Tyson knows nothing about nothing

They are back! Neil deGrasse Tyson is once again spouting total crap about the history of mathematics and has managed to stir the HISTSCI_HULK back into butt-kicking action. The offending object that provoked the HISTSCI_HULK’s ire is a Star Talk video on YouTube entitled Neil deGrasse Tyson Explains Zero. The HISTSCI_HULK thinks that the title should read Neil deGrasse Tyson is a Zero!

You simply won’t believe the pearls of wisdom that NdGT spews out for the 1.75 million Star Talk subscribers in a video that has been viewed more than one hundred thousand times. If there ever was a candidate in #histSCI for cancellation, then NdGT is the man.

 Before we deal with NdGT’s inanities, we need some basic information on number systems. Our everyday Hindu-Arabic number system is a decimal, that’s base ten, place value number system, which means that the value of a number symbol is dependent on its place within the number. An example:

If we take the number 513, it is actually:

5 × 10² + 1 × 10¹ + 3 × 10⁰

A quick reminder for those who have forgotten their school maths: any number to the power of zero is 1. Moving from right to left, each new place represents the next higher power of ten, 10⁰, 10¹, 10², 10³, 10⁴, 10⁵, etc., etc. As we will see, the Babylonians [as usual, I’m being lazy and using Babylonian as shorthand for all the cultures that occupied the Fertile Crescent and used Cuneiform numbers] also had a place value number system, but it was sexagesimal, that’s base sixty, not base ten. It is a place value number system that requires a zero to indicate an empty place. There are in fact two types of zero. The first is simply a placeholder to indicate that this place in the number is empty. The second is the number zero, that which occurs when you subtract a number from itself.
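Since we are talking about the same digits in two different bases, here is a minimal sketch in Python of the arithmetic being described; the function name is my own, purely for illustration.

```python
def place_value(digits, base):
    """Evaluate a list of digits (most significant first) in the given base."""
    value = 0
    for d in digits:
        value = value * base + d  # each step shifts everything one place to the left
    return value

print(place_value([5, 1, 3], 10))  # 513   = 5 x 10**2 + 1 x 10**1 + 3 x 10**0
print(place_value([5, 1, 3], 60))  # 18063 = 5 x 60**2 + 1 x 60**1 + 3 x 60**0
```

The same three digits give 513 in base ten but 18063 in base sixty, which is the whole point of a place value system: the place, not just the shape of the symbol, carries the value.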

Now on to the horror that is NdGT’s attempt to tell us the history of zero:

HISTSCI_HULK: Not suitable for those who care about the history of maths

 NdGT: I pick these based on how familiar we think we are about the subject and then throw in some things you never knew

HISTSCI_HULK: All NildGT throws in, in this video, is the contents of the garbage pail he calls a brain.

NdGT: For this segment, we’re gonna talk about zero … so zero is a number, but it wasn’t always a number. In fact, no one even imagined how to imagine it, why would you? What were numbers for?

Chuck Nice, Star Talk Host: Right, who counts nothing?

NdGT: Right, numbers are for counting … nobody had any use to count zero … For most of civilisation this was the case. Even through the Roman Empire…

Here NdGT fails to distinguish between ordinal numbers, which label the place that objects take in a list, and cardinal numbers, which tell us how many things are in a collection or set. A distinction that at one point later will prove crucial.

HISTSCI_HULK: When it comes to the history of mathematics NildGT is a nothing

CN: They were so sophisticated their numbers were letters!

In this supposedly witty remark, we have a very popular misconception. Roman numerals were not actually letters, although in later mutated forms they came to resemble letters. Roman numbers are collections of strokes. One stroke for one, two strokes for two, and so on. To save space and effort, groups of strokes are bundled under a new symbol. The symbol for ten was a crossed or struck-out stroke that mutated into an X; the symbol for five, half of ten, was the top half of this X, which mutated into a V; originally, they used the bottom half, an inverted V. The original symbol for fifty was ↓, which mutated into an L, and so on. As the Roman number system is not a place value number system, it doesn’t require a placeholder symbol for zero. If Romans wanted to express total absence, they did so in words not numbers, nulla meaning none. This was first used in a mathematical context in the Early Middle Ages, often simply abbreviated to N.
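To make the contrast with a place value system concrete, here is a small Python sketch of the stroke-bundling logic just described; it is my own illustration of the principle, not a historical algorithm, and it uses the older additive style (IIII rather than IV).

```python
# The stroke bundles described above, largest first (additive style).
ROMAN_BUNDLES = [(1000, "M"), (500, "D"), (100, "C"),
                 (50, "L"), (10, "X"), (5, "V"), (1, "I")]

def to_roman(n):
    """Write a natural number (n >= 1) as an additive Roman numeral."""
    parts = []
    for value, symbol in ROMAN_BUNDLES:
        while n >= value:  # greedily bundle strokes into the bigger symbols
            parts.append(symbol)
            n -= value
    return "".join(parts)

print(to_roman(1610))  # MDCX
print(to_roman(4))     # IIII -- every natural number works, no zero needed
```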

NdGT: [Some childish jokes about Roman numerals] … I don’t know if you’ve ever thought about this Chuck, you can’t write zero with Roman numerals. There is no symbol for zero.

The Roman number system is not a place value number system but a stroke counting system that can express any natural number, that’s the simple counting numbers, without the need for a zero. The ancient Egyptian number system was also a stroke counting system, whilst the ancient Greeks used an alpha-numerical system, in which letters do represent the numerals, that also doesn’t require a zero to express the natural numbers.

NdGT: It’s not that they didn’t come up with it, it’s the concept of zero was not yet invented. 

HISTSCI_HULK: I wish NildGT had not been invented yet

This is actually a much more complicated statement than it at first appears. It is true that, as far as we know, the concept of zero as a number had indeed not been invented yet. However, the verbal concept of having none of something had already existed linguistically for millennia. Imaginary conversation: “Can I have five of your flint arrowheads?” “Sorry, I can’t help you, I don’t have any at the moment. Somebody came by and took my entire stock this morning.”

Although the Egyptian base ten stroke numeral system had no zero, by about 1700 BCE, they were using a symbol for zero in accounting texts. Interestingly, they also used the same symbol to indicate ground level in architectural drawings in much the same way that zero is used to indicate the ground floor in European elevators. 

Also, the placeholder zero did exist during the time of the Roman Empire. The Babylonian sexagesimal number system emerged in the third millennium BCE and initially did not have a zero of any sort. This meant that the number 23 (I’m using Hindu-Arabic numerals to save the bother of trying to format Babylonian ones) could be both 2 × 60¹ + 3 × 60⁰ = 123 in decimal, or 2 × 60² + 3 × 60⁰ = 7203 in decimal. They apparently relied on context to know which was correct. By about 700 BCE the first placeholder zero appeared in the system and by about 300 BCE placeholder zeros had become standard.
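The ambiguity is easy to see if one spells the two readings out; a minimal sketch, with the digits written in Hindu-Arabic numerals as above:

```python
def sexagesimal(digits):
    """Evaluate sexagesimal digits (most significant first) in decimal."""
    value = 0
    for d in digits:
        value = value * 60 + d
    return value

print(sexagesimal([2, 3]))     # 2 x 60 + 3 = 123
print(sexagesimal([2, 0, 3]))  # 2 x 60**2 + 0 x 60 + 3 = 7203
```

Without a placeholder, both readings are written with the same two Babylonian digit groups; the explicit 0 in the second call is exactly what the placeholder zero supplies.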

During the Roman Empire, the astronomer Ptolemaeus published his Mathēmatikē Syntaxis, better known as the Almagest, around 150 CE, which used a weird number system. The whole number parts of numbers were written in a base-ten system in Greek alphanumerical symbols, whereas the fractional parts were written in the Babylonian sexagesimal number system, with the same symbols, with a placeholder zero in the form of a small circle, ō.

HISTSCI_HULK NildGT now takes off into calendrical fantasy land.

NdGT: So, when they made the Julian calendar, that’s the one that has a leap day every four years, … That calendar … that anchored its starter date on the birth of Jesus, so this obviously came later after Constantine, I think that Constantine brought Christianity to the Roman Empire. So, in the Julian calendar they went from 1 BC, BC, of course, stands for before Christ, to AD 1, and AD is in Latin, Anno Domini the year of our Lord 1, and there was no year zero in that transition. So, when would Jesus have been born? In the mythical year between the two? He can’t be born in AD 1 cause that’s after and he can’t be born in 1 BC, because that’s before, so that’s an issue.

CN: I’ve got the answer, it’s a miracle.

The Julian calendar was of course introduced by Julius Caesar in AUC 708 (AUC is the number of years since the theoretical founding date of Rome) or, as we now express it, in 44 BCE. The Romans didn’t really have a continuous dating system, dating things by the year of the reign of an emperor. Constantine did not bring Christianity to the Roman Empire, he legalised it. Both Jesus and Christianity were born in Judea, a province of the Roman Empire, so Christianity was there from its very beginnings. For more on Constantine and Christianity, I recommend Tim O’Neill’s excellent History for Atheists blog.

To quote myself from another blog post criticising NdGT’s take on the Gregorian calendar:

The use of Anno Domini goes back to Dionysius Exiguus (Dennis the Short) in the sixth century CE in his attempt to produce an accurate system to determine the date of Easter. He introduced it to replace the use of the era of Diocletian used in the Alexandrian method of calculating Easter, because Diocletian was notorious for having persecuted the Christians. Dionysius’ system found very little resonance until the Venerable Bede used it in the eighth century CE in his Ecclesiastical History of the English People. Bede’s popularity as a historian and teacher led to the gradual acceptance of the AD convention. BC, created in analogy to the AD convention, didn’t come into common usage until the late seventeenth century CE. [Although BC does occur occasionally in late medieval chronicles.]

As NdGT says, Anno Domini translates as The Year of Our Lord, so Jesus was born in AD 1, the first year of our Lord; simple, isn’t it?

I wrote a whole blog post about why you can’t have a year zero, but I’ll give an abbreviated version here. Although we speak them as cardinal numbers, year numbers are actually ordinal numbers, so 2022 is the two thousand and twenty-second year of the Common Era. You can’t have a zeroth member of a list. The year zero is literally a contradiction in terms; it means the year that doesn’t exist.
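For those who like the point spelt out, astronomers do in fact use a separate convention, astronomical year numbering, in which 1 BCE is counted as year 0 and 2 BCE as year −1, precisely to make arithmetic across the era boundary work. A hedged sketch; the helper is my own illustration, not a standard library function:

```python
def to_astronomical(year, era):
    """Map a (year, "BCE"/"CE") label onto the astronomers' continuous count."""
    if year < 1:
        raise ValueError("there is no year zero in the BCE/CE convention")
    return year if era == "CE" else 1 - year  # 1 BCE -> 0, 2 BCE -> -1

print(to_astronomical(1, "CE"))   # 1
print(to_astronomical(1, "BCE"))  # 0 -- the "missing" year exists only in this convention
```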

HISTSCI_HULK You can’t count on NilDGT

NdGT: So now, move time forward. Going, it was in the six hundreds, seven hundreds, I’ve forgotten exactly when. In India, there were great advances in mathematics there and they even developed the numerals, early versions of the numerals we now use, rather than Roman numerals. Roman numerals were letters [no they weren’t, see above], these were now symbolic shapes that would then represent the numbers. In this effort was the hint that maybe you might want a zero in there. So, we’re crawling now before we can walk, but the seeds are planted. 

We have a fundamental problem dating developments in Hindu mathematics because the writing materials they used don’t survive well, unlike the Babylonian clay tablets. The decimal place value number system emerged some time between the first and fourth centuries CE. The symbols used in this system evolved over a long period and the process is too complex to deal with here. 

The earliest known reference to a placeholder zero in Indian mathematics can be found throughout a commercial arithmetic text written on birch bark, the Bakhshali manuscript, the dating of which is very problematical and is somewhere between the third and seventh centuries CE. 

The Aryasiddhanta, a mathematical and astronomical work by Āryabhaṭa (476–550 CE), uses a decimal place value number system but written with alphanumerical symbols and without a zero. The Āryabhaṭīyabhāṣya, another mathematical and astronomical work, by Bhāskara I (c. 600–c. 680 CE), uses a decimal place value number system with early Hindu numerals and a zero. With the Brāhmasphuṭasiddhānta, an astronomical work in twenty-four chapters, two of them on mathematics, by Brahmagupta (c. 598–c. 668 CE), we arrive at our goal. Brahmagupta gives a complete set of rules for addition, subtraction, multiplication, and division for positive and negative numbers, as well as for zero as a number. The only difference between his presentation and one that one might find in a modern elementary arithmetic text is that Brahmagupta tried to define division by zero, which, as we all learnt in school, is not defined, didn’t we? Far from being a “hint that maybe you might want a zero in there”, this was the real deal.
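To underline how complete Brahmagupta’s rules were, here is a small sketch of them in modern terms; the function is my own illustration, not anything from Brahmagupta’s text.

```python
def brahmagupta_divide(a, b):
    """Division as we teach it today: defined for every divisor except zero."""
    if b == 0:
        raise ZeroDivisionError("division by zero is undefined")
    return a / b

print(5 + 0, 5 - 0, 5 * 0)  # 5 5 0 -- Brahmagupta's rules for zero, unchanged today

try:
    brahmagupta_divide(5, 0)
except ZeroDivisionError as err:
    print(err)  # the one place where his rules and ours part company
```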

HISTSCI_HULK: NildGT would be in serious trouble with the Hindu Nationalist propagators of Hindu science if they found out about his garbage take on the history of Hindu mathematics.

NdGT: These [sic] new mathematics worked their way to the Middle East. Baghdad specifically, a big trading post from all corners of Europe and Asia, and Africa and there it was. Ideas were put across the table. This was the Golden Age of Islam, major advances were made in all…in engineering, in astronomy, in biology, physiology, and vision. The discovery that vision is a passive phenomenon not active. So, all of this is going on and zero was perfected. They called those numerals Hindu numerals; we today call them Arabic numerals. 

What NdGT doesn’t point out is that the Golden Age of Islam lasted from about 700 to 1600 CE and took place in many centres, not just in Baghdad. The Brāhmasphuṭasiddhānta was translated into Arabic by Ibrahim ibn Habib ibn Sulayman ibn Samura ibn Jundab al-Fazri (d. 777 CE), Muhammad ibn Ibrahim ibn Habib ibn Sulayman ibn Samura ibn Jundab al-Fazri (d. c. 800 CE), and Yaʿqūb ibn Ṭāriq (d. c. 796 CE) in about 770 CE. This meant that Islamicate[1] mathematical scientists had a fully formed correct theory of zero and negative numbers from this point on. They didn’t develop it, they inherited it.

Today, people refer to the numerals as Hindu-Arabic numerals!

NdGt: So, this is the full tracking because in the Middle East algebra rose up, the entire arithmetic and algebra rose up invoking zero and you have negative numbers, so mathematics is off to the races. Algebra is one of the very common words in English that has its roots in Arabic. A lot of the a-l words, a-l is ‘the’ in Arabic as I understand it. So, algebra, algorithm, alcohol these are all traceable to that period. … So, I’m saying just consider how late zero came in civilisation. The Egyptian knew nothing of zero [not true, see above]. 

The Persian mathematician Muḥammad ibn Mūsā al-Khwārizmī (c. 780–c. 850) wrote a book on the Hindu numeral system of which no Arabic text is known, but a Latin translation, Algoritmi de Numero Indorum, was made in the twelfth century. The word algorithm derives from Algoritmi, the Latin transliteration of the name al-Khwārizmī. He wrote a second book, al-Kitāb al-Mukhtaṣar fī Ḥisāb al-Jabr wal-Muqābalah (c. 820), the translation of the title being The Compendious Book on Calculation by Completion and Balancing. The term al-Jabr, meaning completion or setting together, became the English algebra.

The first time I heard this section I did a double take. “The entire arithmetic and algebra rose up invoking zero and you have negative numbers, so mathematics is off to the races”, you what! Ancient cultures had been doing arithmetic since at least three thousand years BCE and probably much earlier. I can’t do a complete history of algebra in this blog post, but by the early second millennium BCE the Babylonians could solve linear equations and had the general solution to quadratic equations, though only for positive solutions, as they didn’t have a concept of negative numbers. They also could and did solve some cubic equations. In the middle of the first millennium BCE they had astronomical algorithms to predict planetary orbits, as well as lunar and solar eclipses. Brahmagupta’s work includes the general solution of linear equations, and the full general solution of quadratic equations, as we still teach it today. NdGT’s statement is total rubbish.

Of historical interest is the fact that, although Islamicate mathematical scientists acquired negative numbers from Brahmagupta, they mostly didn’t use them, regarding them with scepticism.

HISTSCI_HULK: NildGT is off with the fairies

CN: What is this that I hear about the Mayans and zero?

NdGT: I don’t fully know my Mayan history other than that they really worshipped Venus, so their calendar was Venus based. The calendar in ancient Egypt was based on the star Sirius [something unintelligible about new year]. It’s completely arbitrary when you say the new year’s just began. Pick a date whatever matters in your culture and call it new year. Even today when is the Chinese New Year, it’s late January, February. Everybody’s got a different starter date.

The Mayan culture developed a vigesimal, base twenty, place value number system, which included a placeholder zero, independent of the developments in the Middle East and India. The Dresden Codex, one of the most important Maya written documents, contains a mixture of astronomy, astrology, and religion, in which observations of Venus play a central role. The first day of the Chinese New Year falls on the new moon that appears between 21 January and 20 February.

HISTSCI_HULK: I’d worship Venus, she was a very beautiful lady

CN: The Jewish New Year is another new year that…

NdGT: Everybody’s got another new year. The academic calendar’s got a new year that’s September the first…

I assume that NdGT is referring to the US American academic calendar; other countries have different academic years. In Germany, where I live, each German state has a different academic year, in order to avoid the entire population driving off on their summer holidays at the same time.

NdGT: …and by the way one quick question you’ve got a hundred dollars in your bank account, and you go and withdraw a hundred dollars from the cash machine and the bank tells you what?

[…]

So, here’s the thing, you have no money left in the bank and that’s bad, but what worse is to have negative money in the bank and so this whole concept of negative numbers arose and made complete sense once you pass through zero. Now instead of something coming your way, you now owe it. The mathematics began to mirror commerce and the needs of civilisation, as we move forward, because we are doing much more than just counting. 

CN: So, this is like the birth of modern accounting. Once you find zero that’s when you’re actually able to have a ledger that shows you minuses and pluses and all that kind of stuff.

One doesn’t need negative numbers in order to do accounting. In fact, the most commonly used form of accounting, double entry bookkeeping, doesn’t use negative numbers; credits and debits are both entered with positive numbers. 
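A deliberately simplified, single-account sketch of the point, my own illustration rather than proper double entry bookkeeping: debits and credits sit in separate columns, every entry is positive, and the balance comes from comparing the two column totals.

```python
# Two columns, all entries positive; no negative numbers anywhere.
ledger = [
    {"description": "opening deposit", "debit": 0, "credit": 100},
    {"description": "cash withdrawal", "debit": 100, "credit": 0},
]

debits = sum(entry["debit"] for entry in ledger)
credits = sum(entry["credit"] for entry in ledger)
print(f"debits {debits}, credits {credits}, balance {credits - debits}")
```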

Number systems and arithmetic mostly have their origin in accounting. The Babylonians developed their mathematics in order to do the state’s financial accounting.

HISTSCI_HULK: There’s no accounting for the stupidity in this podcast

NdGT: So now we’re into negatives and this keeps going with math and you find other needs of culture and civilisation, where whole other branches of math have to be developed and we got trigonometry. All those branches of math where you thought the teacher was just being angry with you giving you these assignments, entire branches of math zero started it all. Where it gives you deeper insights into the operations of nature. 

I said I did a double take when NdGT claimed that arithmetic and algebra first took off when the Islamic mathematicians developed zero and negative numbers, which of course they didn’t, but his next claim completely blew my mind. “So now we’re into negatives and this keeps going with math and you find other needs of culture and civilisation, where whole other branches of math have to be developed and we got trigonometry.” I can hear Hipparchus of Nicaea (c. 190–c. 120 BCE), who is credited with being the first to develop trigonometry, revolving violently in his grave.

HISTSCI_HULK: I could recommend some good books on the history of trigonometry, do you think NildGT can read?

There is another aspect to the whole history of zero that NdGT doesn’t touch on, and which often gets ignored in other, more serious sources. The ancient cultures that didn’t develop a place value number system didn’t actually need zero. Almost all people in those cultures who needed to do, and did in fact do, arithmetical calculations didn’t do their calculations by writing them out step by step, as we all learnt to do in school; they did them using the oldest analogue computer, the abacus or counting board. The counting board was the main means of doing arithmetical calculations from some time a couple of thousand years BCE, we don’t know exactly when, all the way down to the sixteenth century CE. An experienced and skilled user of the counting board could add, subtract, multiply, divide, and even extract square roots much faster than you or I could do the same calculations with paper and pencil.

The lines or columns on a counting board represent the ascending powers of ten in a decimal place value number system, or the powers of sixty on a Babylonian counting board. During a calculation, an empty line or column represents an implicit zero. In fact, there is one speculative theory that realising this led someone to make that zero explicit when writing out the results of a calculation, and that is how the zero came into existence. Normally, when using a counting board, only the initial problem and the result are recorded in writing, and if one is using a stroke-collection number system (ancient Romans and Egyptians) or an alphanumerical one (ancient Greeks, as well as ancient Indian and Arabic cultures before they adopted Hindu numerals), then, as already noted above, you don’t need a zero to express any number.
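The implicit zero of an empty line is easy to model; a minimal sketch, my own illustration of the idea rather than any historical procedure:

```python
def board_value(columns, base=10):
    """columns[i] = counters on the i-th line of the board (units first)."""
    return sum(counters * base**i for i, counters in enumerate(columns))

# 507 on a decimal board: 7 counters on the units line, an EMPTY tens
# line (the implicit zero), and 5 counters on the hundreds line.
print(board_value([7, 0, 5]))  # 507
```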

This blog post is already far too long, but before I close, a personal statement. I am baffled as to why a supposedly intelligent and highly educated individual such as Neil deGrasse Tyson chooses to pontificate publicly, to a large international audience, on a topic that he very obviously knows very little about, without taking the trouble to actually learn something about the topic before he does so. Maybe the fact that the podcast is heavily sponsored and littered with commercial advertising is the explanation. He’s just doing it for the money.

His doing so is an insult to his listeners, who, thinking he is some sort of expert, believe the half-digested mixture of half-remembered half-facts and made-up rubbish that he spews out. It is also a massive insult to all the historians of mathematics, who have spent their lives finding, translating, and analysing the original documents in order to reconstruct the real history.

HISTSCI_HULK: If I were a teacher and he had handed this in as an essay, I wouldn’t give him an F, I would give it back to him, tell him to burn it, and give him a big fat ZERO!


[1] Islamicate is the preferred adjective used by historians for mathematics and science produced under Islamic hegemony and published mostly in Arabic. It is used to reflect the fact that those producing it were by no means only Arabs or indeed Muslims.

7 Comments

Filed under History of Mathematics, Myths of Science

A terrible fortnight for the HISTSCI_HULK

It’s been a tough two weeks for my old buddy the HISTSCI_HULK, who has now packed his bags and departed for pastures unknown screaming, “you can all kiss my posterior!” That’s not what he actually said, but you get the message.

So, what has upset the #histSTM pedant this time, and what was the straw that finally broke the poor monster’s back? It all started with Nicolaus Copernicus’ birthday on 19 February. As per usual, numerous people, including myself, posted on social media this year to mark the occasion. Our attention was drawn to the post on Twitter by the Smithsonian National Air and Space Museum, so we followed the link to their website and were less than happy about what we found there:

A rigid code of respect for ancient cultures and thought governed the early Renaissance, a period during which resistance to traditional concepts was met with hostility. Therefore, the Polish astronomer, Nicolaus Copernicus, whose ideas changed the course of astronomy forever, did not release his manuscript for publication until he was on his deathbed.

De revolutionibus Source: Wikimedia Commons

PIFFLE! snorted the HISTSCI_HULK, TOTAL PIFFLE! 

The early Renaissance was a period of lively scientific debate characterised by clashes of contrasting, conflicting, and even contradictory theories and ideas. The debate in astronomy, to which Copernicus contributed, had been rumbling on since at least the middle of the fifteenth century. Also, it is not true that he “didn’t release his manuscript for publication until he was on his deathbed”. Rheticus published his Narratio Prima, as a trial balloon, in 1540. Following its relatively positive reception, Copernicus gave the manuscript of De revolutionibus to Rheticus to take to Petreius in Nürnberg to be published. At the time, as far as we know, he was still healthy. Printing and publishing a book takes time, and by the time the book was finished, Copernicus had suffered a stroke and lay on his deathbed. Finally, the reason why Copernicus held De revolutionibus back for so long was because he couldn’t deliver. In the Commentariolus, Copernicus stated he would prove his hypothesis that the cosmos was heliocentric, but he had failed in this endeavour and so was reluctant to publish, a reluctance that was dissolved by the positive reception of the Narratio Prima.

Looking further on the Smithsonian National Air and Space Museum website, under Ancient Times and the Greeks, we find the following: 

Plato wondered why the starlike planets moved relative to the stars. Trying to answer the question was to occupy the attention of astronomers for many centuries.

Plato was more interested in the how rather than the why. Astronomers sought a mathematical explanation for the celestial movements. 

Under Ptolemy’s Planetary System we find the following:

In the theory of Ptolemy, the planets moved in small orbits while revolving in large orbits about the Earth. This theory, although incorrect, could explain the apparent motions of the planets and also account for changes in their brightness.

This is an attempt to explain the deferent–epicycle model of planetary motion that Ptolemaeus presented. If one didn’t already know how Ptolemaeus’ system functioned, one certainly would have no idea after reading this. 

This is what is being described: The basic elements of Ptolemaic astronomy, showing a planet on an epicycle (smaller dashed circle), a deferent (larger dashed circle), the eccentric (×) and an equant (•). Source: Wikimedia Commons

The HISTSCI_HULK: COME ON SMITHSONIAN, YOU CAN DO BETTER THAN THIS!

Already more than somewhat miffed, the HISTSCI_HULK had the misfortune fourteen days later to view the article posted by the magazine History Today to acknowledge the birthday of Gerard Mercator on 5 March. He flipped out completely, thundering:

WHAT IS THIS HEAP OF ROTTING GARBAGE? WHY DOESN’T SOMEBODY FLUSH IT DOWN THE TOILET WITH ALL THE OTHER EXCREMENT?

Let us examine the offending object; the opening paragraph truly is a stinker:

The age of discovery that began with Christopher Columbus, along with Ferdinand Magellan’s conclusive demonstration that the Earth is round, created a demand for new maps and confronted cartographers with the problem of how to depict the spherical Earth on a flat surface. Of the various solutions, or ‘projections’, the one accepted as the best was that of Gerardus Mercator, which is still in use today. It was also Mercator who first used the term ‘atlas’ for a collection of maps.

In my opinion the age of discovery is an unfortunate misnomer, as I pointed out in a fairly recent blog post on the subject, preferring the term Contact Period. It didn’t start with Columbus but was well underway by the time he found backing for his first voyage.

… along with Ferdinand Magellan’s conclusive demonstration that the Earth is round …!!

Where to start? 1) Nobody of significance in Europe needed a demonstration that the Earth was round in 1521; it had been an accepted fact for around a thousand years by then. 2) Ferdinand Magellan didn’t demonstrate anything; he died en route, on the island of Mactan, waging imperialist war against the indigenous inhabitants. 3) Any nineteenth-century flat earther would counter the claim of a “conclusive demonstration that the Earth is round” by stating that he merely sailed in a circle around the flat Earth disc.

… created a demand for new maps and confronted cartographers with the problem of how to depict the spherical Earth on a flat surface.

This statement would have historians of mapmaking and map projection tearing their hair out, that’s if they have any to tear out. The problem of how to project a spherical earth onto a flat surface had been extensively discussed by Ptolemaeus in his Geographia in the second century CE, a book that re-entered Europe at the beginning of the fifteenth century, more than one hundred years before Magellan undertook his fateful voyage.

Of the various solutions, or ‘projections’, the one accepted as the best was that of Gerardus Mercator, which is still in use today.

Ignoring for a moment that “accepted as the best” is total rubbish, which of Mercator’s projections? He used at least two different ones and his son a third. Our author is, of course, referring to the so-called Mercator Projection. First off there is no such thing as “the best projection.” All projections have their strengths and weaknesses and, which projection one uses is dependent, or should be, on the task in hand. The Mercator projection allows a mariner to plot a course of constant compass bearing as a straight line on a sea chart. 
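For the mathematically curious, here is a minimal sketch of the standard spherical Mercator formula and of the property just described; the helper names are mine, and the spherical formula is a simplification of what Mercator’s chart actually embodies:

```python
from math import radians, degrees, log, tan, pi

def mercator(lat_deg, lon_deg):
    """Project latitude/longitude (degrees) onto Mercator chart coordinates."""
    x = radians(lon_deg)
    y = log(tan(pi / 4 + radians(lat_deg) / 2))  # the characteristic stretching of latitude
    return x, y

# Points on a 45-degree rhumb line from the origin: along such a course the
# longitude grows exactly as fast as the stretched latitude, so the chart
# slope y/x stays constant and the track plots as a straight line.
for lat in (10, 20, 30):
    lon = degrees(log(tan(pi / 4 + radians(lat) / 2)))
    x, y = mercator(lat, lon)
    print(f"lat {lat}, lon {lon:.2f} -> chart slope y/x = {y / x:.2f}")  # 1.00 every time
```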

Yes, it was Mercator who first used the term atlas for a collection of maps. Our author at least got that right.

The next paragraph is a potted biography, which is OK but is littered with small inaccuracies:

He was born Gerhard Kremer at Rupelmonde in Flanders (now in Belgium), the seventh and last child of an impoverished German family which had recently moved there. His father was a cobbler, but the surname meant ‘merchant’ and Gerhard turned it into Latin as Mercator after his father and mother died when he was in his teens. A great-uncle who was a priest made sure that he got a good education and after graduating from the University of Louvain in 1532 he studied mathematics, geography and astronomy under Gemma Frisius, the Low Countries’ leading figure in these fields. He learned the craft of engraving from a local expert called Gaspar Van der Heyden and the three men worked together in the making of maps, globes and astronomical instruments for wealthy patrons, including the Holy Roman Emperor Charles V.

When Mercator was born, his parents were only visiting his uncle or great-uncle (it is not known for certain whether he was the brother or uncle of Mercator’s father) in Rupelmonde. Following his birth, they returned to Gangelt in the Duchy of Jülich. Whether the family was German or Flemish is not known for certain. They first moved permanently to Rupelmonde when Mercator was six years old. He adopted the Latin name of Mercator, meaning merchant, as does Kremer, not when his parents died but when his uncle/great-uncle sent him to a Latin school. In the school he became Gerardus Mercator Rupelmundanus. After graduating MA on the liberal arts faculty of the University of Louvain in 1532, he left the university and only returned two years later, in 1534, to study geography, mathematics, and astronomy under the guidance of Gemma Frisius. He learnt the art of globe making when he assisted Frisius and Gaspar Van der Heyden to construct a terrestrial globe in 1535. This is followed by another paragraph of biography:

In 1538 Mercator produced a map of the world on a projection shaped like a pair of hearts. His inability to accept the Bible’s account of the universe’s creation got him into trouble with the Inquisition in 1544 and he spent some months in prison on suspicion of heresy before being released. John Dee, the English mathematician, astrologer and sage, spent time in Louvain from 1548 and he and Mercator became close friends.

The sentences about the cordiform projection world maps (heart shaped, devised by Johannes Stabius before Magellan “sailed around the world”, by the way) and about John Dee are OK. Why he refers to Dee as an astrologer but not Frisius or Mercator, who were both practicing astrologers, puzzles me. I’m also not sure why he calls Dee a sage, or what it’s supposed to mean. However, his account of Mercator’s arrest on suspicion of heresy is simply bizarre. He was arrested in 1543 on suspicion of being a Lutheran. Whilst in prison he was accused of suspicious correspondence with the Franciscan friars of Mechelen. No evidence was found to support either accusation, and he was released after four months without being charged. Nothing to do with “his inability to accept the Bible’s account of the universe’s creation.”

We are now on the home straight with the final paragraph. Mostly harmless biography but it contains a real humdinger!

In 1552 Mercator moved to Duisburg in the Duchy of Cleves in Germany, where he enjoyed the favour of the duke. He set up a cartographic workshop there with his staff of engravers and perfected the Mercator projection, which he used in the map of the world he created in 1569. It employed straight lines spaced in a way that provided an accurate ratio of latitude and longitude at any point and proved a boon to sailors, though he never spent a day at sea himself [my emphasis]. In the 1580s he began publishing his atlas, named after the giant holding the world on his shoulders in Greek mythology, who was now identified with a mythical astronomer-king of ancient times. Strokes in the early 1590s partly paralysed Mercator and left him almost blind. A final one carried him off in 1594 at the age of 82 and he was buried in the Salvatorkirche in Duisburg.

I studied mathematics at university and have been studying/teaching myself the history of map projections for maybe the last thirty years and I have absolutely no idea what the phrase, straight lines spaced in a way that provided an accurate ratio of latitude and longitude at any point, is supposed to mean. I’m certain the author, when he wrote it, didn’t have the faintest clue what he was saying and tried to bluff. I also pity any reader who tries to make any sense out of it. It’s balderdash, hogwash, gobbledygook, poppycock, mumbo-jumbo, gibberish, baloney, claptrap, prattle, or just plain total-fucking-nonsense! What it definitively isn’t, in anyway whatsoever, is a description of the Mercator projection.

This wonderful piece of bullshit caused the HISTSCI_HULK to flip out completely. Imitating Charles Atlas, he tore his facsimile edition of the Mercator-Hondius Atlas in half with his bare hands and threw it out of the window. It’s a hardback, by the way.

The term Atlas, as used by Mercator, had nothing to do with the mythological Greek Titan Atlas, who, by the way, holds the cosmos on his shoulders and not the Earth, but rather, bizarrely, with the equally mythical King Atlas of Mauritania, who according to legend was a wise philosopher, mathematician, and astronomer, and who is credited with having produced the first celestial globe. As Mercator explains: “I have set this man Atlas, so notable for his erudition, humaneness, and wisdom as a model for my imitation.”

Bizarrely, the article is illustrated, not by Mercator’s 1569 world map based on his projection, but by the double planisphere world map from 1587 created by his son Rumold Mercator (1541–1599), which was based on it, and which first appeared in Isaac Casaubon’s edition of Strabo’s Geographia, published in Geneva. It was incorporated into later editions of the Atlas.

Source: Wikimedia Commons

History Today is a popular English monthly history magazine, which according to Wikipedia, and I quote, “presents serious and authoritative history to as wide a public as possible.” Judging by this article, you could have fooled me. History Today has more than 300,000 followers on Twitter, that’s more than 300,000 potential readers for this garbage. The article was written by Richard Cavendish (1930–2016), an Oxford graduate who specialised in medieval studies. He was best known as a historian of the occult, whose work, quoting Wikipedia once more, “is highly regarded for its depth of research and agnostic stance towards its sometimes controversial subject matter,” and, “He also wrote regularly for the British journal History Today.” The article was written in 2012, but the editor, Paul Lay, who considered it “serious and authoritative history” then, is the same editor who thought it suitable to trot out again in 2022.

Having within a fortnight been confronted by two horrible examples of how not to write popular #histSTM, the HISTSCI_HULK was more than somewhat mentally fragile when he stumbled on the offending object that finally caused him to snap, pack his bag, and depart, vowing never to read another word ever again. The offending object? A page from a book belonging to the four-year-old daughter of a historian whom I know on Twitter:

THAT’S BLEEDIN’ INDOCTRINATION, THAT IS, SCREECHED THE HISTSCI_HULK AS HE SLAMMED THE DOOR SHUT ON HIS WAY OUT

“He made an amazing discovery.” As we are obviously dealing with Galileo’s telescopic discoveries, of which there was more than one, we will restrict ourselves to those. All of Galileo’s telescopic discoveries were made independently, in the same time period, by other astronomers, and they were also confirmed by the Jesuit astronomers of the Collegio Romano, so in fact anybody who had anything to say on the topic not only believed him but also congratulated him for having made them.

“Galileo changed how people think about the Sun and Earth.” If any single person is going to be given credit for that, then surely it should be Copernicus. In fact, it is, in my opinion, wrong to credit any single person with this. The shift in perception from a geocentric cosmos to a heliocentric one was a gradual, cumulative process to which a fair number of people contributed.

“He built a new telescope to study space.” I have difficulties with the “new” in this sentence. Galileo, like quite a large number of people, built a so-called Dutch telescope with which to make astronomical observations. He was by no means unique in doing this and not even the first to do so. What should be expressed here is that Galileo was one of a number of people who constructed telescopes, after the instrument was invented in 1608, in order to make astronomical observations.

“He proved that Earth travels around the Sun.” Apart from the fact that the sentence isn’t even grammatically correct (it should read “the Earth”), it’s simply false. The problem that faced all the early supporters of a heliocentric model of the cosmos was that they simply couldn’t prove the hypothesis.

“People thought it was the other way around.” Of course, they did because that’s what our senses tell us. We all have to learn that it’s not true!

I have a very simple question. Why do people present young, impressionable children with garbage like this?

In case anybody is concerned, I’m sure the HISTSCI_HULK will return when he’s calmed down.  

6 Comments

Filed under History of Astronomy, History of Cartography, Myths of Science

WRONG, WRONG, WRONG…

I think the Internet has finally broken the HISTSCI_HULK; he’s lying in the corner sobbing bitterly and mumbling wrong, wrong, wrong… like a broken record. What could have felled the mighty beast? 

29 January was the anniversary of the birth (1611) and death (1687) of the Danzig astronomer Johannes Hevelius, and numerous people, including myself, posted or reposted articles about him on the Internet. One of those articles was The 17th-Century Astronomer Who Made the First Atlas of the Moon by Elizabeth Landau, published in the Smithsonian Magazine in 2018, with the lede: Johannes Hevelius drew some of the first maps of the moon, praised for their detail, from his homemade rooftop observatory in the Kingdom of Poland.

Johannes Hevelius by Daniel Schultz Source: Wikimedia Commons

I suppose that I’m really to blame because I let him read it. He was chugging along quite happily, nodding his head, and burbling to himself, on the lookout, as always, for history of science errors and howlers, when he let out a piercing scream, NOOOOOO!!!!!!, and collapsed in a sobbing heap on the floor. I’ve tried everything, but I haven’t been able to console the poor beast.

So, what was it that caused this total breakdown? The first six paragraphs of the article are harmless enough, with only some very minor questionable statements, not really worth bothering about, but then comes this monstrosity:

Mapping the moon was one of Hevelius’s first major undertakings. The seafaring nations at the time were desperately searching for a way to measure longitude at sea, and it was thought that the moon could provide a solution. The idea was that during a lunar eclipse, if sailors observed the shadow of the moon crossing a particular point on the surface at 3:03 p.m., but they knew that in another location, such as Paris, the same crossing would occur at 3:33 p.m., then they could calculate their degrees of longitude away from the known location of the city. More accurate lunar charts, however, would be required for the technique to be possible (and due to the practical matters of using a large telescope on a rolling ship, a truly reliable way to calculate longitude at sea would not be achieved until the invention of the marine chronometer).

One can only assume that it is an attempt to describe the lunar distance method for determining longitude, but apart from the word moon, it has absolutely nothing in common with the actual lunar distance method. Put very mildly, it is a complete travesty that should never have seen the light of day, let alone been published.

Lunar eclipses had already been used for many centuries to determine the longitude difference between two locations, but you don’t need either a map of the moon or a telescope to do so. Two observers, in their respective locations, merely record the local time of the beginning and/or the end of the eclipse (initial and final contacts); the resulting time difference gives the difference in longitude. Lunar eclipses are impractical as a method of determining longitude for navigation, as they occur too infrequently; there will be a total of only 230 lunar eclipses in the whole of the twenty-first century, of which only eighty-five will be total lunar eclipses. For example, if you were sitting in the middle of the Atlantic Ocean on 6 June 2022 and wished to determine your longitude, you would have to wait until 8 November for the next total lunar eclipse. After that you would have to wait until 14 March 2025 for the next one, although there are a couple of partial and penumbral eclipses in between.
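
To make the arithmetic concrete: the Earth turns through 360° in 24 hours, that is 15° per hour, so a time difference translates directly into a longitude difference. A minimal sketch in Python, using the 3:03/3:33 figures from the quoted paragraph purely as an illustration:

    # The Earth rotates 360 degrees in 24 hours: 15 degrees per hour.
    DEGREES_PER_HOUR = 15.0

    def longitude_difference(local_h, reference_h):
        """Degrees of longitude west of the reference location, given the
        local times (in decimal hours) at which two observers recorded the
        same instantaneous event, e.g. a contact of a lunar eclipse."""
        return (reference_h - local_h) * DEGREES_PER_HOUR

    # 3:03 p.m. locally, 3:33 p.m. at the reference location: half an hour
    # earlier, so the observer is 7.5 degrees west of the reference.
    print(longitude_difference(15.05, 15.55))  # 7.5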

Early Modern explorers did use solar and lunar eclipses combined with an ephemeris, a book of astronomical tables, to determine longitude on land, to establish their location and to draw maps. Columbus, famously, used his knowledge of the total lunar eclipse on 1 March 1504, taken from an ephemeris, to intimidate the natives on the island of Jamaica into continuing to feed his hungry stranded crew.

The lunar distance method of determining longitude is something completely different. It was first proposed by the Nürnberger mathematicus, Johannes Werner (1468–1522), in his Latin translation of Ptolemaeus’ Geographia, In Hoc Opere Haec Continentur Nova Translatio Primi Libri Geographicae Cl Ptolomaei, published in Nürnberg in 1514, and then discussed by Peter Apian (1495–1552) in his Cosmographicus liber, published in Landshut in 1524. For reasons that I will explain in a minute, it was found impractical; it was proposed again in 1634 by the French astronomer Jean-Baptiste Morin (1586–1656), but once again rejected as impractical.

The lunar distance method relies on determining the position of the Moon relative to a given set of reference stars, a unique configuration for every part of the Moon’s orbit, and then using a set of tables to determine the time at which that configuration would be observed at a given fixed point. Having determined one’s local time, it is then possible to calculate the time difference and thus the longitude. Because it is pulled hither and thither by both the Sun and the Earth, the Moon’s orbit is extremely erratic and not the smooth ellipse suggested by Kepler’s three laws of planetary motion. This led to the realisation that compiling the tables to the necessary accuracy was beyond the capabilities of those sixteenth-century astronomers and their comparatively primitive instruments, hence the method could not then be put into practice.

We now turn our attention to Landau’s closing statement in this horror paragraph:

More accurate lunar charts, however, would be required for the technique to be possible (and due to the practical matters of using a large telescope on a rolling ship, a truly reliable way to calculate longitude at sea would not be achieved until the invention of the marine chronometer).

Historically, tables of the necessary accuracy were first produced by Tobias Mayer (1723–1762) in 1755. However, the calculations necessary to determine longitude, having measured the lunar distance, proved to be too complex and too time consuming for seamen, and so Nevil Maskelyne (1732–1811) produced the Nautical Almanac, containing the results pre-calculated in the form of tables, published for the first time in 1766. One does not need a telescope to make the necessary observations. A sextant is sufficient to measure the distance between the Moon and the reference stars, and its forerunner, the reflecting octant, had been invented by John Hadley (1682–1744) in 1731. The lunar distance method was in fact ready for practical use before the marine chronometer.
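
As a minimal sketch of the workflow the Nautical Almanac made possible (all the numbers below are invented for illustration; the real tables gave predicted lunar distances at three-hour intervals of Greenwich time):

    import bisect

    # Invented almanac excerpt: (Greenwich time in hours, predicted angular
    # distance in degrees between the Moon and a chosen reference star).
    almanac = [(0, 40.0), (3, 41.5), (6, 43.0), (9, 44.5)]

    def greenwich_time(measured_deg):
        """Linearly interpolate the table to recover the Greenwich time at
        which the measured lunar distance was predicted to occur."""
        dists = [d for _, d in almanac]
        i = bisect.bisect_left(dists, measured_deg)
        t0, d0 = almanac[i - 1]
        t1, d1 = almanac[i]
        return t0 + (t1 - t0) * (measured_deg - d0) / (d1 - d0)

    local_time_h = 14.0                 # local time, e.g. from a noon sight
    gwt = greenwich_time(42.25)         # 4.5 hours Greenwich time
    print((local_time_h - gwt) * 15.0)  # 142.5 degrees east of Greenwich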

One question that I have is: did Landau extract this heap of nonsense out of her own posterior, or is she paraphrasing somebody else’s description? Throughout her article she gives links to various books with the information she is using, so did she take this abomination from another source? If so, it is still out there somewhere creating confusion for anybody unlucky enough to read it. On the question of sources, Dava Sobel’s Longitude, which, despite my prejudices against it, contains a correct description of the lunar distance method, was published in 1995, and the much better Finding Longitude by Rebekah Higgitt and Richard Dunn was published in 2014, so there is no real excuse for Landau’s load of bovine manure in 2018.

I don’t know how many people have subscriptions to the Smithsonian Magazine, but it has over 300,000 followers on Twitter. If we look at the Wikipedia article on the Smithsonian Institution, it starts thus: “The Smithsonian Institution, or simply, the Smithsonian, is a group of museums and education and research centers, the largest such complex in the world, created by the U.S. government for the increase and diffusion of knowledge” (my emphasis), so why is the Smithsonian Magazine diffusing crap?

I’m hoping that with plenty of sweet tea and digestive biscuits, I’ll be able to restore the HISTSCI_HULK to his normal boisterous self. 

10 Comments

Filed under History of Astronomy, History of Navigation, Myths of Science

STOMP, STOMP, STOMP … KEPLER DID WOT!

I really shouldn’t, but the HISTSCI_HULK is twisting my arm and muttering dark threats, so here goes. A week ago, we took apart Vedang Sati’s post 10 Discoveries By Newton That Changed The World. When I copied it to my blog, I removed the links that Sati had built into his post. I then made the mistake of following his link to his post on Kepler, so here we go again.

Johannes Kepler Source: Wikimedia Commons

7 Ways In Which Johannes Kepler Changed Astronomy

Johannes Kepler was a German astronomer who discovered the three laws of planetary motion. Apart from his contributions to astronomy, he is also known to have pioneered the field of optics. In this post, let’s read some amazing facts about Kepler and his work. 

He obviously doesn’t rate Kepler as highly as he rates Newton, so the introduction is less hagiographic this time. However, it does contain one quite extraordinary claim, when he writes, “he is also known to have pioneered the field of optics.” Optics as a scientific discipline was pioneered by Euclid, who lived in the fourth century BCE, so about two thousand years before Kepler. There were also quite a few people active in the field in the two millennia in between.

Early Affliction

He suffered from small pox at a very early age. The disease left him with weak eyesight. Isn’t  it wonderful then how he went on to invent eyeglasses for near-eye and far-eye sightedness.

Kepler did indeed suffer from smallpox sometime around the age of four, which almost cost him his life and did indeed leave him with damaged eyesight. However, Kepler did not invent spectacles of any type whatsoever. The first spectacles for presbyopia, far-sightedness occurring in old age, began to appear in the last decades of the thirteenth century CE. Spectacles for myopia, short-sightedness, were widely available by the early fifteenth century. What Kepler actually did was to publish the first scientific explanation of how lenses function to correct defects in eyesight, in his Astronomiae Pars Optica (The Optical Part of Astronomy), in 1604. Francesco Maurolico (1494–1574) actually gave the correct explanation earlier than Kepler in his Photismi de lumine et umbra, but this was only published posthumously in 1611, so the credit for priority goes to Kepler.

Astronomiae Pars Optica Source: Wikimedia Commons

Introduction to Astronomy

Kepler’s childhood was worsened by his family’s financial troubles. At the age of 6, Johannes had to drop out of school so to earn money for the family. He worked as a waiter in an inn.

As Kepler first entered school at the age of seven, it would have been difficult for his schooling to have been interrupted when he was six. His primary schooling was in fact often interrupted both by illness and the financial fortunes of the family. 

In the same year, his mother took him out at night to show him the Great Comet of 1577 which aroused his life-long interest in science and astronomy. 

Yes, she did!

Copernican Supporter

At a time when everyone was against the heliocentric model of the universe, Kepler became its outspoken supporter. He was the first person to defend the Copernican theory from a scientific and a religious perspective.

Not everyone was opposed to the heliocentric model of the universe, just the majority. Poor old Georg Joachim Rheticus (1514–1574), the professor of mathematics who persuaded Copernicus to publish De revolutionibus, would be deeply insulted by the claim that Kepler was the “first person to defend the Copernican theory from a scientific and a religious perspective.” Rheticus, of course, did both, long before Kepler was even born, although his religious defence remained unpublished and was only rediscovered in the twentieth century. Rheticus was not the only supporter of Copernicus who preceded Kepler; there were others, most notably, in this case, Michael Mästlin (1550–1631), who taught Kepler Copernican heliocentrism.

Contemporary of Galileo

Galileo was not a great supporter of Kepler’s work especially when Kepler had proposed that the Moon had an influence over the water (tides). It would take an understanding many decades later which would prove Kepler correct and Galileo wrong.

It is indeed very true that Galileo rejected Kepler’s theory of the tides, when promoting his own highly defective theory, but that is mild compared to his conscious ignoring of Kepler’s laws of planetary motion, which were at the time the most significant evidence for a heliocentric cosmos.

Pioneer of Optics

Kepler made ground-breaking contributions to optics including the formulation of inverse-square law governing the intensity of light; inventing an improved refracting telescope; and correctly explaining the function of the human eye.

Kepler’s contributions to the science of optics were indeed highly significant and represent a major turning point in the development of the discipline. His Astronomiae Pars Optica does indeed contain the inverse square law of light intensity and the first statement that the image is created in the eye on the retina and not in the crystalline lens.
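
In modern notation, which is of course not Kepler’s, the inverse square law of light intensity states that the intensity I at a distance r from a point source radiating a total power P is

\[ I(r) = \frac{P}{4\pi r^{2}}, \]

so that doubling the distance quarters the intensity, the light being spread over a sphere whose surface area grows as r².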

However, that he invented an improved telescope is more than a little problematic. When Galileo published his Sidereus Nuncius in 1610, the first published account of astronomical telescopic discoveries, there was no explanation of how a telescope actually functions, so people were justifiably sceptical. Having written the book on how lenses function with his Astronomiae Pars Optica in 1604, Kepler now delivered a scientific explanation of how the telescope functioned with his Dioptrice in 1611.

Kepler Dioptrice Source: Wikimedia Commons

This contained not just a theoretical explanation of the optics of a Dutch or Galilean telescope, with a convex objective and a concave eyepiece, but also of a telescope with a convex objective and a convex eyepiece, which produces an inverted image, now known as a Keplerian or astronomical telescope. It also described a telescope with three convex lenses, the third lens serving to right the inverted image, now known as a field telescope, and lastly, difficult to believe, the telephoto lens. Kepler’s work remained strictly theoretical, and he never constructed any of these telescopes, so is he really the inventor? The first astronomical telescope was constructed by Christoph Grienberger (1561–1636) for Christoph Scheiner (c. 1573–1650), as his heliotropic telescope, for his sunspot studies.

Heliotropic telescope on the left. On the right Scheiner’s acknowledgement that Grienberger was the inventor

Is the astronomical telescope an improved telescope in comparison with the Dutch telescope? It is very much a question of horses for courses. If you go to the theatre or the opera, then your opera glasses, actually a Dutch telescope, will be much more help in distinguishing the figures on the stage than an astronomical telescope. Naturally, the astronomical telescope, with its wider field of vision, is, as its name implies, much better for astronomical observations than the Dutch telescope, once you get past the problem of the inverted image. This problem was solved with the invention of the multiple-lens eyepiece by Anton Maria Schyrleus de Rheita (1604–1660), announced in his Oculus Enoch et Eliae, published in 1645, although he had already been manufacturing them together with Johann Wiesel (1583–1662) since 1643.

Helped Newton

His planetary laws went on to help Sir Isaac Newton derive the inverse square law of gravity. Newton had famously acknowledged Kepler’s role, in a quote: “If I have seen further, it is by standing on the shoulders of giant(s).

Sati is not alone in failing to give credit to Kepler for his laws of planetary motion in their own right, regarding them instead merely as a stepping-stone for Newton and the law of gravity. Kepler’s laws of planetary motion, in particular his third law, are the most significant evidence for a heliocentric model of the cosmos between the publication of De revolutionibus in 1543 and that of Principia in 1687 and deserve to be acknowledged and honoured in their own right!
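
For the record, the third law, again in modern notation rather than Kepler’s, states that for all the planets the square of the orbital period T is proportional to the cube of the mean distance a from the Sun:

\[ \frac{T^{2}}{a^{3}} = \text{constant} \]

One single, simple relation tying the whole planetary system to the Sun, which is precisely why it carried such weight as evidence for heliocentricity.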

Newton’s famous quote, actually a much-used phrase in one form or another in the Early Modern period, originated with Bernard of Chartres (died after 1124) in the twelfth century. Newton used it in a letter to Robert Hooke on 5 February 1675, so twelve years before the publication of his Principia and definitively not referencing Kepler:

What Des-Cartes [sic] did was a good step. You have added much several ways, & especially in taking the colours of thin plates into philosophical consideration. If I have seen further it is by standing on the sholders [sic] of Giants.

Kepler’s Legacy

There is a mountain range in New Zealand named after the famous astronomer. A crater on the Moon is called Kepler’s crater. NASA paid tribute to the scientist by naming their exo-planet telescope, Kepler.

Given the vast number of things named after Kepler, particularly in Germany, Sati’s selection is rather bizarre, in particular because it is a mountain hiking trail in New Zealand that is named after Kepler and not the mountain range itself.

Once again, we are confronted with a collection of half facts and straight falsehoods on this website, whose author, as I stated last time has nearly 190,000 followers on Facebook. 

Me: I told you that we couldn’t stop the tide coming in

HS_H: You’re not trying hard enough. You’ve gotta really STOMP EM!

Me: #histsigh

5 Comments

Filed under History of Astronomy, History of Optics, Myths of Science

STOMP, STOMP, STOMP … NEWTON DID WOT!

Oh dear! The HISTSCI_HULK has been woken from his post-festive slumbers and is once again on the rampage. What has provoked this outbreak so early in the new year? He chanced to see a post that one of my followers on Facebook had linked to, celebrating Newton’s new-style birthday on 4 January. As is well known, we here at the Renaissance Mathematicus celebrate Newton’s old-style birthday, but that’s another story.

The post is on a website called Wonders of Physics, is the work of an Indian physicist, Vedang Sati, and is titled:

10 Discoveries By Newton That Changed The World

I have reproduced the whole horror show below. Let us examine it.

Isaac Newton is one of the few names that will forever be enshrined in physics history and that too with a lot of glamour associated. Contributions of none other physicist match, his, well, Einstein’s, or not even his!? The following are Newton’s ten most well-known works that changed the world later on. 

A strong hagiographical vibe going down here, which doesn’t bode well.

Laws of motion

1. An object will remain at rest or move in a straight line unless acted upon by an external force.

2. F=ma.

3. For every action, there is an equal and opposite reaction. 

Newton’s three laws of motion, along with thermodynamics, stimulated the industrial revolution of the 18th and 19th centuries. Much of the society built today owes to these laws.

Remember, these are supposedly the things that Newton discovered. His first law of motion, the law of inertia, was first formulated by Galileo, who, however, thought it only applied to circular motion. For linear motion it was first formulated by Isaac Beeckman and taken over from him by both René Descartes and Pierre Gassendi. Newton took it from Descartes. The second law, which was actually slightly different in the original form in which Newton used it, was taken from Christiaan Huygens. The third law was probably developed out of the studies of elastic and inelastic collision, which again originate with Descartes, who got much wrong that was corrected by both Huygens and Newton. Newton’s contribution was to combine them as axioms from which to deduce his mechanics, again probably inspired by Huygens. He tried out various combinations of a range of laws before settling on these three. Sati’s closing statement is quite frankly bizarre, whilst not totally false. What about the Principia, in which these laws appear as the foundation of classical mechanics and, perhaps more importantly, celestial mechanics?

Binomial Theorem

Around 1665, Isaac Newton discovered the Binomial Theorem, a method to expand the powers of sum of two terms. He generalized the same in 1676. The binomial theorem is used in probability theory and in the computing sciences.

The binomial theorem has a very long history stretching back a couple of thousand years before Newton was born. The famous presentation of the binomial coefficients, known as Pascal’s Triangle, which we all learnt in school (didn’t we?), was known to both Indian and Chinese mathematicians in the Middle Ages. Newton’s contribution was to expand the binomial theorem to the so-called general form, valid for any rational exponent.
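
In modern notation, the general form Newton arrived at is the infinite series

\[ (1+x)^{\alpha} = \sum_{k=0}^{\infty} \binom{\alpha}{k} x^{k}, \qquad \binom{\alpha}{k} = \frac{\alpha(\alpha-1)\cdots(\alpha-k+1)}{k!}, \]

valid for any rational exponent α and convergent for |x| < 1; for a positive whole-number exponent the series breaks off of its own accord and reduces to the familiar finite school formula.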

Inverse square law

By using Kepler’s laws of planetary motion, Newton derived the inverse square law of gravity. This means that the force of gravity between two objects is inversely proportional to the square of the distance between their centers. This law is used to launch satellites into space.

I have covered this so many times, it’s getting boring. Let’s just say the inverse square law of gravity was derived/hypothesised by quite a few people in the seventeenth century, of whom Newton was one. His achievement was to show that the inverse square law of gravity and Kepler’s third law of planetary motion are mathematically equivalent, which, as the latter is derived empirically, means that the former is true. Newton didn’t discover the inverse square law of gravity, he proved it.
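
For the simple case of a circular orbit the equivalence can be seen in a few lines, in a modern back-of-the-envelope version that is emphatically not Newton’s own derivation, which covered the general elliptical case. The centripetal force on a planet of mass m moving at speed v in a circle of radius r is

\[ F = \frac{mv^{2}}{r}, \qquad v = \frac{2\pi r}{T} \;\Rightarrow\; F = \frac{4\pi^{2}mr}{T^{2}}, \]

and substituting Kepler’s third law, T² = kr³, gives

\[ F = \frac{4\pi^{2}m}{k\,r^{2}} \propto \frac{1}{r^{2}}. \]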

Newton’s cannon

Newton was a strong supporter of Copernican Heliocentrism. This was a thought experiment by Newton to illustrate orbit or revolution of moon around earth (and hence, earth around the Sun)

He imagined a very tall mountain at the top of the world on which a cannon is loaded. If too much gunpowder is used, then the cannon ball will fly into space. If too little is used, then the ball wouldn’t travel far. Just the right amount of powder will make the ball orbit the Earth. 

This thought experiment was in Newton’s De mundi systemate, a manuscript that was originally a more popular draft of what became the third book of the Principia. The rewritten and expanded published version was considerably more technical and mathematical. Of course, it has nothing to do with gunpowder, but with velocities and forces. Newton is asking when the inertial force and the force of gravity balance out, leading to the projectile going into orbit. It has nothing to do directly with heliocentricity, as it would equally apply to a geocentric model, as indeed the Moon’s orbit around the Earth is. De mundi systemate was first published, in Latin and in an English translation entitled A Treatise of the System of the World, posthumously in 1728, so some forty years after the Principia, making it at best an object of curiosity and not in any way world changing.
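
In modern terms, which are again not Newton’s, the “right amount of powder” is the horizontal speed at which the centripetal acceleration needed to follow the Earth’s curvature, v²/R, exactly equals the acceleration of gravity g. A quick sketch of the arithmetic:

    import math

    g = 9.81        # acceleration of gravity at the Earth's surface, m/s^2
    R = 6.371e6     # mean radius of the Earth, m

    # Circular orbit at the surface: v**2 / R = g, so v = sqrt(g * R).
    v_orbit = math.sqrt(g * R)
    print(f"{v_orbit / 1000:.1f} km/s")  # about 7.9 km/s

    # Any slower and the cannon ball falls back to Earth; at this speed it
    # keeps "falling" around the planet (air resistance ignored, naturally).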

Calculus

Newton invented the differential calculus when he was trying to figure out the problem of accelerating body. Whereas Leibniz is best-known for the creation of integral calculus. The calculus is at the foundation of higher level mathematics. Calculus is used in physics and engineering, such as to improve the architecture of buildings and bridges.

This really hurts. Newton and Leibniz both collated and codified systems of calculus that included both differential and integral calculus. Neither of them invented it. Both of them built on a two-thousand-year development of the discipline, which I have sketched in a blog post here. On the applications of calculus, I recommend Steven Strogatz’s Infinite Powers.

Rainbow

Newton was the first to understand the formation of rainbow. He also figured out that white light was a combination of 7 colors. This he demonstrated by using a disc, which is painted in the colors, fixed on an axis. When rotated, the colors mix, leading to a whitish hue.

In the fourteenth century both the German Theodoric of Freiberg and the Persian Kamal al-Din al-Farsi gave correct theoretical explanations of the rainbow, independently of one another. They deliver an interesting example of multiple discovery, and of the fact that scientific discoveries can get lost and have to be made again. In the seventeenth century the correct explanation was rediscovered by Marco Antonio de Dominis, whose explanation of the secondary rainbow was, however, not quite right. A fully correct explanation was then delivered by René Descartes.

That white light is in fact a mixture of the colours of the spectrum was indeed a genuine Newton discovery, made with a long series of experiments using prisms and then demonstrated the same way. Newton’s paper on his experiments was his first significant publication and, although hotly contested, established his reputation. It was indeed Newton who first named seven colours in the spectrum, there are in fact infinitely many, which had to do with his arcane theories on harmony. As far as can be ascertained, the Newton Disc was first demonstrated by Pieter van Musschenbroek in 1762.

Reflecting Telescope

In 1666, Newton imagined a telescope with mirrors which he finished making two years later in 1668. It has many advantages over refracting telescope such as clearer image, cheap cost, etc.

Once again, the reflecting telescope has a long and complicated history, and Newton was by no means the first to try to construct one. However, he was the first to succeed in constructing one that worked. I have an article that explains that history here.

Law of cooling

His law states that the rate of heat loss in a body is proportional to the difference in the temperatures between the body and its surroundings. The more the difference, the sooner the cup of tea will cool down.

Whilst historically interesting, Newton’s law of cooling holds only for very small temperature differences. It didn’t change the world.
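
For completeness, in modern notation, which is not Newton’s, the law and its solution for a body starting at temperature T₀ in surroundings at temperature T_env read

\[ \frac{dT}{dt} = -k\,(T - T_{\text{env}}) \;\Rightarrow\; T(t) = T_{\text{env}} + (T_{0} - T_{\text{env}})\,e^{-kt}, \]

the temperature difference decaying exponentially with a rate constant k that depends on the body and its surroundings.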

Classification of cubics

Newton found 72 of the 78 “species” of cubic curves and categorized them into four types. In 1717, Scottish mathematician James Stirling proved that every cubic was one of these four types.

Of all the vast amount of mathematics that Newton produced, and mostly didn’t publish, to choose his classification of cubics as one of his 10 discoveries that changed the world is beyond bizarre. 

Alchemy

At that time, alchemy was the equivalent of chemistry. Newton was very interested in this field apart from his works in physics. He conducted many experiments in chemistry and made notes on creating a philosopher’s stone.

Newton could not succeed in this attempt but he did manage to invent many types of alloys including a purple copper alloy and a fusible alloy (Bi, Pb, Sn). The alloy has medical applications (radiotherapy).

Here we have a classic example of the “Newton was really doing chemistry” defence, although he does admit that Newton made notes on creating a philosopher’s stone. If one is going to call any of his alloys world changing, then surely it should be speculum, an alloy of copper and tin with a dash of arsenic, which Newton created to make the mirror for his reflecting telescope, and which was used by others for this purpose for the next couple of centuries.

Of course, the whole concept of a greatest-discovery hit list for any scientist is totally grotesque and can only lead to misconceptions about how science actually develops. However, if one is going to be stupid enough to produce one, then one should at least get one’s facts right. Even worse is that things like the classification of the cubics or Newton’s law of cooling are anything but greatest discoveries and in no way “changed the world.”

You might wonder why I take the trouble to criticise this website, but the author has nearly 190,000 followers on Facebook, and he is by no means the only popular peddler of crap in place of real history of science on the Internet. I often get the feeling that I and my buddy the HISTSCI_HULK are a latter-day King Cnut trying to stem the tide of #histSTM bullshit.

3 Comments

Filed under History of science, Myths of Science, Newton

We plumb the depths of boundless history of science stupidity 

Late on Friday evening, Renaissance Mathematicus friend and star historian of medieval science, Seb Falk, posted a couple of paragraphs from an Observer newspaper interview with the physicist and self-appointed science communicator Michio Kaku, from April this year. The history of science content of those paragraphs was so utterly, mindbogglingly ludicrous that it had me tossing and turning all night and woke the HISTSCI_HULK from his deep winter sleep; he is now raging through my humble abode like a demented behemoth on speed. What was it that put the living history of science bullshit detector into such a state of apoplexy? I offer up the evidence:

How much, do you think, would Isaac Newton understand of your book?
I think he would appreciate it. In 1666 we had the great plague. Cambridge University was shut down and a 23-year-old boy was sent home, and he saw an apple fall on his estate. And then he realised that the laws that control an apple are the same laws that control the moon. So the epidemic gave Isaac Newton an opportunity to sit down and follow the mathematics of falling apples and falling moons. But of course there was no mathematics at that time. He couldn’t solve the problem so he created his own mathematics. That’s what we are doing now. We, too, are being hit by the plague. We, too, are confined to our desks. And we, too, are creating new mathematics.

This paragraph is, of course, the tired old myth of Newton’s Annus mirabilis, which got continually recycled in the early months of the current pandemic and which I demolished in a blog post back in April 2020, so I won’t bore you with a rehash here. However, Kaku has managed to add a dimension of utter, mind-shattering ignorance:

But of course there was no mathematics at that time. He couldn’t solve the problem so he created his own mathematics.

Just limiting myself to the Early Modern Period, Tartaglia, Cardano, Ferrari, Bombelli, Stiefel, Viète, Harriot, Napier, Kepler, Galileo, Cavalieri, Fermat, Descartes, Pascal, Gregory, Barrow, Wallis and many others are all not just turning in their graves, but spinning at high speed, whilst screaming WHAT THE FUCK! at 140 decibels.

The real irony is that not only did Newton not codify the calculus during his non-existent Annus mirabilis (he didn’t create it, it evolved over a period of approximately two thousand years), but when he wrote his Principia twenty years later, he used a modernised version of Euclidean geometry, which had been created some two thousand years earlier, and not the calculus!

There is more to come:

There are many brilliant scientists whose contributions you discuss in the book. Which one, for you, stands out above the rest?
Newton is at number one, because, almost out of nothing, out of an era of witchcraft and sorcery, he comes up with the mathematics of the universe, he comes up with a theory of almost everything. That’s incredible. Einstein piggybacked on Newton, using the calculus of Newton to work out the dynamics of curved spacetime and general relativity. They are like supernovas, blindingly brilliant and illuminating the entire landscape and changing human destiny. Newton’s laws of motion set into motion the foundation for the Industrial Revolution. A person like that comes along once every several centuries.

Where to start? To describe the late seventeenth and early eighteenth centuries as “an era of witchcraft and sorcery” is simply bizarre. This is the high point of the so-called Scientific Revolution, it is the Augustan age of literature that in Britain alone produced Swift, Pope, Defoe, and many others, it is the age of William Hogarth, it is the age in which modern capitalism was born and, and, and… Yes, some people still believed in witchcraft and sorcery, some still do today, but it was by no means a central factor of the social, political, or cultural life of the period. This was the dawn of the Enlightenment, for fuck’s sake, the period of Spinoza, Locke, Hume and, once again, many others.

The “Newton is at number one, because, almost out of nothing” produces howls of protest echoing down the centuries from Kepler, Stevin, Galileo, Torricelli, Descartes, Pascal, Huygens et al.

With respect to Steven Strogatz, I will grant him his hyperbolic “mathematics of the universe”, but Newton’s physics covers just a very small area of the entire world of knowledge and is in no way a “theory of almost everything.” 

I should leave the comments on Einstein to those better qualified to condemn them than I. However, I find the claim that “Einstein piggybacked on Newton” simply grotesque. Also, the calculus that Newton and Leibniz codified, which became the mathematics of Newtonian physics, although Newton himself did not use it, is a very different beast from the tensor calculus used in the general theory of relativity. In fact, the only thing they have in common is the word calculus; I would expect someone with a doctorate in physics to know that.

One is tempted to ask if the Guardian has fired all of its science editors and replaced them with failed door-to-door vacuum cleaner salesmen. It’s the only rational explanation as to why the science pages of the Observer were adorned with such unfathomably dumb history of science. It is supposed to be a quality newspaper!

The HISTSCI_HULK has in the meantime thrown himself off the balcony into the snowstorm and was last seen stomping off into the woods muttering, The horror! The horror!

10 Comments

Filed under History of science, Myths of Science

The Renaissance Mathematicus tries his luck as YouTube Influencer

Some time back I had a late-night chat with medieval historian Tim O’Neill about all things Galileo Galilei; late night for me, that is, early morning for him. Unbeknown to me, the sneaky Aussie bugger recorded my ruminations on the Tuscan mathematicus; they’re like that, those antipodeans, duplicitous. Now he’s gone and posted the whole affair on YouTube, for all the world to see.

 I may have to have plastic surgery and move to an unknown destination in South America.

However, if you have a strong stomach and like to watch train wrecks or are just curious what the Renaissance Mathematicus looks like in real life, then you can find the whole horrible mess on Tim’s History for Atheists YouTube channel in three obscenely long parts:

The Galileo Affair Part 1 

The Galileo Affair Part 2 

The Galileo Affair Part 3  

 Who knows, if enough people can be fooled into watching it, I might become the next Paris Hilton! 

WARNING: Not suitable for children or viewers with high moral standards: Expletives not deleted!

3 Comments

Filed under History of Astronomy, Myths of Science, Renaissance Science