Category Archives: History of Physics

Preach truth – serve up myths.

Over Christmas I poked a bit of fun at Neil deGrasse Tyson for tweeting that Newton would transform the world by the age of 30, pointing out that he was going on forty-five when he published his world-transforming work, the Principia. The following day NdGT posted a short piece on Facebook praising his own tweet and its success. Here he justified his ‘by the age of thirty’ claim but in doing so dug himself deeper into the mire of sloppy #histsci. You might ask why this matters, to which the answer is very simple. NdGT is immensely popular, especially amongst those with little idea of science and less of the history of science, who hang on his every utterance. Numerous historians of science labour very hard to dismantle the myths of science and to replace them with a reasonable picture of how science evolved throughout its long and convoluted history. NdGT disdains those efforts and perpetuates the myths, leading his hordes of admirers up the garden path of delusion. Let us take a brief look at his latest propagation of #histmyth.

NdGT’s post starts off with the news that his Newton birthday tweet is the most RTed tweet he has ever posted, citing numbers that lesser mortals would not even dare to dream about. This of course just emphasises the danger of NdGT as a disseminator of false history of science: his reach is wide and his influence is strong. Apparently some Christians had objected to NdGT celebrating Newton’s birthday on Christ’s birthday. NdGT denies that his tweet was intended to be anti-Christian but then goes on to quote the tweet that he sent out in answer to those accusations:

“Imagine a world in which we are all enlightened by objective truths rather than offended by them.”

Now on the whole I agree with the sentiment expressed in this tweet, although I do have vague visions of an Orwellian dystopia when people from the scientism/gnu atheist camp start preaching about ‘objective truth’. Doesn’t Pravda mean truth? However, I digress.

I find it increasingly strange that NdGT’s craving for objective truth doesn’t stretch to the history of science, where he seems to prefer juicy myths to any form of objectivity. And so also in this case. In his post he expands on the tweet I had previously poked fun at. He writes:

Everybody knows that Christians celebrate the birth of Jesus on December 25th.  I think fewer people know that Isaac Newton shares the same birthday.  Christmas day in England – 1642.  And perhaps even fewer people know that before he turned 30, Newton had discovered the laws of motion, the universal law of gravitation, and invented integral and differential calculus.  All of which served as the mechanistic foundation for the industrial revolution of the 18th and 19th centuries that would forever transform the world.

What we are being served up here is a slightly milder version of the ‘annus mirabilis’ myth. This very widespread myth claims that Newton did all of the things NdGT lists above in one miraculous year, 1666, whilst biding his time at home in Woolsthorpe because the University of Cambridge had been closed down due to an outbreak of the plague. NdGT allows Newton a little more time, he turned 30 in 1672, but the principle is the same: look, oh ye of little brain, and tremble in awe at the mighty immaculate God of science, Sir Isaac Newton! What NdGT, the purported lover of objective truth, chooses to ignore, or perhaps he really is ignorant of the facts, is that a generation of some of the best historians of science who have ever lived, Richard S. Westfall, D. T. Whiteside, Frank Manuel, I. Bernard Cohen, Betty Jo Teeter Dobbs and others, have very carefully researched and studied the vast convoluted mass of Newton’s papers and have clearly shown that the whole story is a myth. To be a little bit fair to NdGT, the myth was first put into the world by Newton himself in order to shoot down all his opponents in the numerous plagiarism disputes that he conducted. If he had done it all that early then he definitely had priority and the others were all dastardly scoundrels out to steal his glory. We now know that this was all a fabrication on Newton’s part.

Newton was awarded his BA in 1665 and in the following years he was no different to any highly gifted postgraduate trying to find his feet in the world of academic research. He spread his interests wide, reading and absorbing as much of the modern science of the time as he could, making copious notes on what he read, as well as setting up ambitious research programmes on a wide range of topics that were to occupy his time for the next thirty years. In the eighteen months before being sent down from Cambridge because of the plague he concentrated his efforts on the new analytical mathematics that had developed over the previous century. Whilst reading widely and bringing himself up to date on material that was not taught at Cambridge, he simultaneously extended and developed what he was reading, laying the foundations for his version of the calculus. It was by no means a completed edifice, as NdGT, and unfortunately many others, would have us believe, but it was still a very notable mathematical achievement. Over the decades he would return from time to time to his mathematical researches, building on and extending that initial foundation. He also didn’t ‘invent’ integral and differential calculus but brought together, codified and extended the work of many others, in particular Descartes, Fermat, Pascal, Barrow and Wallis, who in turn looked back upon two thousand years of history on the topic.

In the period beginning in 1666 he left off his mathematical endeavours and turned his attention to mechanics, mostly addressing the work of Descartes. He made some progress and even wondered, maybe inspired by observing a falling apple in his garden in Woolsthorpe, whether the force that causes things to fall to the Earth is the same as the force that prevents the Moon from shooting off at a tangent to its orbit. He did some back-of-an-envelope calculations, which, due to faulty data, showed that they weren’t, and he dropped the matter. He didn’t discover the laws of motion, and as he derived the law of gravity from Huygens’ law of centripetal force, which was first published in 1673, he certainly didn’t do that before he was thirty. In fact most of the work that went into Newton’s magnum opus, the Principia, was done in an amazing burst of concentrated effort in the years between 1684 and 1687, when Newton was already over forty.

What Newton did do between 1666 and 1672 was to conduct an extensive experimental programme into physical optics, in particular what he termed the phenomenon of colour. This programme resulted in the construction of the first reflecting telescope and in Newton’s legendary first paper, A Letter of Mr. Isaac Newton, Professor of the Mathematicks in the University of Cambridge; Containing His New Theory about Light and Colors, published in the Philosophical Transactions of the Royal Society in 1672. Apparently optics doesn’t interest NdGT. Around 1666 Newton also embarked on perhaps his most intensive and longest research programme, to discover the secrets of alchemy, whilst starting his lifelong obsession with the Bible and religion. The last two don’t exactly fit NdGT’s vision of enlightened objective truth.

Newton is without doubt an exceptional figure in the history of science, who has few equals, but like anybody else Newton’s achievements were based on long years of extensive and intensive work and study and are not the result of some sort of scientific miracle in his youth. Telling the truth about Newton’s life and work, rather than propagating the myths as NdGT does, gives students who are potential scientists a much better impression of what it means to be a scientist and is thus, in my opinion, to be preferred.

As a brief addendum, NdGT points out that Newton’s birthday is not actually 25 December (neither is Christ’s, by the way) because he was born before the calendar reform was introduced into Britain, so we should, if we are logical, be celebrating his birthday on 4 January. NdGT includes the following remark in his explanation: “But the Gregorian Calendar (an awesomely accurate reckoning of Earth’s annual time), introduced in 1584 [sic: it was promulgated in 1582] by Pope Gregory, was not yet adopted in Great Britain.” There is a certain irony in his praise, “an awesomely accurate reckoning of Earth’s annual time”, as this calendar was developed and introduced for purely religious reasons, again not exactly enlightened or objective.
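For readers who want to check the date shift for themselves, the conversion is simple arithmetic: between 1582 and 1700 the Julian calendar ran ten days behind the Gregorian. A minimal sketch in Python (Python’s datetime module uses the proleptic Gregorian calendar throughout, so the Julian offset has to be added by hand):

```python
from datetime import date, timedelta

# From 1582 until 28 February 1700 the Julian calendar
# lagged the Gregorian calendar by exactly 10 days.
JULIAN_OFFSET = timedelta(days=10)

# Newton's birthday, Old Style (Julian): Christmas Day 1642.
newton_julian = date(1642, 12, 25)

# Adding the offset gives the Gregorian (New Style) date.
newton_gregorian = newton_julian + JULIAN_OFFSET
print(newton_gregorian)  # 1643-01-04
```

Adding ten days to Julian 25 December 1642 does indeed land on Gregorian 4 January 1643, which is why, strictly speaking, Newton’s birthday falls in the new year.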





Filed under History of Astronomy, History of Mathematics, History of Physics, History of science, Myths of Science, Renaissance Science

The Queen of Science – The woman who tamed Laplace.

In a footnote to my recent post on the mythologizing of Ibn al-Haytham I briefly noted the inadequacy of the terms Arabic science and Islamic science, pointing out that there were scholars included in these categories who were not Muslims and ones who were not Arabic. In the comments Renaissance Mathematicus friend, the blogger theofloinn, asked, “Who were the non-muslim ‘muslim’ scientists? And (aside from Persians) who were the non-Arab ‘arab’ scientists?” And then in a follow-up comment wrote, “I knew about Hunayn ibn Ishaq and the House of Wisdom, but I was not thinking of translation as ‘doing science.’” From the standpoint of the historian of science this second comment is very interesting and reflects a common problem in the historiography of science. On the whole most people regard science as being that which scientists do, and when describing its history they tend to concentrate on the big-name scientists.

This attitude is a highly mistaken one that creates a falsified picture of scientific endeavour. Science is a collective enterprise in which the ‘scientists’ are only one part of a collective consisting of scientists, technicians, instrument designers and makers, and other supportive workers without whom the scientist could not carry out his or her work. This often includes such ignored people as the secretaries, or in earlier times amanuenses, who wrote up the scientific reports, or life partners who, invisible in the background, often carried out much of the drudgery of scientific investigation. My favourite example is William Herschel’s sister and housekeeper, Caroline (a successful astronomer in her own right), who sieved the horse manure on which he bedded his self-cast telescope mirrors to polish them.

Translators very definitely belong to the long list of so-called helpers without whom the scientific endeavour would grind to a halt. It was translators who made the Babylonian astronomy and astrology accessible to their Greek heirs thus making possible the work of Eudoxus, Hipparchus, Ptolemaeus and many others. It was translators who set the ball rolling for those Islamic, or if you prefer Arabic, scholars when they translated the treasures of Greek science into Arabic. It was again translators who kicked off the various scientific Renaissances in the twelfth and thirteenth-centuries and again in the fifteenth-century, thereby making the so-called European scientific revolution possible. All of these translators were also more or less scientists in their own right as without a working knowledge of the subject matter that they were translating they would not have been able to render the texts from one language into another. In fact there are many instances in the history of the transmission of scientific knowledge where an inadequate knowledge of the subject at hand led to an inaccurate or even false translation causing major problems for the scholars who tried to understand the texts in the new language. Translators have always been and continue to be an important part of the scientific endeavour.

The two most important works on celestial mechanics produced in Europe in the long eighteenth-century were Isaac Newton’s Philosophiæ Naturalis Principia Mathematica and Pierre-Simon, marquis de Laplace’s Mécanique céleste. The former was originally published in Latin, with an English translation being published shortly after the author’s death, and the latter in French. This meant that these works were only accessible to those who had mastered the respective language. It is a fascinating quirk of history that the former was rendered into French and the latter into English, in each case by a woman: Gabrielle-Émilie Le Tonnelier de Breteuil, Marquise du Châtelet translated Newton’s masterpiece into French and Mary Somerville translated Laplace’s pièce de résistance into English. I have blogged about Émilie du Châtelet before, but who was Mary Somerville? (1)


Mary Somerville by Thomas Phillips

She was born Mary Fairfax, the daughter of William Fairfax, a naval officer, and Mary Charters, at Jedburgh in the Scottish Borders on 26 December 1780. Her parents very definitely didn’t believe in education for women and she spent her childhood wandering through the Scottish countryside, developing a lifelong love of nature. At the age of ten, still semi-illiterate, she was sent to Miss Primrose’s boarding school at Musselburgh in Midlothian for one year; the only formal schooling she would ever receive. As a young lady she received lessons in dancing, music, painting and cookery. At the age of fifteen she came across a mathematical puzzle in a ladies’ magazine (mathematical recreation columns were quite common in ladies’ magazines in the 18th and 19th-centuries!) whilst visiting friends. Fascinated by the symbols that she didn’t understand, she was informed that it was algebra, a word that meant nothing to her. Later her painting teacher, whilst discussing the topic of perspective, revealed that she could learn geometry from Euclid’s Elements. With the assistance of her brother’s tutor (young ladies could not buy maths books) she acquired a copy of the Euclid as well as one of Bonnycastle’s Algebra and began to teach herself mathematics in the secrecy of her bedroom. When her parents discovered this they were mortified, her father saying to her mother, “Peg, we must put a stop to this, or we shall have Mary in a strait jacket one of these days. There is X., who went raving mad about the longitude.” They forbade her studies, but she persisted, rising at dawn to study until breakfast time. Her mother eventually allowed her to take some lessons on the terrestrial and celestial globes with the village schoolmaster.

In 1804 she was married off to a distant cousin, Samuel Greig, like her father a naval officer, but in the Russian Navy. He, like her parents, disapproved of her mathematical studies and she seemed condemned to the life of wife and mother. She bore two sons in her first marriage: David, who died in infancy, and Woronzow, who would later write a biography of Ada Lovelace. Fortunately for the young Mary, one could say, her husband died after only three years of marriage, in 1807, leaving her well enough off that she could now devote herself to her studies, which she duly did. Under the tutorship of William Wallace, later professor of mathematics in Edinburgh, she started on a course of mathematical study, mostly of French books but covering a wide range of mathematical topics, even tackling Newton’s Principia, which she found very difficult. She was by now already twenty-eight years old. During the next years she became a fixture in the highest intellectual circles of Edinburgh.

In 1812 she married for a second time, another cousin, William Somerville, and thus acquired the name under which she would become famous throughout Europe. Unlike her parents and Samuel Greig, William vigorously encouraged and supported her scientific interests. In 1816 the family moved to London. Due to her Scottish connections Mary soon became a member of the London intellectual scene and was on friendly terms with such luminaries as Thomas Young, Charles Babbage, John Herschel and many, many others, all of whom treated Mary as an equal in their wide-ranging scientific discussions. In 1817 the Somervilles went to Paris where Mary became acquainted with the cream of the French scientists, including Biot, Arago, Cuvier, Gay-Lussac, Laplace, Poisson and many more.

In 1824 William was appointed Physician to Chelsea Hospital where Mary began a series of scientific experiments on light and magnetism, which resulted in a first scientific paper published in the Philosophical Transactions of the Royal Society in 1826. In 1836, a second piece of Mary’s original research was presented to the Académie des Sciences by Arago. The third and last of her own researches appeared in the Philosophical Transactions in 1845. However it was not as a researcher that Mary Somerville made her mark but as a translator and populariser.

In 1827 Henry, Lord Brougham and Vaux, requested Mary to translate Laplace’s Mécanique céleste into English for the Society for the Diffusion of Useful Knowledge. Initially hesitant, she finally agreed, but only on the condition that the project remained secret and that it would only be published if judged fit for purpose; otherwise the manuscript should be burnt. She had met Laplace in 1817 and had maintained a scientific correspondence with him until his death in 1827. The translation took four years and was published as The Mechanism of the Heavens, with a dedication to Lord Brougham, in 1831. The manuscript had been refereed by John Herschel, Britain’s leading astronomer and a brilliant mathematician, who was thoroughly cognisant of the original; he found the translation much, much more than fit for the purpose. Laplace’s original text was written in a style that made it inaccessible to all but the best mathematicians. Mary Somerville did not just translate the text but made it accessible to all with a modicum of mathematics, simplifying and elucidating as she went. This wasn’t just a translation but a masterpiece. The text proved too vast for Brougham’s Library of Useful Knowledge, but on the recommendation of Herschel the publisher John Murray published the book at his own cost and risk, promising the author two thirds of the profits. The book was a smash hit, the first edition of 750 copies selling out almost instantly following glowing reviews by Herschel and others. In honour of the success the Royal Society commissioned a bust of Mrs Somerville to be placed in their Great Hall; she couldn’t, of course, become a member!

At the age of fifty-one Mary Somerville’s career as a science writer had started with a bang. Her Laplace translation was used as a textbook in English schools and universities for many years and went through many editions. Her elucidatory preface was extracted and published separately and also became a best seller. If she had never written another word she would still be hailed as a great translator and science writer, but she didn’t stop here. Over the next forty years Mary Somerville wrote three major works of semi-popular science: On the Connection of the Physical Sciences (1st ed. 1834), Physical Geography (1st ed. 1848) (she was now sixty-eight years old!) and, at the age of eighty-eight, On Molecular and Microscopic Science (1st ed. 1869). The first two were major successes, which went through many editions, each one extended, brought up to date, and improved. The third, which she later regretted having published, wasn’t as successful as her other books. Famously, in the history of science, William Whewell in his anonymous 1834 review of On the Connection of the Physical Sciences first used in print the term scientist, which he had coined a year earlier, but not, as is oft erroneously claimed, in reference to Mary Somerville.

Following the publication of On the Connection of the Physical Sciences Mary Somerville was awarded a state pension of £200 per annum, which was later raised to £300. Together with Caroline Herschel, Mary Somerville became the first female honorary member of the Royal Astronomical Society just one of many memberships and honorary memberships of learned societies throughout Europe and America. Somerville College Oxford, founded seven years after her death, was also named in her honour. She died on 28 November 1872, at the age of ninety-one, the obituary which appeared in the Morning Post on 2 December said, “Whatever difficulty we might experience in the middle of the nineteenth century in choosing a king of science, there could be no question whatever as to the queen of science.” The Times of the same date, “spoke of the high regard in which her services to science were held both by men of science and by the nation”.

As this is my contribution to Ada Lovelace Day, celebrating the role of women in the history of science, medicine, engineering, mathematics and technology, I will close by mentioning the role that Mary Somerville played in the life of Ada. A friend of Ada’s mother, the older woman became a scientific mentor and occasional mathematics tutor to the young Miss Byron. As her various attempts to make something of herself in science or mathematics all came to nought, Ada decided to take a leaf out of her mentor’s book and to turn to scientific translating. At the suggestion of Charles Wheatstone she chose to translate Luigi Menabrea’s essay on Babbage’s Analytical Engine, at Babbage’s suggestion elucidating the original text as her mentor had elucidated Laplace, and the rest is, as they say, history. I personally would wish that the founders of Ada Lovelace Day had chosen Mary Somerville instead as their figurehead, as she contributed much, much more to the history of science than her feted protégée.

(1) What follows is largely a very condensed version of Elizabeth C. Patterson’s excellent Somerville biography, “Mary Somerville”, The British Journal for the History of Science, Vol. 4, 1969, pp. 311–339.



Filed under History of Astronomy, History of Mathematics, History of Physics, History of science, Ladies of Science

The unfortunate backlash in the historiography of Islamic science

Anybody with a basic knowledge of the history of Western science will know that there is a standard narrative of its development that goes something like this. Its roots are firmly planted in the cultures of ancient Egypt and Babylon and it bloomed for the first time in ancient Greece, reaching a peak in the work of Ptolemaeus in astronomy and Galen in medicine in the second-century CE. It then went into decline along with the Roman Empire, effectively disappearing from Europe by the fifth-century CE. It began to re-emerge in the Islamic Empire[1] in the eighth-century CE, whence it was brought back into Europe beginning in the twelfth-century CE. In Europe it began to bloom again in the Renaissance, transforming into modern science in the so-called Scientific Revolution in the seventeenth-century. There is much that is questionable in this broad narrative, but that is not the subject of this post.

In earlier versions of this narrative, its European propagators claimed that the Islamic scholars who appropriated Greek knowledge in the eighth-century and then passed it back to their European successors, beginning in the twelfth-century, had only conserved that knowledge, effectively doing nothing with it and not increasing it. For these narrators their heroes of science were either ancient Greeks or Early Modern Europeans; Islamic scholars definitely did not belong to the pantheon. However, a later generation of historians of science began to research the work of those Islamic scholars, reading, transcribing, translating and analysing their work, and showing that they had in fact made substantial contributions to many areas of science and mathematics, contributions that had flowed into modern European science along with the earlier Greek, Babylonian and Egyptian contributions. They also showed that Islamic scholars such as al-Biruni, al-Kindi, al-Haytham, Ibn Sina, al-Khwarizmi and many others were on a level with such heroes of science as Archimedes, Ptolemaeus and Galen, or Kepler, Galileo and Newton. Although this work redressed the balance, there is still much work to be done on the breadth and depth of Islamic science.

Unfortunately the hagiographic, amateur, wannabe pop historians of science now entered the field keen to atone for the sins of the earlier Eurocentric historical narrative and began to exaggerate the achievements of the Islamic scholars to show how superior they were to the puny Europeans who stole their ideas, like the colonial bullies who stole their lands. There came into being a type of hagiographical popular history of Islamic science that owes more to the Thousand and One Nights than it does to any form of serious historical scholarship. I came across an example of this last week during the Gravity Fields Festival, an annual shindig put on in Grantham to celebrate the life and work of one Isaac Newton, late of that parish.

On Twitter Ammār ibn Aziz Ahmed (@Ammar_Ibn_AA) tweeted the following:

I’m sorry to let you know that Isaac Newton learned about gravity from the books of Ibn al-Haytham

I naturally responded, in my usual graceless style, that this statement was total rubbish, to which Ammār ibn Aziz Ahmed responded with a link to his ‘source’.

I answered this time somewhat more moderately that a very large part of that article is quite simply wrong. One of my Internet friends, a maths librarian (@MathsBooks) told me I was being unfair and that I should explain what was wrong with his source, so here I am.

The article in question is one of many potted biographies of al-Haytham that you can find dotted all over the Internet, most of which are virtual clones of each other. They all contain the same collection of legends, half-truths, myths and straightforward lies, usually without sources or, as in this case, quoting bad popular books written by a non-historian as their source. It is fairly obvious that they all plagiarise each other without bothering to consult original sources or the work done by real historians of science on the life and work of al-Haytham.

The biography of al-Haytham is, like that of most medieval Islamic scholars, badly documented and very patchy at best. Like most popular accounts this article starts with the legend of al-Haytham’s feigned madness and ten-year incarceration. This legend is not mentioned in all the biographical sources and should be viewed with extreme scepticism by anybody seriously interested in the man and his work. The article then moves on to the most pernicious modern myth concerning al-Haytham that he was the ‘first real scientist’.

This claim is based on a misrepresentation of what al-Haytham did. He did not, as the article claims, introduce the scientific method, whatever that might be. For a limited part of his work al-Haytham used experiments to prove points; for the majority of it he reasoned in exactly the same way as the Greek philosophers whose heir he was. Even where he used the experimental method he was doing nothing that could not be found in the work of Archimedes or Ptolemaeus. There is also an interesting discussion, outlined in Peter Dear’s Discipline and Experience (1995), as to whether al-Haytham used or understood experiments in the same way as researchers in the seventeenth-century; Dear concludes that he didn’t (pp. 51-53). It is, however, interesting to sketch how this ‘misunderstanding’ came about.

The original narrative of the development of Western science not only denied the contribution of the Islamic Empire but also claimed that the Middle Ages totally rejected science, modern science only emerging after the Renaissance had reclaimed the Greek scientific inheritance. The nineteenth-century French physicist and historian of science, Pierre Duhem, was the first to challenge this fairy tale, claiming instead, based on his own researches, that the Scientific Revolution didn’t take place in the seventeenth-century but in the High Middle Ages: “the mechanics and physics of which modern times are justifiably proud proceed, by an uninterrupted series of scarcely perceptible improvements, from doctrines professed in the heart of the medieval schools.” After the Second World War Duhem’s thesis was modernised by the Australian historian of science, Alistair C. Crombie, whose studies on medieval science in general and Robert Grosseteste in particular set a new high-water mark in the history of science. Crombie attributed the origins of modern science and the scientific method to Grosseteste and Roger Bacon in the twelfth and thirteenth-centuries, a view that has been somewhat modified and watered down by more recent historians, such as David Lindberg. Enter Matthias Schramm.

Matthias Schramm was a German historian of science who wrote his doctoral thesis on al-Haytham. A fan of Crombie’s work, Schramm argued that the principal scientific work of Grosseteste and Bacon in physical optics was based on the work of al-Haytham, correct for Bacon, not so for Grosseteste, and so he should be viewed as the originator of the scientific method and not they. He makes this claim in the introduction to his Ibn al-Haythams Weg zur Physik (1964), but doesn’t really substantiate it in the book itself. (And yes, I have read it!) Al-Haytham’s use of experiment is very limited and to credit him with being the inventor of the scientific method is a step too far. However, since Schramm made his claims they have been expanded, exaggerated and repeated ad nauseam by the al-Haytham hagiographers.

We now move on to what is without doubt al-Haytham’s greatest achievement, his Book of Optics, the most important work on physical optics written between Ptolemaeus in the second-century CE and Kepler in the seventeenth-century. Our author writes:

In his book, The Book of Optics, he was the first to disprove the ancient Greek idea that light comes out of the eye, bounces off objects, and comes back to the eye. He delved further into the way the eye itself works. Using dissections and the knowledge of previous scholars, he was able to begin to explain how light enters the eye, is focused, and is projected to the back of the eye.

Here our author demonstrates very clearly that he really has no idea what he is talking about. It should be very easy to write a clear and correct synopsis of al-Haytham’s achievements, as there is a considerable amount of very good literature on his Book of Optics, but our author gets it wrong[2].

Al-Haytham didn’t prove or disprove anything; he rationally argued for a plausible hypothesis concerning light and vision, which was later proved to be, to a large extent, correct by others. The idea that vision consists of rays (not light) coming out of the eyes (extramission) is only one of several ideas used to explain vision by Greek thinkers. That vision is the product of light entering the eyes (intromission) also originates with the Greeks. The idea that light bounces off every point of an object in every direction comes from al-Haytham’s Islamic predecessor al-Kindi. Al-Haytham’s great achievement was to combine an intromission theory of vision with the geometrical optics of Euclid, Heron and Ptolemaeus (who had supported an extramission theory), integrating al-Kindi’s punctiform theory of light reflection. In its essence, this theory is fundamentally correct. The second part of the paragraph quoted above, on the structure and function of the eye, is pure fantasy and bears no relation to al-Haytham’s work. His views on the subject were largely borrowed from Galen and were substantially wrong.

Next up we have the pinhole camera, or better the camera obscura. Although al-Haytham was probably the first to systematically investigate the camera obscura, its basic principle was already known to the Chinese philosopher Mo-Ti in the fifth-century BCE and to Aristotle in the fourth-century BCE. The claims for al-Haytham’s studies of atmospheric refraction are also hopelessly exaggerated.

We then have an interesting statement on the impact of al-Haytham’s optics; the author writes:

The translation of The Book of Optics had a huge impact on Europe. From it, later European scholars were able to build the same devices as he did, and understand the way light works. From this, such important things as eyeglasses, magnifying glasses, telescopes, and cameras were developed.

The Book of Optics did indeed have a massive impact on European optics in Latin translation, from the work of Bacon in the thirteenth century up to Kepler in the seventeenth century, and this is the principal reason why he counts as one of the very important figures in the history of science. However I wonder what devices the author is referring to here; I know of none. Interesting in this context is that The Book of Optics appears to have had very little impact on the development of physical optics in the Islamic Empire. One of the anomalies in the history of science and technology is the fact that, as far as we know, the developments in optical physics made by al-Haytham, Bacon, Witelo, Kepler et al. had no influence on the invention of optical instruments (spectacles, magnifying glasses, the telescope), which were developed along a parallel but totally separate path.

Moving out of optics, we get told about al-Haytham’s work in astronomy. It is true that he, like many other Islamic astronomers, criticised Ptolemaeus and suggested changes in his system, but his influence was small in comparison to other Islamic astronomers. What follows is a collection of total rubbish.

He had a great influence on Isaac Newton, who was aware of Ibn al-Haytham’s works.

He was not an influence on Newton. Newton would have been aware of al-Haytham’s work in optics, but by the time Newton did his own work in this field al-Haytham’s work had been superseded by that of Kepler, Scheiner, Descartes and Gregory, amongst others.

He studied the basis of calculus, which would later lead to the engineering formulas and methods used today.

Al-Haytham did not study the basis of calculus!

He also wrote about the laws governing the movement of bodies (later known as Newton’s 3 laws of motion)

Like many others before and after him, al-Haytham did discuss motion, but he did not come anywhere near formulating Newton’s laws of motion; this claim is just pure bullshit.

and the attraction between two bodies – gravity. It was not, in fact, the apple that fell from the tree that told Newton about gravity, but the books of Ibn al-Haytham.

We’re back in bullshit territory again!

If anybody thinks I should give a more detailed refutation of these claims and not just dismiss them as bullshit, I can’t, because al-Haytham never ever did the things being claimed. If you think he did, then please show me where he did so and I will be prepared to discuss the matter; till then I’ll stick to my bullshit!

I shall examine one more claim from this ghastly piece of hagiography. Our author writes the following:

When his books were translated into Latin as the Spanish conquered Muslim lands in the Iberian Peninsula, he was not referred to by his name, but rather as “Alhazen”. The practice of changing the names of great Muslim scholars to more European sounding names was common in the European Renaissance, as a means to discredit Muslims and erase their contributions to Christian Europe.

Alhazen is merely the attempt by the unknown Latin translator of The Book of Optics to transliterate the Arabic name al-Haytham; there was no discrimination intended or attempted.

Abū ʿAlī al-Ḥasan ibn al-Ḥasan ibn al-Haytham is without any doubt an important figure in the history of science whose contribution, particularly those in physical optics, should be known to anybody taking a serious interest in the subject, but he is not well served by inaccurate, factually false, hagiographic crap like that presented in the article I have briefly discussed here.






[1] Throughout this post I will refer to Islamic science, an inadequate but conventional term. An alternative would be Arabic science, which is equally problematic. Both terms refer to the science produced within the Islamic Empire, which was mostly written in Arabic, just as European science in the Middle Ages was mostly written in Latin. The terms are not intended to imply that all of the authors were Muslims, many of them were not, or Arabs, again many of them were not.

[2] For a good account of the history of optics including a detailed analysis of al-Haytham’s contributions read David C. Lindberg’s Theories of Vision: From al-Kindi to Kepler, University of Chicago Press, 1976.


Filed under History of Optics, History of Physics, Mediaeval Science, Myths of Science, Renaissance Science

Published on…

Today I have been mildly irritated by numerous tweets announcing 5th July 1687 as the day on which Isaac Newton’s Principia was published. Why? Partially because the claim is not strictly true and partially because it evokes a false set of images generated by the expression ‘published on’ in the current age.

In the last couple of decades we have become used to images of hordes of teens dressed in fantasy costumes as witches queuing up in front of large bookstores before midnight to participate in the launch of the latest volume of a series of children’s books on a juvenile wizard and his adventures. These dates were the days on which the respective volumes were published and although the works of other authors do not enjoy quite the same level of turbulence, they do also have an official publication date, usually celebrated in some suitable way by author and publisher. Historically this has not always been the case.

In earlier times books, particularly ones of a scientific nature, tended to dribble out into public awareness over a vague period of time rather than to be published on a specific date. There were no organised launches, no publisher’s parties populated by the glitterati of the age and no official publication date. Such books were indeed published in the sense of being made available to the reading public but the process was much more of a slapdash affair than that which the term evokes today.

One reason for this drawn out process of release was the fact that in the early centuries of the printed book they were often not bound for sale by the publisher. Expensive works of science were sold as an unbound pile of printed sheets, allowing the purchaser to have his copy bound to match the other volumes in his library. This meant that there were no pallets of finished bound copies that could be shipped off to the booksellers. Rather a potential purchaser would order the book and its bindings and wait for it to be finished for delivery.

Naturally historians of science love to be able to nail the appearance of some game changing historical masterpiece to a specific date, however this is not always possible. In the case of Copernicus’ De revolutionibus, for example, we are fairly certain of the month in 1543 that Petreius started shipping finished copies of the work but there is no specific date of publication. With other equally famous works, such as Galileo’s Sidereus Nuncius, the historian uses the date of signing of the dedication as a substitute date of publication.

So what of Newton’s Principia? Does it have an official date of publication, and if not, why are so many people announcing today as the anniversary of its publication? Principia was originally written in manuscript in three separate volumes and Edmond Halley, who acted both as editor and publisher, had to struggle with the cantankerous author to get those volumes out of his rooms in Cambridge and into the printing shop. In fact, due to the interference of Robert Hooke, who demanded credit for the discovery of the law of gravity, Newton contemplated not delivering the third volume at all. Thanks to Halley’s skilful diplomacy this crisis was overcome and the final volume was delivered up by the author and put into print. July 5th 1687 is not the date of publication as it is understood today, but the date of a letter that Halley sent to Newton announcing that the task of putting his immortal masterpiece onto the printed page had finally been completed and that he was sending him twenty copies for his own disposition. I reproduce the text of Halley’s letter below.


Honoured Sr

I have at length brought you Book to an end, and hope it will please you. the last errata came just in time to be inserted. I will present from you the books you desire to the R. Society, Mr Boyle, Mr Pagit, Mr Flamsteed and if there be any elce in town that you design to gratifie that way; and I have sent you to bestow on your friends in the University 20 Copies, which I entreat you to accept.[1]



[1] Richard S. Westfall, Never at Rest: A Biography of Isaac Newton, Cambridge University Press, Cambridge etc., 1980, p. 468.


Filed under Early Scientific Publishing, History of Astronomy, History of Physics, Myths of Science, Newton

Niels & Me: Dysgraphia – A history of science footnote.

One of the symptoms that, I think, most sufferers from mental illness share is the feeling of being alone with their daemons. “I’m the only one who feels like this!” “Why have I alone been afflicted?” This feeling of isolation and of having been somehow singled out for punishment in itself causes mental distress and deepens the crisis. An important step along the road to recovery is the realisation that one is not alone, that there are others who suffer similarly, that one hasn’t been singled out. I can still remember very clearly the day when I became certain that I am an adult ADD sufferer and a lot of my symptoms, including several that I didn’t regard as part of my illness, fell into place, received a label and a possible path back to mental health. As I have already related in my previous post, I had very similar feelings on discovering dysgraphia and realising that it was one of my central daemons. One of those revelations concerning dysgraphia actually has a close connection to my history of science obsession and, as this is a history of science blog, I would like to tell the story here.

As should be clear from the name of this blog my main interest as a historian of science lies with the mathematical sciences in the Early Modern Period, however I try not to be too narrow and get stuck in a historical cul-de-sac, only able to understand a very narrow field of science over a very short period of time. In order to maintain a broad overview of the history of science I buy and read general surveys of the histories of other disciplines in other periods. One such book that I own is Robert P. Crease and Charles C. Mann’s The Second Creation: Makers of the Revolution in Twentieth-Century Physics[1], which, if my memory serves me correctly, I bought on the recommendation of dog owner, physics blogger and popular science book author Chad Orzel; a recommendation that I would endorse. I vividly remember, shortly after I bought it, curling up in bed with the book for my half hour read before going to sleep and waking up rather than dozing off, as I read the revelatory words on the first pages of chapter two, The Man Who Talked. I’m now going to quote some fairly large chunks of those pages:

Bohr’s working habits have become legendary among his successors, part of the lore of science along with Einstein’s flyaway hair and Rutherford’s remark that relativity was not meant to be understood by Anglo-Saxons. Bohr talked. [emphasis in original] He discovered his ideas in the act of enunciating them, shaping thoughts as they came out of his mouth. Friends, colleagues, graduate students, all had Bohr gently entice them into long walks in the countryside around Copenhagen, the heavy clouds scudding overhead as Bohr thrust his hands into his overcoat pockets and settled into an endless, hesitant, recondite, barely audible monologue. While he spoke, he watched his listeners’ reactions, eager to establish a bond in a shared effort to articulate. Whispered phrases would be pronounced, only to be adjusted as Bohr struggled to express exactly [emphasis in original] what he meant; words were puzzled over, repeated, then tossed aside, and he was always ready to add a qualification, to modify a remark, to go back to the beginning, to start the explanation over again. Then flatteringly, he would abruptly thrust the subject on his listener – surely this cannot be all? what else is there? – his big, ponderous, heavy-lidded eyes intent on the response. Before it could come, however, Bohr would have started talking again, wrestling with the answer himself. He inspected the language with which an idea was expressed in the way a jeweller inspects an unfamiliar stone, slowly judging each facet by holding it before an intense light[2].

Now I would never be so presumptuous as to compare myself to Niels Bohr, but this paragraph resonated with me on so many levels that I almost felt sick with excitement when I read it. With slight differences, that is how I think, discover and formulate my ideas and my theories. In more recent years I sometimes feel really sorry for my listeners and try to throttle back the waterfall of words that pours out of my mouth; in earlier years I was not aware of my basically anti-social behaviour, lost in that stream-of-consciousness word flow. However, it was a paragraph two thirds of the way down the following page that made me sit bolt upright in bed.

As a schoolboy, Bohr’s worst subject had been Danish composition, and for the rest of his life he passed up no opportunity to avoid putting pen to paper. He dictated his entire doctoral dissertation to his mother, causing family rows when his father insisted that the budding Ph. D. should be forced to learn to write for himself; Bohr’s mother remained firm in her belief that the task was hopeless. It apparently was – most of Bohr’s later work and correspondence were dictated to his wife and a succession of secretaries and collaborators. Even with this assistance, it took him months to put together articles. Reading of his struggles, it is hard not to wonder if he was dyslexic[3]. [my emphasis]

I’m not a big fan of historical diagnosis by hearsay of illnesses that one or other famous figure from the past might have suffered. You could write an entire medical dictionary containing all the complaints that researchers have decided the artist Van Gogh suffered from, according to their interpretation of the available facts. However my own personal situation leads me to the conclusion that Messrs. Crease and Mann are wrong and that Niels Bohr was not dyslexic but dysgraphic.

If you suffer from a disability that has caused you years of mental stress, then to discover that a famous historical figure suffered from the same ailment and despite this handicap was successful can be an incredible boost. Knowing that Bohr needed assistance to write his papers takes away some of the shame that I feel in having to ask people to check and correct the things that I write. As I said at the beginning, it’s knowing that you’re not alone.




[1] Robert P. Crease and Charles C. Mann, The Second Creation: Makers of the Revolution in Twentieth-Century Physics, Rutgers University Press, New Brunswick, New Jersey, Revised ed., 1996.

[2] Crease & Mann, p. 20.

[3] Crease & Mann, p. 21.


Filed under Autobiographical, History of Physics, History of science

Science grows on the fertilizer of disagreement

At the weekend German television presented me with all three episodes of Jim Al-Khalili’s documentary on the history of electricity, Shock and Awe: The Story of Electricity. On the whole I found it rather tedious largely because I don’t like my science or history of science served up by a star presenter who is the centre of the action rather than the science itself, a common situation with the documentaries of ‘he who shall not be named’-TPBoPS, and NdGT. It seems that we are supposed to learn whatever it is that the documentary nominally offers by zooming in on the thoughtful features of the presenter, viewing his skilfully lit profile or following him as he walks purposefully, thoughtfully, meaningfully or pensively through the landscape. What comes out is “The Brian/Neil/Jim Show” with added science on the side, which doesn’t really convince me, but maybe I’m just getting old.

However my criticism of the production style of modern television science programmes is not the real aim of this post; I’m much more interested in the core of the first episode of Al-Khalili’s documentary. The episode opened and closed with the story of Humphry Davy constructing the then largest battery in the world in the cellars of the Royal Institution in order to make the first ever public demonstration of an arc lamp and thus to spark the developments that would eventually lead to electric lighting. Having started here the programme moved back in time to the electrical experiments of Francis Hauksbee at the Royal Society under the auspices of Isaac Newton. Al-Khalili then followed the development of electrical research through the eighteenth century, presenting the work of the usual suspects, Stephen Gray, Benjamin Franklin etc., until we arrived at the scientific dispute between the two great Italian physicists Luigi Galvani and Alessandro Volta that resulted in the invention of the Voltaic pile, the forerunner of the battery and the first producer of a consistent electrochemical current. All of this was OK and I have no real criticisms, although I was slightly irked by constant references to ‘Hauksbee’s’ generator when the instrument in question was an adaptation, suggested by Newton, of an invention of Otto von Guericke, who didn’t get a single name check. What did irritate me and inspired this post was the framing of the Galvani-Volta dispute.

Al-Khalili, a gnu atheist of the milder variety, presented this as a conflict between irrational religious persuasion, Galvani, and rational scientific heuristic, Volta, culminating in a victory for science over religion. In choosing so to present this historical episode Al-Khalili, in my opinion, missed a much more important message in scientific methodology, which was in fact spelt out in the fairly detailed presentation of the successive stages of the dispute. Galvani made his famous discovery of twitching frog’s legs and, after a series of further experiments, published his theory of animal electricity. Volta was initially impressed by Galvani’s work and at first accepted his theory. Upon deeper thought he decided Galvani’s interpretation of the observed phenomena was wrong and conducted his own series of experiments to prove Galvani wrong and establish his own theory. After Volta published his refutation of Galvani’s theory, the latter, not prepared to abandon his standpoint, also carried out a series of new experiments to prove his opponent wrong and his own theory right. One of these experiments led Volta to the right explanation, within the knowledge framework of the period, and to the discovery of the Voltaic pile. What we see here is a very important part of scientific methodology: researchers holding conflicting theories spurring each other on to new discoveries and deeper knowledge of the field under examination. The heuristics of the two are almost irrelevant; what is important here is disagreement as a research motor. Also very nicely illustrated is discovery as an evolutionary process spread over time rather than the infamous eureka moment.

The inspiration produced by watching Al-Khalili’s story of the invention of the battery chimes in very nicely with another post I was planning on writing. In a recent blog post, Joe Hanson of “it’s OKAY to be SMART” wrote about Galileo and the first telescopic observations of sunspots at the beginning of the seventeenth century. The post is OK as far as it goes, even managing to give credit to Thomas Harriot and Johannes Fabricius, however it contains one truly terrible sentence that caused my hackles to rise. Hanson wrote:

Although Galileo’s published sunspot work was the most important of its day, on account of the “that’s no moon” smackdown it delivered to the Jesuit scientific community, G-dub was not the first to observe the solar speckles.

Here we have another crass example of the modern anti-religious sentiment of a science writer getting in the way of sensible history of science. What we are talking about here is not the Jesuit scientific community but the single Jesuit physicist and astronomer Christoph Scheiner, who famously became embroiled in a dispute on the nature of sunspots with Galileo. Once again we also have an excellent example of scientific disagreement driving the progress of scientific research. Scheiner and Galileo discovered sunspots with their telescopes independently of each other at about the same time and it was Scheiner who first published the results of his discoveries, together with an erroneous theory as to the nature of sunspots. Galileo had at this point not written up his own observations, let alone developed a theory to explain them. Spurred on by Scheiner’s publication he now proceeded to do so, challenging Scheiner’s claim that the sunspots were orbiting the sun and stating instead that they were on the solar surface. An exchange of views developed, with each of the adversaries making new observations and calculations to support their own theories. Galileo was not only able to demonstrate that sunspots were on the surface of the sun but also to prove that the sun was rotating on its axis, as already hypothesised by Johannes Kepler. Scheiner, an excellent astronomer and mathematician, accepted Galileo’s proofs and graciously acknowledged defeat. However whereas Galileo now effectively gave up his solar observations, Scheiner developed new sophisticated observation equipment and carried out an extensive programme of solar research in which he discovered, amongst other things, that the sun’s axis is tilted with respect to the ecliptic. Here again we have two first class researchers propelling each other to new important discoveries because of conflicting views on how to interpret observed phenomena.

My third example of disagreement as a driving force in scientific discovery is not one that I’ve met recently but one whose misrepresentation has annoyed me for many years; it concerns Albert Einstein and quantum mechanics. I have lost count of the number of times that I’ve read some ignorant know-it-all mocking Einstein for having rejected quantum mechanics. That Einstein vehemently rejected the so-called Copenhagen interpretation of quantum mechanics is a matter of record, but his motivation for doing so and the result of that rejection are often crassly misrepresented by those eager to score one over the great Albert. Quantum mechanics as initially presented by Niels Bohr, Erwin Schrödinger, Werner Heisenberg et al. contradicted Einstein’s fundamentally determinist metaphysical conception of physics. It was not that he didn’t understand it, after all he had made several significant contributions to its evolution, but he didn’t believe it was a correct interpretation of the real physical world. Einstein being Einstein, he didn’t just sit in the corner and sulk but actively searched for weak points in the new theory, trying to demonstrate its incorrectness. There developed a to and fro between Einstein and Bohr, with the former picking holes in the theory and the latter closing them up again. Bohr is on record as saying that Einstein, through his informed criticism, probably contributed more to the development of the new theory than any other single physicist. The high point of Einstein’s campaign against quantum mechanics was the so-called EPR (Einstein-Podolsky-Rosen) paradox, a thought experiment which sought to show that quantum mechanics as it stood would lead to unacceptable or even impossible consequences. On the basis of EPR the Irish physicist John Bell developed a testable theorem, which when tested showed quantum mechanics to be basically correct and Einstein wrong, a major step forward in the establishment of quantum physics. Although proved wrong in the end, Einstein’s criticism of and disagreement with quantum mechanics contributed immensely to the theory’s evolution.
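To make Bell’s contribution a little more concrete: in its CHSH form the theorem states that any local hidden-variable theory must satisfy |S| ≤ 2 for a particular combination of four correlation measurements, whereas quantum mechanics predicts values up to 2√2. A minimal numerical sketch using the standard textbook correlation for the spin singlet state (a modern illustration, not anything from the historical exchange itself):

```python
import math

# Singlet-state correlation between spin measurements along directions
# separated by angle (a - b); this is the standard quantum prediction.
def E(a, b):
    return -math.cos(a - b)

# Measurement angles that maximise the quantum violation of CHSH:
a, a2 = 0.0, math.pi / 2
b, b2 = math.pi / 4, 3 * math.pi / 4

# Any local hidden-variable theory requires |S| <= 2.
S = E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2)
print(abs(S))  # 2*sqrt(2) ≈ 2.828 > 2: the classical bound is violated
```

Experiments testing exactly this kind of inequality are what vindicated quantum mechanics against Einstein’s objection.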

Story-time popular presentations of the history of science very often present the progress of science as a series of eureka moments achieved by solitary geniuses, their results then being gratefully accepted by the worshipping scientific community. Critics who refuse to acknowledge the truth of the new discoveries are dismissed as pitiful fools who failed to understand. In reality new theories almost always come into being in an intellectual conflict and are tested, improved and advanced by that conflict, the end result being the product of several conflicting minds and opinions struggling with the phenomena to be explained over, often substantial, periods of time, and not the product of a flash of inspiration by one single genius. As the title says, science grows on the fertilizer of disagreement.


Filed under History of Astronomy, History of Physics, History of science, Myths of Science

What Isaac actually asked the apple.

Yesterday on my twitter stream people were retweeting the following quote:

“Millions saw the apple fall, but Newton asked why.” —Bernard Baruch

For those who don’t know, Bernard Baruch was an American financier and presidential advisor. I can only assume that those who retweeted it did so because they believe that it is in some way significant. As a historian of science I find it is significant because it is fundamentally wrong in two different ways and because it perpetuates a false understanding of Newton’s apple story. For the purposes of this post I shall ignore the historical debate about the truth or falsity of the apple story, an interesting discussion of which you can read here in the comments, and just assume that it is true. I should however point out that in the story, as told by Newton to at least two different people, he was not hit on the head by the apple and he did not in a blinding flash of inspiration discover the inverse square law of gravity. Both of these commonly held beliefs are myths created in the centuries after Newton’s death.

Our quote above implies that, of all the millions of people who saw apples, or any other objects for that matter, fall, Newton was the first or even perhaps the only one to ask why. This is of course complete and utter rubbish; people have been asking why objects fall probably ever since the hominoid brain became capable of some sort of primitive thought. In the western world the answer to this question that was most widely accepted in the centuries before Newton was born was the one supplied by Aristotle. Aristotle thought that objects fall because it was in their nature to do so. They had a longing, desire, instinct or whatever you choose to call it to return to their natural resting place, the earth. This is of course an animistic theory of matter, attributing as it does some sort of spirit to matter to fulfil a desire.

Aristotle’s answer stems from his theory of the elements of matter that he inherited from Empedocles. According to this theory all matter on the earth consisted of varying mixtures of four elements: earth, water, fire and air. In an ideal world they would be totally separated: a sphere of earth enclosed in a sphere of water, enclosed in a sphere of air, which in turn was enclosed in a sphere of fire. Outside of the sphere of fire the heavens consisted of a fifth pure element, aether or, as it became known in Latin, the quintessence. In our world objects consist of mixtures of the four elements, which, given the chance, strive to return to their natural position in the scheme of things. Heavy objects, consisting as they do largely of earth and water, strive downwards towards the earth; light objects, such as smoke or fire, strive upwards.

To understand what Isaac did ask the apple we have to take a brief look at the two thousand years between Aristotle and Newton.

Ignoring for a moment the Stoics, nobody really challenged the Aristotelian elemental theory, which is metaphysical in nature, but over the centuries scholars did challenge his physical theory of movement. Before moving on we should point out that Aristotle said that vertical movement on the earth, upwards or downwards, was natural and all other movement was unnatural or violent, whereas in the heavens circular movement was natural.

Already in the sixth century CE John Philoponus began to question and criticise Aristotle’s physical laws of motion, an attitude that was taken up and extended by Islamic scholars in the Middle Ages. Following the lead of their Islamic colleagues, the so-called Paris physicists of the fourteenth century developed the impetus theory, which said that when an object was thrown the thrower imparted an impetus to it, which carried it through the air and was gradually exhausted, until, when spent, the object fell to the ground. Slightly earlier their Oxford colleagues, the Calculatores of Merton College, had in fact discovered Galileo’s mathematical law of fall. The two theories together provided a quasi-mathematical explanation of movement, at least here on the earth.
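The law of fall in question says that the distance covered by a uniformly accelerated body grows with the square of the elapsed time, which is equivalent to the distances covered in successive equal time intervals standing in the ratio 1 : 3 : 5 : 7. A quick numerical check (a modern illustration with an arbitrary acceleration, not a calculation anyone performed in the fourteenth century):

```python
# Galileo's law of fall: distance grows as the square of time, so the
# distances covered in successive equal intervals follow the odd numbers
# 1 : 3 : 5 : 7 (the "odd-number rule").
g = 9.81  # m/s^2; any constant acceleration gives the same ratios

def distance(t):
    return 0.5 * g * t ** 2

# Distance covered in each successive one-second interval:
increments = [distance(t + 1) - distance(t) for t in range(4)]
ratios = [d / increments[0] for d in increments]
print(ratios)  # ≈ [1.0, 3.0, 5.0, 7.0]
```

The same odd-number pattern is what Galileo later demonstrated experimentally with his inclined planes.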

You might be wondering what all of this has to do with Isaac and his apple, but have a little patience; we will arrive in Grantham in due course.

In the sixteenth century various mathematicians, such as Tartaglia and Benedetti, extended the mathematical investigation of movement, the latter anticipating Galileo in almost all of his famous discoveries. At the beginning of the seventeenth century Simon Stevin and Galileo deepened these studies once more, the latter developing very elegant experiments to demonstrate and confirm the laws of fall, which were later in the century confirmed by Riccioli. Meanwhile their contemporary Kepler was the first to replace the Aristotelian animistic concept of movement with one driven by a non-living force, even if it was not very clear what that force was. During the seventeenth century others, such as Beeckman, Descartes, Borelli and Huygens, further developed Kepler’s concept of force, meanwhile banishing Aristotle’s moving spirits from their mechanistic philosophy. Galileo, Beeckman and Descartes replaced the medieval impetus theory with the theory of inertia, which says that objects in a vacuum will either remain at rest or continue to travel in a straight line unless acted upon by a force. Galileo, who still hung on to the Greek concept of perfect circular motion, had problems with the straight-line bit, but Beeckman and Descartes straightened him out. The theory of inertia was to become Newton’s first law of motion.

We have now finally arrived at that idyllic summer afternoon in Grantham in 1666, as the young Isaac Newton, home from university to avoid the plague, whilst lying in his mother’s garden contemplating the universe, as one does, chanced to see an apple fall from a tree. Newton didn’t ask why it fell, but set off on a much more interesting, complicated and fruitful line of speculation. Newton’s line of thought went something like this: if Descartes is right with his theory of inertia, and in those days young Isaac was still a fan of the Gallic philosopher, then there must be some force pulling the moon down towards the earth, preventing it from shooting off in a straight line at a tangent to its orbit. What if, he thought, the force that holds the moon in its orbit and the force that causes the apple to fall to the ground were one and the same? This frighteningly simple thought is the germ out of which Newton’s theory of universal gravity and his masterpiece the Principia grew, that growth taking several years and a lot of very hard work. No instant discoveries here.

Being somewhat of a mathematical genius, young Isaac did a quick back-of-an-envelope calculation and, behold, his theory didn’t fit! They weren’t the same force at all! What had gone wrong? In fact there was nothing wrong with Newton’s theory at all, but the figure that he had for the size of the earth was inaccurate enough to throw his calculations off. As a side note, although the expression back-of-an-envelope calculation is just a turn of phrase, in Newton’s case it was often very near the truth. In Newton’s papers there are mathematical calculations scribbled on shopping lists, in the margins of letters, in fact on any and every available scrap of paper that happened to be at hand in the moment.

Newton didn’t forget his idea and later, when he repeated those calculations with the brand new accurate figure for the size of the earth supplied by Picard, he could indeed show that the chain of thought inspired by that tumbling apple had been correct.
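For the curious, the calculation Newton was attempting, later known as the ‘moon test’, can be sketched with modern values (the figures below are modern ones, not those Newton had available):

```python
import math

# Modern values; Newton's 1666 attempt failed because his figure for
# the size of the earth was too inaccurate.
g = 9.81            # acceleration of fall at the earth's surface, m/s^2
r_earth = 6.371e6   # earth's mean radius, m
r_moon = 3.844e8    # mean earth-moon distance, m
T = 27.322 * 86400  # sidereal month, s

# Centripetal acceleration needed to hold the moon in its orbit:
a_moon = 4 * math.pi ** 2 * r_moon / T ** 2

# If one and the same force, weakening as the inverse square of the
# distance, causes both the apple's fall and the moon's deflection,
# then g / a_moon should equal (r_moon / r_earth) ** 2, roughly 60 ** 2.
print(g / a_moon)               # ≈ 3600
print((r_moon / r_earth) ** 2)  # ≈ 3640
```

With modern figures the two numbers agree to about one per cent; with an inaccurate value for the earth’s radius the ratio comes out visibly wrong, which is exactly the mismatch that spoiled Newton’s first attempt.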



Filed under History of Astronomy, History of Mathematics, History of Physics, History of science, Myths of Science, Newton