Top 10 science anniversaries in 2020
The discovery of electromagnetism and X-rays, plus the first atomic bomb, made this year’s list
2020, the International Year of Good Vision, is also a good year for scientific anniversaries.
As usual, there are the birthday anniversaries, offering an opportunity to recognize some of the great scientists of the past for their contributions to humankind’s collective knowledge. And there are the anniversaries of accomplishments, discoveries or events that left the world a different place than it had been before. There’s even an Einstein anniversary, which there almost always is.
What’s more, by selecting the Top 10 anniversaries carefully, you can illustrate how often key scientific concepts are intertwined — neutrons with bombs, for example, or magnetism with X-rays with DNA. So here, without any deep meaning to the order of presentation, are the Top 10 Science Anniversaries in 2020:
10. Roger Bacon, 800th birthday
Nobody knows for sure exactly when Bacon was born, but a passage in his writings suggests that it was around 1220. He was among the premier natural philosophers of his day; he studied first at Oxford and then lectured at the University of Paris. He became a Franciscan monk but often got in trouble for breaking the order’s rules.
Bacon was among the first to advocate for the importance of experiment in investigating nature. He especially emphasized the status of optics as a fundamental science. Bacon also understood the necessity of applying math when explaining natural phenomena. “The power of mathematics is capable of unfolding the causes of all things, and of giving a sufficient explanation of human and divine phenomena,” he wrote. Bacon thought that many big-name philosophers of his era were dolts, but revered the philosopher-theologian Robert Grosseteste, and developed some of his ideas more fully, including the role of mathematics and the notion that “laws of nature” governed natural phenomena.
9. Bose-Einstein condensate, 25th anniversary
No scientist has made more news after their death than Albert Einstein. From lasers to black holes to gravitational waves, multiple major modern discoveries have merely verified predictions from decades earlier rooted in Einstein’s imagination. One such example came in 1995 when physicists produced a new weird wavy form of matter called a Bose-Einstein condensate. In this case Einstein’s imagination was inspired by the Indian physicist Satyendra Bose.
In 1924 Bose sent Einstein a paper describing (mathematically) light as a gas of particles (what we now call photons). Around that time Einstein read a paper by Louis de Broglie contending that matter particles (such as electrons) could be construed as waves. Einstein mashed up de Broglie with Bose and ended up describing matter with Bose’s math. Einstein envisioned wavy “boson” atoms that would merge into a kind of cloud of unified matter.
Making such a Bose-Einstein condensate cloud requires special conditions (it must be extremely cold, for one thing), and it took seven decades before physicists overcame the technical challenges and proved Einstein right, once again.
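How cold "extremely cold" must be can be stated with a standard textbook criterion (a later formalization, not from Einstein's own papers): condensation sets in roughly when each atom's thermal de Broglie wavelength grows comparable to the average spacing between atoms, so that the atomic waves overlap.

```latex
\lambda_{\mathrm{dB}} = \frac{h}{\sqrt{2\pi m k_B T}},
\qquad
n\,\lambda_{\mathrm{dB}}^{3} \;\gtrsim\; \zeta(3/2) \approx 2.612
```

Here $h$ is Planck's constant, $m$ the atomic mass, $k_B$ Boltzmann's constant, $T$ the temperature and $n$ the number density. Because $\lambda_{\mathrm{dB}}$ grows as $T$ falls, dilute gases satisfy the condition only at temperatures within a whisker of absolute zero, which is why the 1995 experiments took seven decades of cooling technology to achieve.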
8. The Great Debate, centennial
Forget politics, the greatest debate of the 20th century took place on April 26, 1920, when astronomers Harlow Shapley and Heber Curtis faced off at the Smithsonian's National Museum of Natural History in Washington, D.C. Or at least that is the standard scientific lore.
Actually, the debate was pretty boring. Shapley read a paper about the current understanding of the Milky Way galaxy, which he believed to constitute the whole universe. Curtis read a paper contending that spiral-shaped nebulae visible through telescopes were in fact distant island universes comparable to the Milky Way. The winner of the debate was not announced until 1924, when Edwin Hubble showed that Curtis was right. Shapley conceded and for a while referred to the new cosmos of multiple galaxies as a multiverse.
7. Discovery of electromagnetism, bicentennial
Usually it’s not a good idea to play around with electricity. But two centuries ago, scientists didn’t know very much about it and curiosity got the better of them. Good thing, because that curiosity led to a discovery of unexaggeratable importance for the future of civilization.
A first step was Alessandro Volta’s primitive battery, invented in 1800. It launched a frenzy of electrical experimentation. Over the next 20 years many researchers investigated possible links between electricity and magnetism. Among them was Hans Christian Oersted at the University of Copenhagen, a chemist-physicist who had originally been trained as a pharmacist. Oersted had long suspected that electricity and magnetism shared a deep unity. During a lecture in the spring of 1820, he noticed that a current caused a nearby compass needle to move.
By July Oersted had conducted (get it?) thorough experiments enabling him to announce the discovery of electromagnetism — the generation of a magnetic emanation outside a wire carrying an electric current. About a decade later Michael Faraday showed the opposite, that moving a magnet around a wire induces an electric current. That established the principle behind electric power generation on large scales.
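In modern notation (a formalization that came later, not Faraday's own language), the induction principle behind power generation says the voltage driven around a circuit loop equals the rate of change of the magnetic flux passing through it:

```latex
\mathcal{E} = -\frac{d\Phi_B}{dt},
\qquad
\Phi_B = \int_S \mathbf{B}\cdot d\mathbf{A}
```

Spinning a magnet near a coil makes $\Phi_B$ change continuously, driving the alternating current that power plants have delivered ever since.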
6. Discovery of X-rays, 125th anniversary
When Wilhelm Röntgen discovered X-rays in 1895, they were almost immediately put to use in medical practice. But they had a scientific significance just as great as their truly revolutionary importance for medicine.
For one thing, they bolstered the relatively recent realization that light was just one of several forms of electromagnetic radiation. (Only a few years earlier Heinrich Hertz had demonstrated the existence of radio waves, verifying James Clerk Maxwell’s suspicion that light was not the only form of electromagnetic waves.) “There seems to exist some kind of relationship between the new rays and light rays; at least this is indicated by the formation of shadows,” Röntgen wrote in his first report of the discovery. Ironically, later experiments on X-rays showed that electromagnetic “waves” sometimes behave as particles.
Eventually, X-rays transformed not only medicine but also astronomy and even biology, as they provided the tool that revealed the architecture of the molecules of life. See item 5.
5. Rosalind Franklin, 100th birthday
Franklin, born on July 25, 1920, in London, showed an early interest in science and trained as a chemist, becoming an expert on coal and other carbon-based materials. She earned a doctorate from the University of Cambridge in 1945. She then worked in Paris, developing skills at using X-ray crystallography to study crystalline structures, before moving to King’s College London, where Maurice Wilkins had been studying the molecular structure of DNA. Franklin took up DNA studies and produced exceptional X-ray images. She came close to determining DNA’s double-helix structure, but didn’t get it quite right.
Meanwhile James Watson, who had been following her research, was shown one of her X-ray images by Wilkins in early 1953, enabling Watson and Francis Crick to deduce the correct DNA architecture. Franklin saw that the Watson-Crick model was consistent with her work, but didn’t immediately accept that the model would ultimately turn out to be right in detail. She died in 1958, and so was not eligible for the Nobel Prize, awarded four years later to Watson and Crick. Wilkins also shared the prize, but there is no doubt that had she still been alive, Franklin would have deserved it more than he did.
4. John Graunt, 400th birthday
Born on April 24, 1620, in London, Graunt became a successful and influential merchant after taking over his father’s drapery business. Around age 40, for some reason he became interested in the weekly “Bills of Mortality” that enumerated deaths in the city. It occurred to him to also collect records of births and diseases to create tables that showed trends or patterns. He subjected the data to mathematical analyses, revealing insights such as that women lived (on average) longer than men, and that death rates were higher in cities than in rural areas.
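Graunt's table-based reasoning amounts to grouping records and comparing averages. A minimal sketch of that style of analysis, using made-up illustrative records (not Graunt's actual Bills of Mortality figures):

```python
# Toy records in the spirit of Graunt's tables; the numbers are
# hypothetical, chosen only to illustrate the method.
records = [
    {"sex": "F", "age_at_death": 44, "place": "city"},
    {"sex": "M", "age_at_death": 38, "place": "city"},
    {"sex": "F", "age_at_death": 52, "place": "rural"},
    {"sex": "M", "age_at_death": 41, "place": "rural"},
]

def mean_age(rows, sex):
    """Average age at death for one sex -- the kind of summary
    Graunt extracted from his mortality tables."""
    ages = [r["age_at_death"] for r in rows if r["sex"] == sex]
    return sum(ages) / len(ages)

# With these toy numbers, women outlive men on average,
# echoing the pattern Graunt reported.
print(mean_age(records, "F"))  # 48.0
print(mean_age(records, "M"))  # 39.5
```

The same grouping trick, applied to place of residence instead of sex, yields the city-versus-country comparison.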
Graunt’s work earned him election to the Royal Society, but the Great Fire of London in 1666 burned down his house, damaging his business and sending him straight into poverty. Graunt was later recognized as the pioneer of drawing scientific conclusions from the analysis of statistical information; his work is considered a cornerstone in the foundation of the modern sciences of statistics and demography.
3. Florence Nightingale, 200th birthday
Nightingale was born to a British family in Florence, Italy, (coincidence? no) on May 12, 1820. Her family moved back to England while she was still an infant. She is best known as the most famous nurse of the 19th century, the lady with a lamp. But she was also an innovative practitioner of applied statistics; she developed sophisticated statistical analyses to support her views on hygiene and health.
She went to nursing school in Germany, and in 1854 she led a team of nurses to aid wounded British soldiers during the Crimean War. Finding horrifyingly unsanitary conditions, she instituted a cleanliness regimen that reduced the death rate among hospitalized soldiers, and she returned to England to wide acclaim. She had single-handedly elevated the social status of the nursing profession and soon she started her own nursing school. She became an expert in interpreting health statistics, and her methods influenced the development of the science of epidemiology. She presented much of the statistical evidence for the benefits of proper health standards in graphical form, earning her a reputation as a pioneer of data visualization. (Her skill at communicating the statistical evidence was instrumental in getting policy makers to adopt her recommendations.)
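Her best-known graphic was the polar-area ("coxcomb") diagram, in which each month gets an equal-angle wedge whose area, not radius, is proportional to the death count — so the radius must scale with the square root of the value. A minimal sketch of that encoding, with made-up monthly counts:

```python
import math

# Hypothetical monthly death counts (illustrative only,
# not Nightingale's actual Crimean War data).
deaths = {"Jan": 120, "Feb": 80, "Mar": 45}

# Equal angular wedge for each month.
angle = 2 * math.pi / len(deaths)

# Area of a wedge is (1/2) * r^2 * angle; solving for r makes
# the wedge AREA proportional to the count, as Nightingale intended.
radii = {month: math.sqrt(2 * count / angle)
         for month, count in deaths.items()}

for month, r in radii.items():
    area = 0.5 * r**2 * angle  # recovers the original count
    print(month, round(r, 2), round(area, 1))
```

Encoding the value in area rather than radius keeps big months from visually overwhelming the chart, one reason the design communicated so effectively to policy makers.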
Sadly, at age 38, she became mostly bedridden from a debilitating disease she had contracted during her Crimean War work. But she continued working from her home for decades, consulting with governments in various countries on how to best implement sanitation and other health-related policies.
2. Prediction of the neutron, centennial
After Ernest Rutherford discovered the atomic nucleus, in 1911, scientists spent years trying to understand how the nucleus was put together. It clearly required constituents with a positive electric charge. From later experiments Rutherford deduced that the basic nuclear particle carrying positive charge was identical to a hydrogen atom’s nucleus, and he named it the proton. Heavier atoms contained multiple protons.
But the number of protons needed to account for an atom’s mass gave the nucleus more positive electric charge than the negative charge of the atom’s orbiting (nearly massless) electrons. Since atoms are electrically neutral, it seemed that the nucleus must contain some electrons to cancel out the excess positive charge. Rutherford surmised that some of those electrons in the nucleus merged with protons to make a new particle that he later called the neutron. He considered it a new kind of atom, with zero electric charge. “In consequence it should be able to move freely through matter,” he said in a lecture delivered June 3, 1920, making it capable, physicists realized two decades later, of initiating nuclear fission chain reactions.
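The charge-accounting puzzle behind Rutherford's guess can be made concrete with nitrogen-14 (a worked illustration in modern numbers, not Rutherford's own calculation): a mass of 14 proton-masses suggests 14 protons, but nitrogen's nucleus carries a net charge of only +7.

```python
# Pre-1932 nuclear bookkeeping, illustrated for nitrogen-14.
mass_number = 14      # atomic mass, in units of the proton mass
nuclear_charge = 7    # net positive charge of the nitrogen nucleus

# If the nucleus held one proton per unit of mass...
protons = mass_number
# ...neutralizing the excess charge would require this many
# electrons confined inside the nucleus:
nuclear_electrons = protons - nuclear_charge
print(nuclear_electrons)  # 7

# Rutherford's proposal: each such electron merges with a proton to
# form a neutral particle -- the neutron. In modern terms, the count
# of neutrons is the same arithmetic:
neutrons = mass_number - nuclear_charge
print(neutrons)  # 7
```

The arithmetic is identical either way; what changed in 1932 was the interpretation, from "proton plus trapped electron" to a genuinely new neutral particle.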
In 1932, experiments by the British physicist James Chadwick confirmed the existence of the neutron, surprising many physicists who had not believed Rutherford. But one scientist not surprised was the American chemist William Harkins, who had made a similar proposal and was actually first to use the term “neutron” in print, in 1921.
1. Atomic bomb, 75th anniversary
It’s hard to overstate the significance for science, or for all of history, of the atomic bomb, first exploded 75 years ago in July at Alamogordo, N.M. It represents a technological discontinuity comparable to the discovery of electromagnetism, the invention of gunpowder or (the control of) fire itself. The atomic bomb’s main influence on society has been via its mere existence as a weapon in waiting, potentially ready to initiate Armageddon.
But it also still serves as a symbol of the power of science: Physicists probing the unseeable realm of the innards of atoms harnessed knowledge capable of destruction on a scale previously unimaginable. Applying the same knowledge to benefit society through the production of energy has not lived up to its advance billing, through a combination of ineptness on the part of its proponents and lack of perspective on the part of its opponents. In any case, the bomb’s reminder of science’s importance to society will never fail to persist in the future, if any.