SANTA BARBARA, Calif. — The word “crisis” hung in the air from the very beginning of the meeting.
In a room just steps from the ocean in Santa Barbara, astronomers and physicists shifted restlessly in their chairs. Sunshine and sea breezes beckoned, but the scientists had cloistered themselves to debate one of the biggest quandaries in physics: how fast the universe is expanding.
Estimates based on exploding stars, or supernovas, had suggested that the universe is growing approximately 10 percent faster than indicated by light emitted just after the Big Bang, about 13.8 billion years ago. Now, a measurement based on observations of luminous objects called quasars had pushed the discrepancy past five sigma, a statistical benchmark that physicists often treat as the dividing line between a fluke and a finding to take seriously.
At the front of the room on July 15, two Nobel-winning physics titans debated the appropriate level of alarm. Cosmologist Adam Riess of the Space Telescope Science Institute in Baltimore queried theoretical particle physicist David Gross: How would particle physicists refer to a discrepancy this large?
“If we found something like this … we wouldn’t call it a tension or a problem, but rather a crisis,” said Gross, of the Kavli Institute for Theoretical Physics at the University of California, Santa Barbara.
Scientists at the meeting immediately adopted the word “crisis” to describe the difference in expansion estimates. But not everyone agrees that the problem is real. In physicists’ version of a mic drop, a study appeared online that evening, challenging the narrative. A new version of the supernova technique found a value of the Hubble constant, the parameter that quantifies the universe’s expansion, that was consistent with the measurements from the early universe. So — a crisis, or not yet?
Much is at stake, including scientists’ basic understanding of what the universe contains and how it evolves over time. So far, a theory known as the standard cosmological model has succeeded in explaining a wide variety of cosmic observations. But the discrepancy in measurements of the universe’s expansion could mean the model itself needs to be drastically altered.
If the impasse can’t be explained away by experimental error, says theoretical physicist Vivian Poulin of CNRS and Laboratoire Univers et Particules de Montpellier in France, “it would mean that there is really something very important that we do not understand from the very early universe.” If unknown phenomena occurred in the infant cosmos — such as a different type of dark energy or new subatomic particles — that could alter our understanding of how the universe has evolved, and possibly bring the two measurements back into agreement.
An expanding issue
One technique for measuring the current expansion of the universe starts with a “baby picture” of the cosmos and extrapolates to the present day. That infant image is the cosmic microwave background, light emitted just 380,000 years after the Big Bang. To translate that snapshot into an expansion rate for the modern-day universe, scientists use the standard cosmological model to make a prediction.
Using that strategy, scientists with the Planck experiment have estimated that the universe is expanding at a rate of 67.4 kilometers per second for each megaparsec, or about 3 million light-years, of distance between objects (SN: 3/21/15, p. 7). The number leaves little wiggle room for disagreement: The experimental error is only 0.5 km/s/Mpc.
But supernova measurements have settled on a larger expansion rate of 74.0 km/s/Mpc, with an error of 1.4 km/s/Mpc. That leaves an inexplicable gap between the two estimates. Now “the community has started to take this [problem] extremely seriously,” says cosmologist Daniel Scolnic of Duke University, who works on the supernova project led by Riess, called SH0ES.
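For a rough sense of how such a gap translates into “sigmas,” here is a minimal back-of-the-envelope sketch in Python, assuming the two errors are independent and Gaussian (the published analyses are considerably more careful):

```python
from math import hypot

# Values quoted above, in km/s/Mpc.
planck_h0, planck_err = 67.4, 0.5   # early universe (Planck, CMB)
shoes_h0, shoes_err = 74.0, 1.4     # late universe (SH0ES, supernovas)

# Back-of-the-envelope significance: the gap divided by the combined
# uncertainty, treating the two errors as independent and Gaussian.
gap = shoes_h0 - planck_h0
combined_err = hypot(planck_err, shoes_err)
print(f"gap = {gap:.1f} km/s/Mpc, tension = {gap / combined_err:.1f} sigma")
# Prints: gap = 6.6 km/s/Mpc, tension = 4.4 sigma
```

By this crude yardstick, the two estimates alone sit about 4.4 sigma apart; folding in independent late-universe measurements, such as the quasar result, is what pushed the combined tension past five sigma.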
It’s unlikely that an experimental error in the Planck measurement could explain the discrepancy: That prospect is “not a possible route out of our current crisis,” said cosmologist Lloyd Knox of the University of California, Davis, at the meeting. Plus, another technique with its basis in the early universe, the measurement of sound waves known as baryon acoustic oscillations — when combined with other measurements — agrees with the Planck result.
So worries have centered on the possibility that the supernova measurements contain unaccounted-for systematic errors — biases that push the SH0ES estimate to a larger value. “What keeps me awake at night is, what are the systematics that we might not know about when we only do one method?” says cosmologist Wendy Freedman of the University of Chicago.
Freedman took it upon herself to check.
Distance woes
To measure how fast the universe is expanding right now, scientists need to combine two bits of information: how fast distant objects appear to be receding from us, and how far away they are. The first is relatively easy. Scientists look for a redshift, a stretching of the wavelengths of light emitted by an object.
Measuring distances is much trickier. Astronomers employ “standard candles,” celestial objects that emit a consistent, quantifiable brightness, such as explosions of a supernova variety called type Ia. As with a real candle, if an object’s intrinsic brightness is known, its distance can be inferred from how dim it appears, because brightness falls off with the square of the distance.
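To make that arithmetic concrete, here is a minimal Python sketch; the luminosity and flux figures below are illustrative placeholders, not values from any real survey:

```python
from math import sqrt, pi

C = 299_792.458       # speed of light, km/s
M_PER_MPC = 3.086e22  # meters per megaparsec

def candle_distance_mpc(luminosity, flux):
    """Inverse-square law: a candle of known luminosity L (watts) seen
    with flux F (watts per square meter) lies at d = sqrt(L / (4*pi*F))."""
    return sqrt(luminosity / (4 * pi * flux)) / M_PER_MPC

def hubble_constant(redshift, distance_mpc):
    """For nearby objects, recession velocity is roughly c * z, and
    Hubble's law v = H0 * d gives H0 = c * z / d."""
    return C * redshift / distance_mpc  # km/s/Mpc

# Illustrative placeholders only: a candle roughly as luminous as a
# type Ia supernova at peak (~1e36 watts), observed at redshift 0.01.
d = candle_distance_mpc(1e36, 4.5e-14)
print(f"d = {d:.0f} Mpc, H0 = {hubble_constant(0.01, d):.0f} km/s/Mpc")
# Prints: d = 43 Mpc, H0 = 70 km/s/Mpc
```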
Setting the distance scale requires a “distance ladder,” using nearby objects of uniform brightness as a bridge to supernovas farther away. As one rung of that distance ladder, the SH0ES team uses stars known as Cepheids, which pulsate at a rate that tracks their intrinsic brightness, allowing scientists to infer how luminous each star truly is.
To check the previous supernova results, Freedman threw out that distance ladder. Instead of Cepheids, she and colleagues used stars called red giants, which, at a certain stage in their lives, achieve a maximum brightness that is the same for each star. The result is “completely different from the ground up,” Freedman says.
In a well-timed, dramatic flourish, Freedman unveiled her team’s result in a talk on the second day of the meeting, as well as in a study accepted for publication in the Astrophysical Journal. The result, 69.8 km/s/Mpc, fell squarely between the conflicting SH0ES and Planck estimates. Freedman calmly but firmly pushes back on declarations of a crisis, saying that her team’s result should give researchers pause (SN Online: 7/16/19).
Holy cow, H0LiCOW
But even as Freedman’s result weakened the case for a crisis, momentum toward declaring one had already been building. Just a few days before the meeting, the H0LiCOW collaboration posted two studies on arXiv.org describing a measurement of the Hubble constant based on gravitational lensing of quasars, bright sources of light, each powered by a supermassive black hole at the center of a galaxy.
Just as a glass lens bends light, so can a massive object. The researchers looked at quasars whose light had been split into multiple images by such gravitational lenses, making one quasar look like two or more. The phenomenon is similar to the doubled image of a fish you might see as it swims near the corner of a fish tank.
Because each image’s light travels a different path to Earth, flickers in the quasar’s brightness arrive at slightly different times in each image. Studying those time delays yielded a Hubble constant of 73.3 km/s/Mpc, supporting the idea of a crisis. “It seems like this is more real after our result,” says cosmologist Geoff Chih-Fan Chen of the University of California, Davis.
Crucially, the researchers did their work “blind,” meaning that they hid the answer from themselves until the analysis was finalized. That practice guards against an unintended tendency to nudge a result toward previously measured values of the Hubble constant. Despite the blinding, the result was like an “echo” of the SH0ES result, Chen says.
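One common way to implement such blinding, shown below as a hypothetical sketch rather than H0LiCOW’s actual pipeline, is to add a secret random offset to the quantity being measured and subtract it only after every analysis choice is frozen:

```python
import random

class BlindedValue:
    """Hypothetical sketch of a blind analysis; not H0LiCOW's pipeline.
    A secret offset hides the true value until the analysis is frozen."""

    def __init__(self, seed=None):
        # The offset is generated once and never inspected by the analysts.
        self._offset = random.Random(seed).uniform(-10.0, 10.0)

    def blind(self, true_value):
        """What the team sees while tuning and debugging the analysis."""
        return true_value + self._offset

    def unblind(self, blinded_value):
        """Run exactly once, after all analysis choices are locked in."""
        return blinded_value - self._offset
```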
Meanwhile, astronomer Mark Reid of the Harvard-Smithsonian Center for Astrophysics reported a Hubble constant measurement based on megamasers — clouds of gas swirling around a black hole that emit microwave radiation at a specific wavelength, much as a laser emits light. That estimate, about 74 km/s/Mpc, was likewise in line with the higher set of values.
Also presented at the meeting were measurements based on variations in the brightness of a galaxy across the pixels of an image (76.5 km/s/Mpc), and another variation on the supernova technique, which used stars called Miras instead of Cepheids or red giants (73.6 km/s/Mpc).
At the same time, another cosmological puzzle is attracting growing attention, says Hendrik Hildebrandt of Ruhr-Universität Bochum in Germany. There are hints of disagreement in measurements of the clumpiness of matter in the universe, quantified by a parameter known as sigma-8. To gauge that clumpiness, scientists survey the sky for a weak variety of gravitational lensing, in which galaxies appear to be slightly aligned with one another; the lensing can be used to infer the distribution of mass in the universe. But, much like the Hubble constant, the clumpiness measured by an effort called the Kilo-Degree Survey disagrees with estimates based on the cosmic microwave background.
“The sigma-8 tension is the second question mark that we have,” Hildebrandt says. But the discrepancy is not as significant, he notes, and it hasn’t been studied as closely. “It’s not as mature; it hasn’t been in the literature for as long. But it also doesn’t seem to really go away.”
Estimates of how fast the universe is expanding don’t line up
Early vs. late: Estimates of the universe’s expansion rate based on the physics of the early universe tend to be lower than those based on the late universe, including type Ia supernovas in conjunction with stars known as Cepheids, red giants and Miras. Other late universe estimates are based on objects called megamasers and quasars, and on how brightness varies within galaxies. Source: Vivien Bonvin
Some scientists are wondering whether there is any connection between the sigma-8 discrepancy and the expansion rate dilemma. If researchers confirm a second problem with their understanding of the universe, it would strengthen the case that something is fundamentally amiss.
The universe speaks?
If the discrepancies can’t be chalked up to measurement error, a new theory will be needed that is consistent with all the data. But scientists have struggled to find a cohesive explanation. Almost any tweak to the universe’s history — adding in new types of subatomic particles, for example — would conflict with other measurements, throwing physics into turmoil.
“We have so many different ways of probing the universe that it’s very hard to come up with an elegant theory that passes all the tests without creating new tensions,” says Dillon Brout of the University of Pennsylvania.
One potential solution involves an addition to the mysterious dark energy that is causing the universe’s expansion to accelerate. An “early dark energy” could have acted in the universe’s youth, altering the expansion around the time the cosmic microwave background was released, Poulin and colleagues reported at the meeting and in the June 7 Physical Review Letters.
And disagreement over the Hubble constant has precedent: The estimate has a history of confusing results, says Lucas Macri of Texas A&M University in College Station. In those earlier cases, “the universe was trying to tell you that you didn’t have the whole picture.” In one instance, for example, some stars seemed to be older than the universe itself. The resolution eventually came with the discovery of dark energy.
After days of discussion, all the evidence had been tallied. The organizers asked for a show of hands: Should the Hubble constant woes be called a tension, a problem or a crisis? Cosmologists, it turned out, were a little hesitant to throw out what they thought they knew about the universe. Only a smattering of hands went up for “crisis.” Arguments that a solution might be found without revamping physics seemed to have held sway.
Still, if the Hubble constant puzzle persists, that could mean the universe is once again trying to speak up.