Maybe time’s arrow needs ergodicity as well as entropy


If you take 10 cards, numbered 1 through 10, and arrange a system for swapping any two cards at random over and over again, every single possible numerical sequence of the cards will eventually appear. That’s ergodic. If you started with the cards in order (low entropy), the cards would become more and more disordered (that is, entropy increases) as time goes on. A new proposal suggests that the direction of time might require both entropy and ergodicity.

sserg_dibrova/iStockPhoto

For a technical scientific term, entropy is pretty popular. I mean, it was the title of an episode of Buffy the Vampire Slayer, after all. Search the Internet for “entropy” quotes and you’ll find them by everybody from Anton Chekhov and Vaclav Havel to Frank Herbert and Philip K. Dick.

What’s more, a fair number of ordinary citizens even have a roughly correct notion of what entropy means. But understanding entropy more deeply is not so simple. For that, you need to master a related vocabulary word: ergodic. Good luck finding famous quotes about that. (I tried quotes.net and received “We couldn’t find any quotes or authors matching ergodic.”)

Ergodic does have a fairly simple definition, though. It means something like “passing through all the possibilities.” To illustrate: Take 10 cards, numbered 1 through 10, and lay them out in any order. Arrange a system for swapping any two cards at random, over and over again. Every single possible numerical sequence of the cards will eventually appear. That’s ergodic.
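Here’s a minimal sketch of that process in Python, shrunk to four cards so that every sequence turns up quickly (four cards have 4! = 24 possible orderings, versus 3,628,800 for ten; the deck size and swap rule are just illustrative choices):

```python
import itertools
import random

# The card-swapping process described above, shrunk to 4 cards so that
# all 4! = 24 possible sequences turn up quickly.
deck = [1, 2, 3, 4]
unseen = set(itertools.permutations(deck))
unseen.discard(tuple(deck))  # the starting arrangement counts as seen

swaps = 0
while unseen:
    i, j = random.sample(range(len(deck)), 2)  # pick two positions at random
    deck[i], deck[j] = deck[j], deck[i]        # swap those two cards
    unseen.discard(tuple(deck))
    swaps += 1

print(f"Every one of the 24 possible sequences appeared within {swaps} swaps.")
```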

Entropy’s common definition — messiness — also sounds simple. But while “messiness” may be a good enough definition for movie quotes, it’s not enough for the equations of thermodynamics. Entropy is related to messiness — or disorder — but it’s a particular kind of disorder, measured using the mathematics of probability. “Messy” in this case means more probable. Entropy is a measure of how probable an arrangement is.

In the case of the 10 cards, for instance, only one of the many possible arrangements puts them in ascending numerical order. It’s an unlikely arrangement, so its entropy is low. There are a whole bunch of ways to arrange the cards out of numerical order. So if you start with order and then run the card-swapping process, the cards are likely to get more and more out of order. In other words, entropy increases as time goes by, which is basically what physicists mean by “as time goes by.” Time’s direction — from past to future — seems to be required by the second law of thermodynamics, which states that entropy (in a closed system, with no input of energy) always increases until it reaches equilibrium, the state of maximum possible entropy.
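You can watch that drift in a quick simulation. In this minimal sketch, the count of cards sitting away from their numbered position serves as a rough stand-in for disorder (my simplification, not the formal thermodynamic quantity); the deck starts sorted and gets progressively jumbled:

```python
import random

def out_of_place(deck):
    """Count cards not sitting at their own numbered position."""
    return sum(1 for pos, card in enumerate(deck, start=1) if card != pos)

random.seed(1)                 # fixed seed so the run is repeatable
deck = list(range(1, 11))      # start in ascending order: zero disorder
for swap in range(31):
    if swap % 10 == 0:
        print(f"after {swap:2d} swaps: {out_of_place(deck):2d} cards out of place")
    i, j = random.sample(range(10), 2)  # random swap, as in the caption
    deck[i], deck[j] = deck[j], deck[i]
```

The count climbs from zero toward nine or so and then hovers there, the card deck’s version of equilibrium.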

But whether this explanation for the direction of time really works has long been debated. Some experts remain unsatisfied. Andreas Albrecht, for instance. He thinks explaining the arrow of time might require both entropy and ergodicity.

Time flows forward today, the standard story says, because entropy was lower in the past. But for years, critics have noted a flaw in this reasoning. Pick a point in time with low entropy. Entropy would likely be higher in the future, sure. But it’s also likely to have been higher in the past. Time’s arrow would therefore point both ways.

All this was thought through long ago by Ludwig Boltzmann, the 19th century Austrian physicist who figured out the probabilistic approach to thermodynamics. He decided that time really did flow in both directions. But in any one part of the universe, whoever lived there would define the flow of time to be toward the future, no matter which way it went. In most of the universe, there would be no time at all, because entropy had reached its maximum, meaning everything was in equilibrium. But deviations from complete equilibrium — fluctuations caused by chance — could drive some regions away from equilibrium to a lower entropy state, and time could get going again. So people just happen to occupy a curious corner of the universe with low enough entropy to allow time to march forward.

Nowadays Boltzmann’s view is mostly disregarded (in much the way many physicists of Boltzmann’s day disregarded his belief in the existence of atoms). Today most physicists blame the arrow of time on the origin of the universe. If the whole universe started out in a low-entropy state, time would flow in the same direction everywhere, and it still flows today because the universe as a whole has not yet reached equilibrium.

Maybe that’s right, but it leaves some important questions unanswered, such as why the universe began in a low-entropy state. In other words, why would the universe begin with circumstances analogous to laying those 10 cards out in ascending numerical order — a very unlikely arrangement, considering all the many other possibilities? Albrecht, a cosmologist at the University of California, Davis, thinks that maybe the old idea of a universe in equilibrium could solve that riddle.

Originally, advocates of cosmic inflation believed that it could explain how today’s universe could evolve from a non-special starting place. However things were arranged at the beginning, a sudden burst of exponential expansion would have smoothed everything out, while depositing a few wrinkles that seeded the clumps of matter destined to grow into galaxies. There would be nothing special about these conditions, as they would have arisen more or less randomly. But if you want an arrow of time that goes forward, you need special initial conditions — that is, low entropy. And low entropy is an improbable — that is, special or atypical — condition, Albrecht points out.

There is a “tension between the need for low entropy and the wish for typicality,” Albrecht writes in a recent paper. But there’s no such tension in an equilibrium universe.

“Ultimately, an equilibrium theory may prove to be our only hope to ‘explain’ the state of the universe using laws of physics,” Albrecht writes. “In a true equilibrium state, there is no notion of an ‘initial state.’ The system simply exists eternally, fluctuating into one state or another.”

Random fluctuations would cycle an equilibrium universe through all its possible states, ergodically. Eventually a low-probability, low-entropy situation would arise, and the universe we observe could evolve toward higher entropy with a forward-pointing arrow of time.

See, ergodicity is simple.

Except that there’s one small problem. Statistically, most fluctuations from equilibrium will be small. If people aren’t special, it is very unlikely that you would find yourself in a real universe populated by billions of out-of-equilibrium stars and galaxies. That would be like winning a slot machine jackpot on your first spin of the reels. It is much more likely that a small fluctuation would produce just your brain, in a configuration that tricks it into thinking the rest of the universe also exists. You would, in other words, be what physicists call a Boltzmann brain.
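Just how lopsided are those odds? A toy calculation (my own analogy, not Albrecht’s) makes the point: let 1,000 fair coins stand in for the universe’s degrees of freedom, with equilibrium meaning half heads. The probability of a fluctuation falls off steeply with its size:

```python
import math

# Toy model: N fair coins, equilibrium = half heads. Compute the exact
# probability that the head count strays at least `dev` from N/2.
N = 1000
total = 2 ** N  # number of equally likely coin configurations

for dev in (10, 50, 100):
    tail = sum(math.comb(N, k) for k in range(N // 2 + dev, N + 1))
    prob = 2 * tail / total  # a fluctuation can go in either direction
    print(f"deviation >= {dev:3d} coins: probability ~ {prob:.2e}")
```

A 10-coin deviation is routine, a 100-coin deviation is roughly a one-in-several-billion event, and a galaxy’s worth of out-of-equilibrium matter would be incomparably rarer still.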

Being called a Boltzmann brain should be a compliment, since Boltzmann was a very smart guy. But it is also a bit disconcerting. Fortunately, there are good reasons to believe you are not a Boltzmann brain. Reading blogs, for instance, would be difficult, because your brain would randomly fluctuate in such a way that any one sentence would have no connection to the next one. (It’s tempting to insert a sentence from “Jabberwocky” here, but that might scare you.)

If you’re not a Boltzmann brain, then the equilibrium universe is in trouble again. Unless, Albrecht says, there is some complication underlying reality that we haven’t yet discovered. It has to do with a subtle sleight of hand in the simple definitions of entropy and ergodicity.

Remember the 10 cards? Ascending numerical order can be achieved in only one way. But so can any other specific order. Entropy increases only when you don’t keep track of exactly which order the cards are in. For a real physics example, think of the molecules of a gas in a box. You can’t measure the positions and velocities of all the molecules. You can measure only things like temperature and pressure. A whole bunch of different molecular arrangements can produce the same temperature or pressure. It’s like slot machines: various combinations of icons on the payline pay the same amount of money (usually zero).
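The arithmetic behind that point is worth spelling out. Every specific ordering of the 10 cards is one arrangement out of 10!, so the sorted sequence is no rarer than any other particular sequence; the imbalance appears only between coarse labels like “sorted” and “unsorted”:

```python
import math

# Any one specific ordering of 10 cards is a single arrangement out of
# 10! = 3,628,800 -- "sorted" is exactly as rare as any other particular
# sequence. The imbalance exists only between the coarse labels.
arrangements = math.factorial(10)
print(f"total arrangements:            {arrangements:,}")
print(f"arrangements labeled sorted:   1")
print(f"arrangements labeled unsorted: {arrangements - 1:,}")
```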

Entropy is really a measure of how many arrangements (microstates) produce a given macrostate (a particular temperature or pressure, say). High entropy means many possible arrangements can produce the macrostate you observe, which makes that macrostate more probable.
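Here’s a minimal sketch of that counting, using a toy gas of 100 molecules (an illustrative setup; real thermodynamic macrostates involve vastly more variables). The macrostate is how many molecules sit in the left half of the box, the microstate count for that macrostate is a binomial coefficient, and the entropy, in units of Boltzmann’s constant, is its logarithm:

```python
import math

# Toy gas of N molecules. Macrostate: how many sit in the left half of
# the box. Microstate count W = C(N, k); entropy S = ln W (in units of
# Boltzmann's constant). The even split dwarfs the lopsided ones.
N = 100
for k in (0, 10, 50):
    W = math.comb(N, k)   # microstates with k molecules on the left
    S = math.log(W)       # ln(1) = 0 for the all-on-one-side macrostate
    print(f"{k:3d} of {N} on the left: W = {W:.3e}, S = ln W = {S:5.1f}")
```

The even split commands about 10^29 microstates while all-on-one-side has exactly one, which is why a gas left to itself is essentially always found near the even-split macrostate.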

In ordinary physics, the microstates involve molecules or atoms or subatomic particles. You can’t observe them individually. You have to measure bunches of them, a practice known as coarse graining. It’s like playing slots with the payline too small to see — all you know after each spin is how much you won or lost. Laws of physics (the laws of thermodynamics, for instance) tell you how the ergodic motion of the unobservable particles relates to your coarse-grained observations.

Based on the laws of physics — those that relate microstates to macrostates — an equilibrium universe seems unlikely unless Boltzmann brains can write blogs. But suppose, Albrecht says, there are “trans-microstates,” a hidden layer of physics beneath the particles of matter and force we know about. In this hidden realm, ergodic fluctuations could be related to observables by new laws. Just as slot machines possess hidden algorithms that ensure you’ll rarely win the jackpot, nature might be hiding an algorithm that produces a low-entropy “jackpot” universe more often than Boltzmann brains.

Such lucky outcomes could be possible if ergodicity occurs only in the trans-microstates, not in the ordinary microstates of current physics. Those microstates would simply be coarse-grained descriptions of the more fundamental trans-microstates. At this deeper level, Albrecht speculates, ergodicity could make the real universe probable and typical.

Albrecht admits that he hasn’t yet figured out a way to demonstrate exactly how ergodicity in the hidden realm could produce the jackpot fluctuations. But give him a break. It wasn’t until the late 19th century that Boltzmann, building on the insights of James Clerk Maxwell, showed how the microstates of molecules were related to the macrostates of temperature and pressure. It’s not like this stuff is easy.

Besides, as Boltzmann emphasized, theories shouldn’t constrain themselves to facts already known. His assessment of his own theory about time and equilibrium might someday apply to Albrecht’s ideas too.

“The above-developed theory does indeed go boldly beyond our experience,” Boltzmann said of his probabilistic view of time. “But it has the merit which every such theory should have — of showing us the facts of experience in an entirely new light and of inspiring us to new thought and reflection.”

Follow me on Twitter: @tom_siegfried

Tom Siegfried is a contributing correspondent. He was editor in chief of Science News from 2007 to 2012 and managing editor from 2014 to 2017.