Faster than a speeding gluon, more powerful than a nuclear blast, able to crunch data in colossal bursts? It’s the ultimate laptop, envisioned by Seth Lloyd of the Massachusetts Institute of Technology, and it stretches the laws of physics to their limits.
“Computers are physical systems,” Lloyd contends. “The laws of physics dictate what they can and cannot do.” Lloyd invokes a combination of relativity theory, quantum mechanics, and the laws of thermodynamics to elucidate these outer limits. He describes the result—his hypothetical ultracomputer—in the Aug. 31 Nature.
For the last 4 decades, steady improvements in manufacturing technology have allowed circuitry to be packed ever more tightly onto silicon chips, doubling computer power every 18 months or so. Lloyd decided to find out what the fundamental constants of nature—the speed of light, Planck’s constant, and the gravitational constant—have to say about how far miniaturization can proceed.
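To get a feel for that pace, here is a minimal back-of-the-envelope sketch in Python; the 18-month doubling period and the 4-decade span are the figures quoted above, and everything else is illustrative:

```python
# Compound growth under an assumed Moore's-law pace:
# computing power doubles every 18 months.
DOUBLING_PERIOD_YEARS = 1.5

def growth_factor(years: float) -> float:
    """Factor by which computing power grows over the given span."""
    return 2 ** (years / DOUBLING_PERIOD_YEARS)

# Four decades of 18-month doublings: roughly a hundred-million-fold gain.
print(f"Growth over 40 years: {growth_factor(40):.1e}x")  # ~1.1e+08x
```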
Lloyd assumed that his ultimate digital computer would be roughly the size of a conventional laptop, weighing about 1 kilogram and occupying 1 liter of space. Its speed depends on how much energy is available to run it. The fastest design, though spectacularly impractical, is one that converts the computer’s mass entirely to energy as it makes its first and only run. Quantum mechanics, through the uncertainty principle, dictates how quickly that energy can flip chip components between two states, each representing a bit of information. Lloyd estimates that his ultimate laptop would perform at a blazing 10^51 operations per second. In comparison, today’s state-of-the-art computer chips lumber along at a sedate 10^13 operations per second.
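That 10^51 figure can be reproduced from the physics Lloyd cites. His speed limit follows from the Margolus-Levitin theorem of quantum mechanics: a system with average energy E needs at least πħ/(2E) seconds to switch between two distinguishable states, so it can perform at most 2E/(πħ) operations per second. Here is a minimal sketch in Python, assuming the entire rest energy E = mc^2 of the 1-kilogram machine is available, which is the article’s deliberately impractical scenario:

```python
import math

# Physical constants (SI units).
C = 2.998e8        # speed of light, m/s
HBAR = 1.055e-34   # reduced Planck constant, J*s

def max_ops_per_second(energy_joules: float) -> float:
    """Margolus-Levitin bound: a system with average energy E performs
    at most 2E / (pi * hbar) elementary operations per second."""
    return 2 * energy_joules / (math.pi * HBAR)

mass_kg = 1.0                   # the 1-kilogram "ultimate laptop"
rest_energy = mass_kg * C ** 2  # E = mc^2: all mass converted to energy

print(f"Rest energy: {rest_energy:.1e} J")                          # ~9.0e+16 J
print(f"Speed limit: {max_ops_per_second(rest_energy):.1e} ops/s")  # ~5.4e+50
```

The bound works out to about 5 × 10^50 operations per second, in line with the 10^51 quoted above.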
The upper limit on a computer’s speed would apply whether the technology involves vacuum tubes, transistors, electrons, quarks and gluons, or something even more exotic, Lloyd remarks.
Another major consideration is memory. “The amount of information that a physical system can store and process is related to the number of distinct physical states that are accessible to the system,” Lloyd says. A physical quantity known as entropy, which measures a system’s degree of disorder, quantifies this relationship. A high entropy means a large number of different states are available for storing information. Obtaining the maximum entropy requires converting mass to energy in order to create an information-packed memory, which would have the characteristics of a desktop thermonuclear explosion. Lloyd estimates that the resulting memory capacity would amount to 10^31 bits. Current laptops store 10^10 bits.
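One way to arrive at that capacity, and one that reproduces Lloyd’s number, is to treat the fully energized memory as blackbody radiation: a photon gas with energy E in volume V has its temperature fixed by E/V = aT^4 and entropy S = (4/3)E/T, and each k ln 2 of entropy stores one bit. A sketch of that calculation, under those modeling assumptions:

```python
import math

# Physical constants (SI units).
C = 2.998e8       # speed of light, m/s
HBAR = 1.055e-34  # reduced Planck constant, J*s
K_B = 1.381e-23   # Boltzmann constant, J/K

# Radiation constant in u = a * T^4 for a photon gas.
A_RAD = math.pi ** 2 * K_B ** 4 / (15 * HBAR ** 3 * C ** 3)

def photon_gas_bits(energy_j: float, volume_m3: float) -> float:
    """Bits storable in a photon gas of energy E confined to volume V:
    temperature from E/V = a*T^4, entropy S = (4/3)*E/T, bits = S/(k ln 2)."""
    temperature = (energy_j / (volume_m3 * A_RAD)) ** 0.25
    entropy = (4.0 / 3.0) * energy_j / temperature
    return entropy / (K_B * math.log(2))

energy = 1.0 * C ** 2  # 1 kg of mass converted entirely to radiation, in joules
volume = 1e-3          # 1 liter, in cubic meters
print(f"Memory capacity: {photon_gas_bits(energy, volume):.1e} bits")  # ~2.1e+31
```

For 1 kilogram of mass-energy packed into 1 liter, this comes out near 2 × 10^31 bits, matching the figure above.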
Because the computer’s mass can be converted to energy only once, there’d be a trade-off between speed and memory: the same energy budget must be divided between flipping bits rapidly and sustaining the many distinct states that store them. Shrinking a 1-kg computer offers additional benefits. It would take less time for signals to travel from one side of the computer to the other, for example. Lloyd theorizes that compressing the laptop to a tiny fraction of the size of a proton would force its collapse into a miniature black hole. Current theories suggest that this ultradense object could store and process information at incredibly high rates—and, coincidentally, the time required to communicate across the computer would equal the time needed to flip a bit from one state to another.
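The black-hole endpoint can be put in rough numbers too. A 1-kilogram black hole has Schwarzschild radius R = 2Gm/c^2; the Bekenstein-Hawking entropy, S/k = A/(4 l_p^2) with horizon area A = 4πR^2, bounds its memory; and a light signal crosses it in R/c. The sketch below also estimates a per-bit flip time by spreading the 10^51 operations per second across those bits, an assumption of serial operation rather than anything spelled out in the article:

```python
import math

# Physical constants (SI units).
C = 2.998e8       # speed of light, m/s
G = 6.674e-11     # gravitational constant, m^3 kg^-1 s^-2
HBAR = 1.055e-34  # reduced Planck constant, J*s

mass = 1.0                      # 1-kg computer collapsed into a black hole
radius = 2 * G * mass / C ** 2  # Schwarzschild radius, ~1.5e-27 m

# Bekenstein-Hawking memory bound: S/k = A / (4 * l_p^2) with A = 4*pi*R^2,
# where l_p^2 = hbar*G/c^3 is the Planck area; divide by ln 2 for bits.
planck_area = HBAR * G / C ** 3
bits = math.pi * radius ** 2 / planck_area / math.log(2)

# Margolus-Levitin speed limit for the same mass-energy.
ops_per_second = 2 * (mass * C ** 2) / (math.pi * HBAR)

t_comm = radius / C             # light-crossing time across the hole
t_flip = bits / ops_per_second  # time to update every bit once, serially

print(f"Radius: {radius:.1e} m, memory: {bits:.1e} bits")  # ~1.5e-27 m, ~3.8e+16 bits
print(f"t_comm: {t_comm:.0e} s, t_flip: {t_flip:.0e} s")   # ~5e-36 s, ~7e-35 s
```

Both times land within about an order of magnitude of 10^-35 seconds, the rough equality described above.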
Lloyd concedes, “There is no guarantee that these limits will ever be attained, no matter how ingenious computer designers become.” Even quantum computers, which already operate at computational limits set by physics (SN: 8/26/00, p. 132: Computation Takes a Quantum Leap), are much slower and process much less information than Lloyd’s ultimate laptop because their energy is locked up largely in mass.
Thermonuclear explosions show that it’s possible to unlock this energy, Lloyd says. “But controlling such an ‘unlocked’ system is another question,” he notes. That’s something for engineers and computer designers to ponder, now that they have a better idea of what opportunities might lie ahead.