Happy Birthday to Boole, with 11001000 binary candles
It’s almost time to celebrate George Boole’s 200th birthday. Or maybe we should call it Birthday No. 11001000.
You might have a hard time converting that binary number to decimal in your head. But it would be a snap for computers, which store numbers and manipulate them exclusively as 1s and 0s. Of course, everybody knows that. But not so many know that you have Boole to blame. Or thank.
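(In fact, the conversion is simple arithmetic: reading from the left, each 1 in 11001000 marks a power of 2, so the number is 128 + 64 + 8 = 200.) If you'd rather let a machine do it, a couple of lines of Python will check the sum. This snippet is my sketch, not anything from Boole:

```python
# Convert Boole's birthday number from binary (base 2) to decimal.
candles = int("11001000", 2)   # let Python parse the string as base 2
print(candles)                 # prints 200

# Or do it the long way, one digit at a time, as you would on paper:
total = 0
for digit in "11001000":
    total = total * 2 + int(digit)   # shift everything left, then add the new bit
print(total)                         # also prints 200
```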
Boole was the mathematician with a vision for transforming logical thinking into mathematical formulas. His book An Investigation of the Laws of Thought, published in 1854, established him as one of the most creative mathematical thinkers of his century. The mathematician-philosopher Bertrand Russell, in fact, considered Boole’s book to be “the work in which pure mathematics was discovered.”
Boole started young. Born on November 2, 1815, in Lincoln, England, he grew up without much formal education but with an abundance of intelligence. During his youth he spent a lot of time in libraries, mastering everything he could about math. Before he turned 20 he was widely known in Lincolnshire as an expert on Newton. Soon Boole began publishing papers in math journals; at age 34, he landed the job of math professor at Queen’s College, Cork, in Ireland, even though he had never earned a degree in math, or in anything else.
Boole’s most creative idea was to express true and false propositions by numbers, and then crunch the numbers algebraically to distinguish correct from incorrect deductions. In his famous book he worked out the details of his ambition to merge logic with mathematics.
Boole’s book is not light reading. He explains his reasoning in rather tedious detail. But that reasoning was powerful enough to perceive the role that binary numbers — numbers expressed using only the two digits 0 and 1 — could play in representing human thought algebraically.
He began with the notion that various concepts could be represented mathematically by algebraic symbols. For instance, x could represent “men,” and y could represent “all white things.” To refer to all nonwhite men, then, you could write x times (1 minus y), where 1 stands for everything: the men, minus those among them that are white.
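In modern terms, Boole’s classes behave a lot like sets, and you can act out his example in a few lines of Python. (The sets and names here are invented for illustration; nothing below comes from Boole.)

```python
# A toy version of Boole's class algebra, using Python sets.
men = {"Alan", "Claude", "George", "Kurt"}           # x: the class of men
white_things = {"George", "Kurt", "snow", "chalk"}   # y: the class of white things

# x(1 - y): start with the men, then remove everything that is white.
nonwhite_men = men - white_things
print(nonwhite_men)   # {'Alan', 'Claude'}, in some order
```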
Various expressions representing other concepts could be mathematically combined to reach irrefutable logical conclusions, Boole believed.
He expended considerable intellectual effort to formulate the rules for making such computations. One rule, for instance, designated the meaning of multiplication: multiplying two symbols yielded all the members of the group that met the definition for both of the symbols.
So assigning x to all men and y to white things implies that x times y equals all white men. That seems reasonable enough. But Boole was astute enough to realize that his multiplication rule could pose problems. If you multiplied x times x, for instance (all men times all men), the answer would be “all men.” In other words, x times x, or x squared, equaled x. At first glance, that appears to be a logical inconsistency, kind of unsatisfactory when your whole plan is to build a foolproof mathematical system for doing logic. But Boole thought more deeply and saw that sometimes x squared really does equal x. In ordinary arithmetic, in fact, x squared equals x for exactly two numbers: 1 and 0. Voilà! Computing logical relationships algebraically — the sort of thing computers do today — works just fine if you use the binary digits 1 and 0 in the calculations.
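You can check Boole’s little epiphany by brute force; here’s a quick Python sketch of the arithmetic (mine, not Boole’s):

```python
# Which whole numbers satisfy x squared = x? Test a range and see.
for x in range(-10, 11):
    if x * x == x:
        print(x, "satisfies x * x == x")   # prints only 0 and 1
```

Algebra says the same thing: x squared equals x means x times (x minus 1) equals 0, which forces x to be either 0 or 1.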
Boole developed from this insight the system of “Boolean logic” that (with the embellishments of many mathematicians who followed him) underlies the logic gates used in modern computer chips, relying on the “truth values” of 1 for true and 0 for false.
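To see how truth values become gates, here’s one common way to write the basic gates as ordinary arithmetic on 0 and 1. (This is a modern sketch of the idea, not Boole’s own notation, and real chips build the gates from transistors rather than Python.)

```python
# The basic logic gates, written as arithmetic on the truth values 0 and 1.
def NOT(x):
    return 1 - x          # flips true (1) to false (0) and back

def AND(x, y):
    return x * y          # Boole's multiplication: 1 only if both inputs are 1

def OR(x, y):
    return x + y - x * y  # 1 if either input is 1; the subtraction avoids a "2"

# Print the full truth table for every combination of inputs.
for x in (0, 1):
    for y in (0, 1):
        print(f"x={x} y={y}  AND={AND(x, y)}  OR={OR(x, y)}  NOT x={NOT(x)}")
```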
Besides his work on logic, Boole wrote extensively on the (not unrelated) mathematics of probability. He hoped eventually to combine these insights into a more fully developed philosophy, grounded in a deep understanding of how math and logic capture human thought.
Boole’s most ambitious goals were never completely realized, of course. Computers now are sometimes portrayed as “thinking machines” that embody Boole’s logical principles. But even today’s best computers can’t outthink the best humans (although computers are faster, and certainly beat the hell out of the worst humans).
Besides that, there is the problem that logic, too rigidly applied, can lead to tragedy. Boole, for instance, died young, at age 49, after walking two miles through a rainstorm and then lecturing in his wet clothes. Pneumonia followed. The bedridden Boole might well have survived, but his wife, applying the logical principle that a remedy resembles its cause, doused his bed with bucketfuls of water. (At least that’s the legend.) Consequently, Boole did not live to see 110010.