Chapter One
Nothing Doing
[ THE ORIGIN OF ZERO ]
There was neither non-existence nor existence then; there
was neither the realm of space nor the sky which is beyond.
What stirred? Where?
—The Rig Veda
The story of zero is an ancient one. Its roots stretch back to the dawn of mathematics, in the time thousands of years before the first civilization, long before humans could read and write. But as natural as zero seems to us today, for ancient peoples zero was a foreign—and frightening—idea. An Eastern concept, born in the Fertile Crescent a few centuries before the birth of Christ, zero not only evoked images of a primal void, it also had dangerous mathematical properties. Within zero there is the power to shatter the framework of logic.
The beginnings of mathematical thought were found in the desire to count sheep and in the need to keep track of property and of the passage of time. None of these tasks requires zero; civilizations functioned perfectly well for millennia before its discovery. Indeed, zero was so abhorrent to some cultures that they chose to live without it.
Life without Zero
The point about zero is that we do not need to use it in the
operations of daily life. No one goes out to buy zero fish.
It is in a way the most civilized of all the cardinals, and
its use is only forced on us by the needs of cultivated
modes of thought.
—Alfred North Whitehead
It's difficult for a modern person to imagine a life without zero, just as it's hard to imagine life without the number seven or the number 31. However, there was a time when there was no zero—just as there was no seven or 31. It was before the beginning of history, so archaeologists have had to piece together the tale of the birth of mathematics from bits of stone and bone. From these fragments, researchers discovered that Stone Age mathematicians were a bit more rugged than modern ones. Instead of blackboards, they used wolves.
A key clue to the nature of Stone Age mathematics was unearthed in the late 1930s when archaeologist Karl Absolom, sifting through Czechoslovakian dirt, uncovered a 30,000-year-old wolf bone with a series of notches carved into it. Nobody knows whether Gog the caveman had used the bone to count the deer he killed, the paintings he drew, or the days he had gone without a bath, but it is pretty clear that early humans were counting something.
A wolf bone was the Stone Age equivalent of a supercomputer. Gog's ancestors couldn't even count up to two, and they certainly did not need zero. In the very beginning of mathematics, it seems that people could only distinguish between one and many. A caveman owned one spearhead or many spearheads; he had eaten one crushed lizard or many crushed lizards. There was no way to express any quantities other than one and many. Over time, primitive languages evolved to distinguish between one, two, and many, and eventually one, two, three, many, but didn't have terms for higher numbers. Some languages still have this shortcoming. The Siriona Indians of Bolivia and the Brazilian Yanoama people don't have words for anything larger than three; instead, these two tribes use the words for "many" or "much."
Thanks to the very nature of numbers—they can be added together to create new ones—the number system didn't stop at three. After a while, clever tribesmen began to string number-words in a row to yield more numbers. The languages currently used by the Bacairi and the Bororo peoples of Brazil show this process in action; they have number systems that go "one," "two," "two and one," "two and two," "two and two and one," and so forth. These people count by twos. Mathematicians call this a binary system.
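The "two and two and one" scheme is mechanical enough to sketch in code. The function below is a hypothetical illustration (the name and phrasing are mine, not the tribes'): it renders any small count in the count-by-twos style described above.

```python
def count_by_twos(n):
    """Render n in the 'two and two and one' style described above:
    1 -> 'one', 2 -> 'two', 3 -> 'two and one', 4 -> 'two and two', ..."""
    if n < 1:
        raise ValueError("the system starts at one -- it has no zero")
    words = ["two"] * (n // 2)   # as many 'two's as fit
    if n % 2:                    # a trailing 'one' if n is odd
        words.append("one")
    return " and ".join(words)

print(count_by_twos(5))  # two and two and one
```

Note that the function has no way to say "zero"—exactly the gap the chapter describes.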
Few people count by twos like the Bacairi and Bororo. The old wolf bone seems to be more typical of ancient counting systems. Gog's wolf bone had 55 little notches in it, arranged into groups of five; there was a second notch after the first 25 marks. It looks suspiciously as if Gog was counting by fives, and then tallied groups in bunches of five. This makes a lot of sense. It is a lot faster to tally the number of marks in groups than it is to count them one by one. Modern mathematicians would say that Gog, the wolf carver, used a five-based or quinary counting system.
But why five? Deep down, it's an arbitrary decision. If Gog put his tallies in groups of four, and counted in groups of four and 16, his number system would have worked just as well, as would groups of six and 36. The groupings don't affect the number of marks on the bone; they only affect the way that Gog tallies them up in the end—and he will always get the same answer no matter how he counts them. However, Gog preferred to count in groups of five rather than four, and people all over the world shared Gog's preference. It was an accident of nature that gave humans five fingers on each hand, and because of this accident, five seemed to be a favorite base system across many cultures. The early Greeks, for instance, used the word "fiving" to describe the process of tallying.
Even in the South American binary counting schemes, linguists see the beginnings of a quinary system. A different phrase in Bororo for "two and two and one" is "this is my hand all together." Apparently, ancient peoples liked to count with their body parts, and five (a hand), ten (both hands), and twenty (both hands and both feet) were the favorites. In English, eleven and twelve seem to be derived from "one over [ten]" and "two over [ten]," while thirteen, fourteen, fifteen, and so on are contractions of "three and ten," "four and ten," and "five and ten." From this, linguists conclude that ten was the basic unit in the Germanic protolanguages that English came from, and thus those people used a base-10 number system. On the other hand, in French, eighty is quatre-vingts (four twenties), and ninety is quatre-vingt-dix (four twenties and ten). This may mean that the people who lived in what is now France used a base-20 or vigesimal number system. Numbers like seven and 31 belonged to all of these systems, quinary, decimal, and vigesimal alike. However, none of these systems had a name for zero. The concept simply did not exist.
You never need to keep track of zero sheep or tally your zero children. Instead of "We have zero bananas," the grocer says, "We have no bananas." You don't have to have a number to express the lack of something, and it didn't occur to anybody to assign a symbol to the absence of objects. This is why people got along without zero for so long. It simply wasn't needed. Zero just never came up.
In fact, knowing about numbers at all was quite an ability in prehistoric times. Simply being able to count was considered a talent as mystical and arcane as casting spells and calling the gods by name. In the Egyptian Book of the Dead, when a dead soul is challenged by Aqen, the ferryman who conveys departed spirits across a river in the netherworld, Aqen refuses to allow anyone aboard "who does not know the number of his fingers." The soul must then recite a counting rhyme to tally his fingers, satisfying the ferryman. (The Greek ferryman, on the other hand, wanted money, which was stowed under the dead person's tongue.)
Though counting abilities were rare in the ancient world, numbers and the fundamentals of counting always developed before writing and reading. When early civilizations started pressing reeds to clay tablets, carving figures in stone, and daubing ink on parchment and on papyrus, number systems had already been well-established. Transcribing the oral number system into written form was a simple task: people just needed to figure out a coding method whereby scribes could set the numbers down in a more permanent form. (Some societies even found a way to do this before they discovered writing. The illiterate Incas, for one, used the quipu, a string of colored, knotted cords, to record calculations.)
The first scribes wrote down numbers in a way that matched their base system, and predictably, did it in the most concise way they could think of. Society had progressed since the time of Gog. Instead of making little groups of marks over and over, the scribes created symbols for each type of grouping; in a quinary system, a scribe might make a certain mark for one, a different symbol for a group of five, yet another mark for a group of 25, and so forth.
The Egyptians did just that. More than 5,000 years ago, before the time of the pyramids, the ancient Egyptians designed a system for transcribing their decimal system, where pictures stood for numbers. A single vertical mark represented a unit, while a heel bone represented 10, a swirly snare stood for 100, and so on. To write down a number with this scheme, all an Egyptian scribe had to do was record groups of these symbols. Instead of having to write down 123 tick marks to denote the number "one hundred and twenty-three," the scribe wrote six symbols: one snare, two heels, and three vertical marks. It was the typical way of doing mathematics in antiquity. And like most other civilizations Egypt did not have—or need—a zero.
Yet the ancient Egyptians were quite sophisticated mathematicians. They were master astronomers and timekeepers, which meant that they had to use advanced math, thanks to the wandering nature of the calendar.
Creating a stable calendar was a problem for most ancient peoples, because they generally started out with a lunar calendar: the length of a month was the time between successive full moons. It was a natural choice; the waxing and waning of the moon in the heavens was hard to overlook, and it offered a convenient way of marking periodic cycles of time. But the lunar month is between 29 and 30 days long. No matter how you arrange it, 12 lunar months only add up to about 354 days—roughly 11 short of the solar year's length. Thirteen lunar months yield roughly 19 days too many. Since it is the solar year, not the lunar year, that determines the time for harvest and planting, the seasons seem to drift when you reckon by an uncorrected lunar year.
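The drift is simple arithmetic to check. Taking an average lunar month of about 29.53 days and a solar year of about 365.24 days (round modern values, used here only to reproduce the figures above):

```python
LUNAR_MONTH = 29.53  # average days between successive full moons
SOLAR_YEAR = 365.24  # average days in a solar year

twelve_months = 12 * LUNAR_MONTH    # about 354.4 days
thirteen_months = 13 * LUNAR_MONTH  # about 383.9 days

print(round(SOLAR_YEAR - twelve_months))    # 12 lunar months fall ~11 days short
print(round(thirteen_months - SOLAR_YEAR))  # 13 lunar months run ~19 days long
```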
Correcting the lunar calendar is a complicated undertaking. A number of modern-day nations, like Israel and Saudi Arabia, still use a modified lunar calendar, but 6,000 years ago the Egyptians came up with a better system. Their method was a much simpler way of keeping track of the passage of the days, producing a calendar that stayed in sync with the seasons for many years. Instead of using the moon to keep track of the passage of time, the Egyptians used the sun, just as most nations do today.
The Egyptian calendar had 12 months, like the lunar one, but each month was 30 days long. (Being base-10 sort of people, their week, the decade, was 10 days long.) At the end of the year, there were an extra five days, bringing the total up to 365. This calendar was the ancestor of our own calendar; the Egyptian system was adopted by Greece and then by Rome, where it was modified by adding leap years, and then became the standard calendar of the Western world. However, since the Egyptians, the Greeks, and the Romans did not have zero, the Western calendar does not have any zeros—an oversight that would cause problems millennia later.
The Egyptians' innovation of the solar calendar was a breakthrough, but they made an even more important mark on history: the invention of the art of geometry. Even without a zero, the Egyptians had quickly become masters of mathematics. They had to, thanks to an angry river. Every year the Nile would overflow its banks and flood the delta. The good news was that the flooding deposited rich, alluvial silt all over the fields, making the Nile delta the richest farmland in the ancient world. The bad news was that the river destroyed many of the boundary markers, erasing all of the landmarks that told farmers which land was theirs to cultivate. (The Egyptians took property rights very seriously. In the Egyptian Book of the Dead, a newly deceased person must swear to the gods that he hasn't cheated his neighbor by stealing his land. It was a sin punishable by having his heart fed to a horrible beast called the devourer. In Egypt, filching your neighbor's land was considered as grave an offense as breaking an oath, murdering somebody, or masturbating in a temple.)
The ancient pharaohs assigned surveyors to assess the damage and reset the boundary markers, and thus geometry was born. These surveyors, or rope stretchers (named for their measuring devices and knotted ropes designed to mark right angles), eventually learned to determine the areas of plots of land by dividing them into rectangles and triangles. The Egyptians also learned how to measure the volumes of objects—like pyramids. Egyptian mathematics was famed throughout the Mediterranean, and it is likely that the early Greek mathematicians, masters of geometry like Thales and Pythagoras, studied in Egypt. Yet despite the Egyptians' brilliant geometric work, zero was nowhere to be found within Egypt.
This was, in part, because the Egyptians were of a practical bent. They never progressed beyond measuring volumes and counting days and hours. Mathematics wasn't used for anything impractical, except their system of astrology. As a result, their best mathematicians were unable to use the principles of geometry for anything unrelated to real world problems—they did not take their system of mathematics and turn it into an abstract system of logic. They were also not inclined to put math into their philosophy. The Greeks were different; they embraced the abstract and the philosophical, and brought mathematics to its highest point in ancient times. Yet it was not the Greeks who discovered zero. Zero came from the East, not the West.
The Birth of Zero
In the history of culture the discovery of zero will always
stand out as one of the greatest single achievements of the
human race.
—Tobias Dantzig, Number: The Language of Science
The Greeks understood mathematics better than the Egyptians did; once they mastered the Egyptian art of geometry, Greek mathematicians quickly surpassed their teachers.
At first the Greek system of numbers was quite similar to the Egyptians'. Greeks also had a base-10 style of counting, and there was very little difference in the ways the two cultures wrote down their numbers. Instead of using pictures to represent numbers as the Egyptians did, the Greeks used letters. H (eta) stood for hekaton: 100. M (mu) stood for myrioi: 10,000—the myriad, the biggest grouping in the Greek system. They also had a symbol for five, indicating a mixed quinary-decimal system, but overall the Greek and Egyptian systems of writing numbers were almost identical—for a time. Unlike the Egyptians, the Greeks outgrew this primitive way of writing numbers and developed a more sophisticated system.
Instead of using two strokes to represent 2, or three Hs to represent 300 as the Egyptian style of counting did, a newer Greek system of writing, appearing before 500 BC, had distinct letters for 2, 3, 300, and many other numbers (Figure 1). In this way the Greeks avoided repeated letters. For instance, writing the number 87 in the Egyptian system would require 15 symbols: eight heels and seven vertical marks. The new Greek system would need only two symbols: π for 80, and ζ for 7. (The Roman system, which supplanted Greek numbers, was a step backward toward the less sophisticated Egyptian system. The Roman 87, LXXXVII, requires seven symbols, with several repeats.)
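The symbol counts are easy to reproduce. Below is a minimal Roman-numeral converter—the standard greedy algorithm, offered as an illustration rather than anything from the ancient sources—confirming that 87 takes seven Roman symbols where the Greek alphabetic system needed two:

```python
# Greedy conversion to Roman numerals: repeatedly subtract the
# largest value that still fits, appending its glyph.
ROMAN = [(1000, "M"), (900, "CM"), (500, "D"), (400, "CD"),
         (100, "C"), (90, "XC"), (50, "L"), (40, "XL"),
         (10, "X"), (9, "IX"), (5, "V"), (4, "IV"), (1, "I")]

def to_roman(n):
    out = []
    for value, glyph in ROMAN:
        while n >= value:
            out.append(glyph)
            n -= value
    return "".join(out)

print(to_roman(87), len(to_roman(87)))  # LXXXVII 7
```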
Though the Greek number system was more sophisticated than the Egyptian system, it was not the most advanced way of writing numbers in the ancient world. That title was held by another Eastern invention: the Babylonian style of counting. And thanks to this system, zero finally appeared in the East, in the Fertile Crescent of present-day Iraq.
At first glance the Babylonian system seems perverse. For one thing the system is sexagesimal—based on the number 60. This is an odd-looking choice, especially since most human societies chose 5, 10, or 20 as their base number. Also, the Babylonians used only two marks to represent their numbers: a wedge that represented 1 and a double wedge that represented 10. Groups of these marks, arranged in clumps that summed to 59 or less, were the basic symbols of the counting system, just as the Greek system was based on letters and the Egyptian system was based on pictures. But the really odd feature of the Babylonian system was that, instead of having a different symbol for each number like the Egyptian and Greek systems, each Babylonian symbol could represent a multitude of different numbers. A single wedge, for instance, could stand for 1; 60; 3,600; or countless others.
As strange as this system seems to modern eyes, it made perfect sense to ancient peoples. It was the Bronze Age equivalent of computer code. The Babylonians, like many different cultures, had invented machines that helped them count. The most famous was the abacus. Known as the soroban in Japan, the suan-pan in China, the s'choty in Russia, the coulba in Turkey, the choreb in Armenia, and by a variety of other names in different cultures, the abacus relies upon sliding stones to keep track of amounts. (The words calculate, calculus, and calcium all come from the Latin word for pebble: calculus.)
Adding numbers on an abacus is as simple as moving the stones up and down. Stones in different columns have different values, and by manipulating them a skilled user can add large numbers with great speed. When a calculation is complete, all the user has to do is look at the final position of the stones and translate that into a number—a pretty straightforward operation.
The Babylonian system of numbering was like an abacus inscribed symbolically onto a clay tablet. Each grouping of symbols represented a certain number of stones that had been moved on the abacus, and like each column of the abacus, each grouping had a different value, depending on its position. In this way the Babylonian system was not so different from the system we use today. Each 1 in the number 111 stands for a different value; from right to left, they stand for "one," "ten," and "one hundred," respectively. Similarly, a single wedge stood for "one," "sixty," or "thirty-six hundred" in the three different positions. It was just like an abacus, except for one problem. How would a Babylonian write the number 60? The number 1 was easy to write: a single wedge. Unfortunately, 60 was also written as a single wedge; the only difference was that the wedge sat in the second position rather than the first. With the abacus it's easy to tell which number is represented. A single stone in the first column is easy to distinguish from a single stone in the second column. The same isn't true for writing. The Babylonians had no way to denote which column a written symbol was in; a lone wedge could represent 1, 60, or 3,600. It got worse when they mixed numbers. Two wedges side by side could mean 61; 3,601; 3,660; or even greater values.
Zero was the solution to the problem. By around 300 BC the Babylonians had started using two slanted wedges to represent an empty space, an empty column on the abacus. This placeholder mark made it easy to tell which position a symbol was in. Before the advent of zero, two wedges side by side could be interpreted as 61 or 3,601. But with zero, two adjacent wedges meant 61; 3,601 was written as wedge, zero sign, wedge (Figure 2). Zero was born out of the need to give any given sequence of Babylonian digits a unique, permanent meaning.
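A short sketch (mine, not anything Babylonian) makes the ambiguity concrete. Decomposing a number into base-60 digits, 61 and 3,601 differ only in that 3,601 has a zero in its middle column—the very column the placeholder was invented to mark:

```python
def base60_digits(n):
    """Decompose n into its sexagesimal digits, most significant first."""
    digits = []
    while n:
        digits.insert(0, n % 60)  # peel off the lowest base-60 column
        n //= 60
    return digits or [0]

print(base60_digits(61))    # [1, 1]    -> wedge, wedge
print(base60_digits(3601))  # [1, 0, 1] -> wedge, placeholder, wedge
```

Strip the zeros out of those digit lists and the two numbers become indistinguishable—which is exactly the scribes' predicament before the placeholder.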
Though zero was useful, it was only a placeholder. It was merely a symbol for a blank place in the abacus, a column where all the stones were at the bottom. It did little more than make sure digits fell in the right places; it didn't really have a numerical value of its own. After all, 000,002,148 means exactly the same thing as 2,148. A zero in a string of digits takes its meaning from some other digit to its left. On its own, it meant ... nothing. Zero was a digit, not a number. It had no value.
A number's value comes from its place on the number line—from its position compared with other numbers. For instance, the number two comes before the number three and after the number one; nowhere else makes any sense. However, the 0 mark didn't have a spot on the number line at first. It was just a symbol; it didn't have a place in the hierarchy of numbers. Even today, we sometimes treat zero as a nonnumber, using the digit 0 as a placeholder without connecting it to the number zero, even though we all know that zero has a numerical value of its own. Look at a telephone keypad or the top row of a computer keyboard. The 0 comes after the 9, not before the 1 where it belongs. It doesn't matter where the placeholder 0 sits; it can be anywhere in the number sequence. But the number zero can't really sit just anywhere on the number line, because it has a definite numerical value of its own. It is the number that separates the positive numbers from the negative numbers. It is an even number, and it is the integer that precedes one. Zero must sit in its rightful place on the number line, before one and after negative one. Nowhere else makes any sense. Yet zero sits at the end of the keyboard row and at the bottom of the telephone pad because we always start counting with one.
One seems like the appropriate place to start counting, but doing so forces us to put zero in an unnatural place. To other cultures, like the Mayan people of Mexico and Central America, starting with one didn't seem like the rational thing to do. In fact, the Mayans had a number system—and a calendar—that made more sense than ours does. Like the Babylonians, the Mayans had a place-value system of digits and places. The only real difference was that instead of basing their numbers on 60 as the Babylonians did, the Mayans had a vigesimal, base-20 system that had the remnants of an earlier base-10 system in it. And like the Babylonians, they needed a zero to keep track of what each digit meant. Just to make things interesting, the Mayans had two types of digits. The simple type was based on dots and lines, while the complicated type was based on glyphs—grotesque faces. To a modern eye, Mayan glyph writing is about as alien-looking as you can get (Figure 3).
Like the Egyptians, the Mayans also had an excellent solar calendar. Because their system of counting was based on the number 20, the Mayans naturally divided their year into 18 months of 20 days each, totaling 360 days. A special period of five days at the end, called Uayeb, brought the count to 365. Unlike the Egyptians, though, the Mayans had a zero in their counting system, so they did the obvious thing: they started numbering days with the number zero. The first day of the month of Zip, for example, was usually called the "installation" or "seating" of Zip. The next day was 1 Zip, the following day was 2 Zip, and so forth, until they reached 19 Zip. The next day was the seating of Zotz'—0 Zotz' followed by 1 Zotz' and so forth. Each month had 20 days, numbered 0 through 19, not numbered 1 through 20 as we do today. (The Mayan calendar was wonderfully complicated. Along with this solar calendar, there was a ritual calendar that had 20 weeks, each of 13 days. Combined with the solar year, this created a calendar round that had a different name for every day in a 52-year cycle.)
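The 52-year figure is where the two Mayan calendars realign: the ritual count repeats every 260 days (20 weeks of 13), the solar count every 365, so a day carries the same pair of names again only after the least common multiple of the two. A quick check, using only the day counts given above:

```python
from math import gcd

ritual = 20 * 13  # 260-day ritual calendar: 20 weeks of 13 days
solar = 365       # 18 months of 20 days, plus the 5 Uayeb days

# Least common multiple: first day on which both cycles restart together.
round_days = ritual * solar // gcd(ritual, solar)
print(round_days, round_days // solar)  # 18980 days = 52 solar years
```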
The Mayan system made more sense than the Western system does. Since the Western calendar was created at a time when there was no zero, we never see a day zero, or a year zero. This apparently insignificant omission caused a great deal of trouble; it kindled the controversy over the start of the millennium. The Mayans would never have argued about whether 2000 or 2001 was the first year in the twenty-first century. But it was not the Mayans who formed our calendar; it was the Egyptians and, later, the Romans. For this reason, we are stuck with a troublesome, zero-free calendar.
The Egyptian civilization's lack of zero was bad for the calendar and bad for the future of Western mathematics. In fact, Egyptian civilization was bad for math in more ways than one; it was not just the absence of a zero that caused future difficulties. The Egyptians had an extremely cumbersome way of handling fractions. They didn't think of 3/4 as a ratio of three to four as we do today; they saw it as the sum of 1/2 and 1/4. With the sole exception of 2/3, all Egyptian fractions were written as a sum of numbers in the form of 1/n (where n is a counting number)—the so-called unit fractions. Long chains of these unit fractions made ratios extremely difficult to handle in the Egyptian (and Greek) number systems.
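Unit-fraction sums like 3/4 = 1/2 + 1/4 can be generated by the classic greedy method—Fibonacci's algorithm, many centuries later than the Egyptians and not their actual procedure, but a compact way to see how the notation works:

```python
from fractions import Fraction
from math import ceil

def unit_fractions(frac):
    """Greedily split a fraction into a sum of distinct unit fractions 1/n."""
    parts = []
    while frac > 0:
        n = ceil(1 / frac)           # smallest n with 1/n <= frac
        parts.append(Fraction(1, n))
        frac -= Fraction(1, n)
    return parts

print(unit_fractions(Fraction(3, 4)))  # [Fraction(1, 2), Fraction(1, 4)]
```

Even for modest ratios the chains grow quickly, which is the cumbersomeness the text describes.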
Zero makes this cumbersome system obsolete. In the Babylonian system—with zero in it—it's easy to write fractions. Just as we can write 0.5 for 1/2 and 0.75 for 3/4, the Babylonians used the numbers 0;30 for 1/2 and 0;45 for 3/4. (In fact, the Babylonian base-60 system is even better suited to writing down fractions than our modern-day base-10 system.)
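The sexagesimal notation reads just like our decimals, with 60 in place of 10: each place after the ";" counts sixtieths, then thirty-six-hundredths, and so on. A small sketch of that place-value rule (my illustration, using the standard semicolon transcription of Babylonian fractions):

```python
from fractions import Fraction

def sexagesimal_value(places):
    """Value of the digits after the ';' -- e.g. [30] means 30/60."""
    return sum(Fraction(d, 60 ** (i + 1)) for i, d in enumerate(places))

print(sexagesimal_value([30]))     # 1/2, written 0;30
print(sexagesimal_value([45]))     # 3/4, written 0;45
print(sexagesimal_value([7, 30]))  # 1/8 = 7/60 + 30/3600, written 0;7,30
```

Because 60 has so many divisors (2, 3, 4, 5, 6, 10, 12, 15, 20, 30), far more fractions terminate cleanly in base 60 than in base 10—the sense in which the Babylonian system is better suited to fractions.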
Unfortunately, the Greeks and Romans hated zero so much that they clung to their own Egyptian-like notation rather than convert to the Babylonian system, even though the Babylonian system was easier to use. For intricate calculations, like those needed to create astronomical tables, the Greek system was so cumbersome that the mathematicians converted the unit fractions to the Babylonian sexagesimal system, did the calculations, and then translated the answers back into the Greek style. They could have saved many time-consuming steps. (We all know how much fun it is to convert fractions back and forth!) However, the Greeks so despised zero that they refused to admit it into their writings, even though they saw how useful it was. The reason: zero was dangerous.
The Fearsome Properties of Nothing
In earliest times did Ymir live:
was nor sea nor land nor salty waves,
neither earth was there nor upper heaven,
but a gaping nothing, and green things nowhere.
—The Elder Edda
It is hard to imagine being afraid of a number. Yet zero was inextricably linked with the void—with nothing. There was a primal fear of void and chaos. There was also a fear of zero.
Most ancient peoples believed that only emptiness and chaos were present before the universe came to be. The Greeks claimed that at first Darkness was the mother of all things, and from Darkness sprang Chaos. Darkness and Chaos then spawned the rest of creation. The Hebrew creation myths say that the earth was chaotic and void before God showered it with light and formed its features. (The Hebrew phrase is tohu v'bohu. Robert Graves linked tohu to Tehomot, a primal Semitic dragon that was present at the birth of the universe and whose body became the sky and earth. Bohu he linked to Behomot, the famed Behemoth monster of Hebrew legend.) The older Hindu tradition tells of a creator who churns the butter of chaos into the earth, and the Norse myth tells a tale of an open void that gets covered with ice, and from the chaos caused by the mingling of fire and ice was born the primal Giant. Emptiness and disorder were the primeval, natural state of the cosmos, and there was always a nagging fear that at the end of time, disorder and void would reign once more. Zero represented that void.
But the fear of zero went deeper than unease about the void. To the ancients, zero's mathematical properties were inexplicable, as shrouded in mystery as the birth of the universe. This is because zero is different from the other numbers. Unlike the other digits in the Babylonian system, zero never was allowed to stand alone—for good reason. A lone zero always misbehaves. At the very least it does not behave the way other numbers do.
Add a number to itself and it changes. One and one is not one—it's two. Two and two is four. But zero and zero is zero. This violates a basic principle of numbers called the axiom of Archimedes, which says that if you add something to itself enough times, it will exceed any other number in magnitude. (The axiom of Archimedes was phrased in terms of areas; a number was viewed as the difference of two unequal areas.) Zero refuses to get bigger. It also refuses to make any other number bigger. Add two and zero and you get two; it is as if you never bothered to add the numbers in the first place. The same thing happens with subtraction. Take zero away from two and you get two. Zero has no substance. Yet this substanceless number threatens to undermine the simplest operations in mathematics, like multiplication and division.