How do you multiply using Roman numerals? How would you write the number 10,030 without using zero? A compact, place-based (or positional) number system with a symbol for zero opens the floodgates for arithmetic calculations and the discovery of new numbers.
With only 10 symbols, we have the machinery to describe new numbers that grow beyond our imagination. Here, we’ll explore the origins of zero and the development of our modern decimal system. With a powerful positional number system in place, humankind was finally equipped with the tools necessary to begin the development of modern mathematics.
Let’s begin with the downside of the ancient additive systems. Most of them required the repetition of symbols. For example, the Roman numeral XXIII equals 23: you add up the two Xs (10 each) and the three Is. The Babylonians likewise used dovetail and nail marks, which they would add up. Although computation with additive systems was fast using tools such as the abacus, those systems required longer and longer strings of symbols to denote larger and larger numbers, and this was a real problem in practice.
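As a modern aside, the contrast between the two kinds of system can be sketched in a few lines of Python. This is an illustrative sketch only: it handles purely additive Roman forms such as XXIII, not later subtractive forms such as IV, and the function names are my own.

```python
# Fixed symbol values in the Roman additive system.
ROMAN_VALUES = {"I": 1, "V": 5, "X": 10, "L": 50, "C": 100, "D": 500, "M": 1000}

def additive_value(numeral: str) -> int:
    """Additive system: each symbol has a fixed value; the total is their sum."""
    return sum(ROMAN_VALUES[symbol] for symbol in numeral)

def positional_value(digits: str) -> int:
    """Positional system: a digit's value depends on its place (d * 10**position)."""
    total = 0
    for digit in digits:
        total = total * 10 + int(digit)
    return total

print(additive_value("XXIII"))    # two Xs and three Is sum to 23
print(positional_value("10030"))  # zero marks the empty places in 10,030
```

Notice that the positional version needs only ten digit symbols to name any number, while the additive version needs ever more marks as numbers grow; that economy is exactly what the zero placeholder buys.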
This is a transcript from the video series Zero to Infinity: A History of Numbers. Watch it now, on The Great Courses.
Slow Progress for Heaps of Numbers
Additive systems made it difficult to tackle more arithmetically complicated questions and thus slowed the progress of the study of numbers. Moving to what we call a positional system required a new number, and that inspired a philosophical question: How many items do you see in an empty box? Is your answer a number? This is the question of zero. In the Rhind Papyrus from 1650 B.C.E., the scribe Ahmes referred to numbers as “heaps.” This tradition continued through the Pythagoreans, who in the 6th century B.C.E. viewed numbers as “a combination or heaping of units.”
Even Aristotle defined number as an accumulation or heap, and the word “three” derives from the Anglo-Saxon word throp, again meaning “pile” or “heap.” Because we can’t have a heap of zero objects (with zero objects, there would be no heaping at all), zero was not viewed as a number. The notion of zero as a quantity made no sense to people thinking in terms of heaps, and this lack of zero caused many challenges.

A careless Sumerian scribe could create ambiguities because, in cuneiform, different spacing between symbols can represent different numbers. The Egyptian system, on the other hand, did not require a placeholder like zero, but its additive notation was cumbersome: all the symbols were written together and had to be added up. As a result, over the 2,000 years of the Egyptian numeral system, very little progress was made in arithmetic or, more generally, in mathematics. It’s interesting to see how much the notation drives our understanding, our intuition, and our further quest to consider number.
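The scribe's ambiguity problem can be made concrete with a minimal sketch (not actual cuneiform) of base-60 place values. The point is that two unit marks mean different numbers depending on which places they occupy, and only a zero-like placeholder makes an empty middle place explicit.

```python
def base60_value(digits):
    """Evaluate a list of base-60 digits, most significant first."""
    total = 0
    for d in digits:
        total = total * 60 + d
    return total

print(base60_value([1, 1]))     # 1*60 + 1 = 61
print(base60_value([1, 0, 1]))  # 1*3600 + 0*60 + 1 = 3601
# Without a symbol for the empty place, both numbers would be written
# as just two unit wedges, and only spacing would hint at which was meant.
```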
An Empty Placeholder Appears
Zero first appeared as an empty placeholder rather than a number. The Babylonians had a symbol for zero by 300 B.C.E. It was a placeholder rather than a number because, again, they were thinking in terms of heaps, but they needed to distinguish between numbers. The Mayans also had an eye-shaped symbol for zero, which they likewise used only as a placeholder. The evolution of the symbol for zero is very difficult to chart. The modern symbol “0” may have arisen from the use of sand tables for calculation, in which pebbles were placed and moved back and forth for addition and subtraction. When a pebble was removed, it left an indentation, a dimple in the sand, which resembles the “0” we see today. In fact, calculations performed on sand tables may have led to the development of place-based number systems.
The Birth of the Zero
Later, in the 2nd century C.E., Ptolemy used the Greek letter omicron, which looks like an “O,” to denote “nothing.” This is the circular symbol for zero, the “0” that we see today. But I want to make it very clear that Ptolemy did not view this as a number, merely as the idea of nothing. Still, you can see that these things were slowly coming together. Zero as a number most likely first emerged in India.
By the 7th century, the Indian astronomer Brahmagupta offered a treatment of negative numbers and understood zero as a number, not just as a placeholder. He even studied 0 divided by 0 and 1 divided by 0, concluding erroneously that 0 divided by 0 equals 0, while remaining unsure what to make of 1 divided by 0.
Here again we see a couple of things. First, we know today that we can’t divide by 0: dividing by 0 does not yield a number, so it takes us outside the realm of number entirely. No dividing by 0, as we learn in school. But we also see a wonderful thing. Brahmagupta, this very important, great mind, was making a mistake, and that is something to be celebrated rather than to feel embarrassed about. He didn’t get it quite right; that’s okay, because his contributions were enormous. So finally, humankind expanded its view of number to include and embrace zero.
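Modern arithmetic settles Brahmagupta's questions by refusing to answer them: division by zero is simply left undefined. A small Python sketch (the helper name `try_divide` is my own) shows the behavior.

```python
def try_divide(a, b):
    """Return a / b, or the string 'undefined' when b is zero."""
    try:
        return a / b
    except ZeroDivisionError:
        return "undefined"

print(try_divide(1, 0))  # undefined: 1/0 yields no number
print(try_divide(0, 0))  # undefined: 0/0 is not 0, contrary to Brahmagupta
print(try_divide(6, 3))  # 2.0
```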
From Empty to Nothing to Zero
A few words about this “nothing” number in terms of language: from the 6th to the 8th centuries, Sanskrit used śūnya, meaning “empty,” to represent zero as we think of it. By the 9th century, Arabic had ṣifr. Thirteenth-century Latin had zephirum, 14th-century Italian had zefiro, and by 15th-century English we have “zero.” So we can see the evolution of just that word.
Because of zero’s power in computation, some viewed it as mysterious and nearly magical. As a result, the word zero has the same origins as another word that means “a hidden or mysterious code,” and that word, of course, is “cipher.” We can see that “cipher” actually came from the mysterious qualities that zero possessed in the eyes of our ancestors.
Common Questions About the Number Zero
Who invented the number zero?
While zero was used as a placeholder for millennia before, the number zero is generally credited to Brahmagupta around the year 628 C.E., though this remains largely scholarly conjecture.
Is zero a natural number?
Zero sits on the number line between 1 and −1, and under the set-theoretic convention it is counted among the natural numbers. However, since numbers are used to count and zero cannot count anything, some conventions exclude it from the natural numbers.
Is zero positive or negative?
Zero is neither positive nor negative: unlike 1 or −1, it cannot be larger or smaller than itself. In set-theoretic terms, zero belongs to the set of non-negative numbers but not to the set of positive numbers. Zero is unique.