The goal of probability is to describe what is to be expected from randomness. But randomness can be confusing to the human mind, because its expression is often quite different from what we expect it to be.
Imagine an experiment in randomness. Take a coin and flip it 200 times, and each time record whether it comes up heads or tails, putting down Hs for the heads and Ts for the tails. Now, suppose you ask a person to just write down a random list of 200 Hs and Ts, and you put both lists up on a blackboard, one made by actually flipping a coin, and the other made by a human. Even though they may both look like an ocean of Hs and Ts, there is a way to tell which one is truly random, and which is human-generated.
Learn more: Our Random World—Probability Defined
The thing to do is look for long strings where there are all Hs in a row or all Ts in a row. In the 200 Hs and Ts generated by actually flipping a coin, you might see at least four or five long strings of Hs or Ts: six Hs in a row here, five Ts in a row there.
Now consider the list generated by the human being. How often will a human being write strings of more than four of the same letter in a row when they're trying to be random? Well, people resist this, because they don't think it looks very random. They think you've got to sort of alternate, H-T-H-T, and so in a human-generated list you would see very few long strings of Hs or Ts in a row.
As a matter of fact, when you flip a coin 200 times, the probability of having at least one string of six or longer of Hs or Ts is roughly 96 percent—very likely. The probability of having at least one string of five is 99.9 percent—it’s essentially certain. You’d be very unlikely to flip a coin that many times without getting these long strings, and if you actually simulate this on the computer, you’ll see that this plays out, that you just almost always get long strings.
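You can check these figures with a quick simulation. This is a minimal sketch (the function names, trial counts, and seed are mine, chosen for illustration): flip 200 fair coins, measure the longest run, and repeat to estimate the probability of seeing a run of a given length.

```python
import random

def max_run(flips):
    """Length of the longest run of identical outcomes."""
    best = cur = 1
    for prev, nxt in zip(flips, flips[1:]):
        cur = cur + 1 if nxt == prev else 1
        best = max(best, cur)
    return best

def prob_run_at_least(k, n_flips=200, trials=10_000, seed=0):
    """Estimate P(longest run >= k) in n_flips fair-coin flips."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        flips = [rng.randrange(2) for _ in range(n_flips)]
        if max_run(flips) >= k:
            hits += 1
    return hits / trials

print(prob_run_at_least(6))  # close to 0.96, as the text says
print(prob_run_at_least(5))  # nearly 1 (the text's 99.9 percent)
```

Run it a few times with different seeds and the estimates barely move: long streaks really are the norm, not the exception.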
Expectations of Randomness: An Experiment
One of the common misconceptions that a lot of people have about randomness is illustrated by the coin flipping experiment. Let’s say that you flip a coin many times, and just randomly it happened that 10 times in a row you got heads. Well, doesn’t it seem like the next time it’s more apt to be a tails? It does to most people. And the answer is, of course, that the coin doesn’t know what it’s just done. To the coin, every flip is a new flip, and it’s just as likely to come up heads as tails after 10 heads in a row as it was on the very first flip.
To demonstrate this, you can simulate the following experiment. Take a coin and, more than a million times, flip it 11 times. Obviously you do this with a computer. Computers are great, by the way; they don’t care. A million times, they’ll just go ahead and do it. To make the arithmetic easy, you actually run 1,024,000 trials of 11 flips, because the probability of getting 10 heads in a row is 1 in 1,024 (one-half multiplied by itself 10 times). In other words, if you run 1,024,000 trials, flipping the coin 11 times in each, you expect the first 10 flips to all be heads about 1,000 times.
Learn more: Probability Is in Our Genes
So you run the computer simulation a first time, and the number of times you get 10 heads in the first simulation is 1,008: extremely close to 1,000. What happened to the 11th coin? Well, 521 times it turned out to be a head also, and 487 times it turned out to be a tail. There’s no memory. Approximately half the time heads, half the time tails.
If you do it again, the first 10 might be heads 983 times, and then the 11th flip heads 473 times and tails 510 times. During a third experiment, 1,031 times it came out heads 10 times in a row, and of those, 502 had the next coin be a heads, and 529 a tails. The coin has no memory. After it’s gotten 10 heads in a row, it’s just as likely to be heads the next time as it was the first time you flipped that coin.
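The whole experiment fits in a few lines of Python. This is a sketch under my own naming and seeding choices, not the author's original code: it runs 1,024,000 trials, counts the ones where the first 10 flips are all heads, and tallies what the 11th flip did in those cases.

```python
import random

def no_memory_experiment(reps=1_024_000, seed=1):
    """Count trials whose first 10 flips are all heads,
    and how often the 11th flip is heads in those trials."""
    rng = random.Random(seed)
    ten_heads = heads_after = 0
    for _ in range(reps):
        first_ten = [rng.randrange(2) for _ in range(10)]
        if all(f == 1 for f in first_ten):   # 1 = heads
            ten_heads += 1
            heads_after += rng.randrange(2)  # the 11th flip
    return ten_heads, heads_after

runs, heads11 = no_memory_experiment()
print(runs)            # close to 1,000 (= 1,024,000 / 1,024)
print(heads11 / runs)  # close to 0.5: the 11th flip has no memory
```

Different seeds give numbers scattered around 1,000 and around one-half, just like the 1,008, 983, and 1,031 runs described above.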
What Exactly Is Rare?
There is another counterintuitive aspect of probability, and it’s really interesting to think about what is rare, and how we view rarity in probability. Suppose you got dealt the following hand: the two of spades, the nine of spades, the jack of clubs, the eight of spades, and the five of hearts. It probably doesn’t strike you as an impressive hand, one you’d write home about, but in one sense it is: the probability of being dealt exactly that hand is 1 out of 2,598,960.
Now if you were dealt the ace, king, queen, jack, ten of spades—a royal flush in spades—what’s the probability of getting this royal flush in spades? Exactly the same—1 out of 2,598,960—and yet you would write home to your mother about this hand for sure. Your previous hand was just an average hand, and yet in your whole life of playing cards, you know what? You will probably never get that hand again, because its probability is almost zero—1 out of 2,598,960. So this is one of the counterintuitive concepts of probability: that rare events happen all the time, but you may not recognize them as significant.
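The 2,598,960 figure is nothing mysterious: it is just the number of ways to choose 5 cards from a 52-card deck, which you can check directly.

```python
from math import comb

# Number of distinct 5-card hands from a 52-card deck:
# "52 choose 5" counts every possible hand exactly once.
hands = comb(52, 5)
print(hands)  # 2598960
```

Every specific hand, junk or royal flush, is one of those 2,598,960 equally likely possibilities, which is why their probabilities are identical.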
Learn more: Probability Everywhere
Rare events absolutely happen by chance alone. The most common rare event that you see mentioned in the newspapers every day is the lottery. The probability of winning Powerball, the big multistate lottery, is approximately 1 out of 146,000,000. That chance is so remote you’d think it would never happen; but it happens regularly. Why? Because a lot of people try. A lot of people buy random numbers, and occasionally some of them win. If you try something that’s rare often enough, it will actually come to pass.
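The 1-in-146,000,000 figure can be reconstructed the same way as the poker odds, assuming the Powerball format of that era: 5 white balls drawn from 55, plus 1 red Powerball drawn from 42. (That format is my assumption; the drawing rules have changed several times over the years.)

```python
from math import comb

# Assuming the 2005-era format: choose 5 of 55 white balls,
# then 1 of 42 red Powerballs. Multiplying counts every
# distinct ticket combination once.
odds = comb(55, 5) * 42
print(odds)  # 146107962, roughly 1 in 146 million
```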
This concept—that rare things will actually happen if you repeat them enough and you look for them enough—was encapsulated in an observation that was first made by the astronomer Sir Arthur Eddington in 1929, and he was describing some features of the second law of thermodynamics. He wrote the following:
If I let my fingers wander idly over the keys of a typewriter it might happen that my screed made an intelligible sentence. If an army of monkeys were strumming on typewriters they might write all the books in the British Museum. The chance of their doing so is decidedly more favourable than the chance of the molecules returning to one half of the vessel.
The Bible Code Hoax
However, you can find patterns in random writing, and in fact an enterprising author made a lot of money a few years ago when he wrote The Bible Code. What the author did was take the Bible, written in Hebrew, and look for words spelled out by skipping a fixed number of letters: take every nth letter, and sometimes those letters spell out a word. One example was “Atomic holocaust Japan 1945.” He said that this was an example of how the Bible showed the future.
The truth is that this is just a matter of probability. If you consider all possible starting points and skip lengths, you can find surprising things by randomness alone, and just to demonstrate it, people debunking this analysis found similar patterns in War and Peace and other books. This is another challenging part of probability: if you look for rare things and you have a lot of places to look, you’ll tend to find them.
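A toy version of this search, often called an equidistant letter sequence, is easy to write. This sketch (function name and skip limit are mine) scans any text for a word whose letters sit a fixed number of positions apart, and finds "hidden" words even in a perfectly ordinary English sentence.

```python
def els_find(text, word, max_skip=50):
    """Find `word` as an equidistant letter sequence: its letters
    at positions start, start+skip, start+2*skip, ...
    Returns (start, skip) for the first match, or None."""
    text = "".join(ch for ch in text.lower() if ch.isalpha())
    word = word.lower()
    for skip in range(1, max_skip + 1):
        for start in range(len(text)):
            if start + skip * (len(word) - 1) >= len(text):
                break
            if all(text[start + i * skip] == word[i]
                   for i in range(len(word))):
                return start, skip
    return None

# Even an ordinary sentence hides "words" once skips are allowed:
sample = "the quick brown fox jumps over the lazy dog" * 20
print(els_find(sample, "tub"))  # (0, 4): t, u, b four letters apart
```

With many starting points and many skip lengths to try, finding some word somewhere is not a miracle; it is close to a certainty.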
These are some of the challenges of looking at and asking what is random in the world.