Quick Answer: The Law of Large Numbers states that as a random experiment is repeated more and more times, the observed average result gets closer and closer to the expected (theoretical) result. After 10 coin flips you might get 70% heads, but after 10,000 flips you will almost certainly be very close to 50%.
What Is the Law of Large Numbers?
The Law of Large Numbers (LLN) is one of the fundamental theorems of probability and statistics. A version of it was first proved by the Swiss mathematician Jacob Bernoulli and published posthumously in his Ars Conjectandi in 1713. It states that as the number of independent, identically distributed trials increases, the sample average converges to the expected value.
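To see the convergence concretely, here is a minimal Python sketch (the fixed seed and the chosen flip counts are illustrative, not from the article) that simulates a fair coin and prints the proportion of heads as the number of flips grows:

```python
# Minimal LLN demonstration, assuming a fair coin (probability of heads = 0.5).
# The observed proportion of heads drifts toward 0.5 as the flip count grows.
import random

random.seed(42)  # fixed seed so the run is reproducible

for n in (10, 100, 1_000, 10_000, 100_000):
    heads = sum(random.random() < 0.5 for _ in range(n))
    print(f"{n:>7} flips: {heads / n:.4f} proportion of heads")
```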
Illustrated With Coin Flips
| Number of Flips | Typical Heads % | Distance from 50% |
|---|---|---|
| 10 | 30–70% | Up to ±20% |
| 100 | 42–58% | Typically ±8% |
| 1,000 | 48–52% | Typically ±2% |
| 10,000 | 49.5–50.5% | Typically ±0.5% |
| 1,000,000 | ≈50.000% | ≈0% |
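The shrinking spread in the table follows from the standard error of a proportion, which for a fair coin is sqrt(0.25 / n). The table's ranges are rounded "typical" figures rather than exact bounds, but a short sketch of this formula shows the same 1/√n pattern:

```python
# Rough check of the table above: the standard error of the heads proportion
# for a fair coin is sqrt(0.5 * 0.5 / n), so the typical spread around 50%
# shrinks like 1 / sqrt(n) as the number of flips grows.
import math

for n in (10, 100, 1_000, 10_000, 1_000_000):
    se = math.sqrt(0.25 / n)
    print(f"{n:>9} flips: typical spread about ±{se:.2%} around 50%")
```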
What the Law Does NOT Say
- It does NOT say that individual short sequences will be balanced (they often are not)
- It does NOT mean that tails is "due" after many heads; that mistaken belief is the Gambler's Fallacy (see the sketch after this list)
- It does NOT apply to single trials — only to averages over many trials
- It describes long-run averages, not short-run outcomes
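To illustrate the Gambler's Fallacy point, the following sketch (the seed, sample size, and five-flip streak length are illustrative choices) simulates a long run of fair flips and checks the flips that come immediately after five heads in a row; they still land heads about half the time:

```python
# Gambler's Fallacy check: even immediately after a run of 5 heads,
# the next flip is still heads roughly 50% of the time.
import random

random.seed(0)
flips = [random.random() < 0.5 for _ in range(1_000_000)]  # True = heads

after_streak = []  # outcomes of flips that directly follow 5 heads in a row
for i in range(5, len(flips)):
    if all(flips[i - 5:i]):          # the previous five flips were all heads
        after_streak.append(flips[i])

print(f"flips observed after a 5-head streak: {len(after_streak)}")
print(f"heads rate on those flips: {sum(after_streak) / len(after_streak):.3f}")
```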
Real-World Applications
The Law of Large Numbers is the mathematical foundation of the insurance industry. Insurers cannot predict whether any individual person will make a claim, but over millions of policies they can predict the average claim rate with high accuracy. The same principle applies to casinos, which cannot predict individual game outcomes but can predict overall revenue reliably across millions of games.
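As a hedged illustration of the insurance example, the sketch below uses a made-up 2% per-policy claim probability (not a real industry figure) and shows how the pooled claim rate settles near that value as the number of policies grows:

```python
# Illustrative insurance pool: each policyholder independently files a claim
# with a hypothetical probability of 2%. Single policies are unpredictable,
# but the pooled claim rate converges toward the assumed 2%.
import random

random.seed(1)
CLAIM_PROB = 0.02  # hypothetical per-policy claim probability

for n_policies in (100, 10_000, 1_000_000):
    claims = sum(random.random() < CLAIM_PROB for _ in range(n_policies))
    print(f"{n_policies:>9} policies: observed claim rate {claims / n_policies:.4f}"
          f" (assumed {CLAIM_PROB:.4f})")
```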
Strong vs Weak Law of Large Numbers
There are technically two versions of the LLN: the Weak Law (the sample average converges to the mean in probability) and the Strong Law (the sample average converges to the mean almost surely). For most practical purposes the distinction is minor: both confirm that large samples behave predictably even when small samples are highly variable.
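For readers who want the formal statements, here is one standard way to write both laws, where X̄ₙ denotes the average of n independent, identically distributed samples with finite mean μ:

```latex
% Weak Law of Large Numbers: convergence in probability
\lim_{n \to \infty} \Pr\!\left( \left| \bar{X}_n - \mu \right| > \varepsilon \right) = 0
\quad \text{for every } \varepsilon > 0.

% Strong Law of Large Numbers: almost-sure convergence
\Pr\!\left( \lim_{n \to \infty} \bar{X}_n = \mu \right) = 1.
```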