Quick Answer: A/B testing is when a website randomly shows half its users Version A and the other half Version B, then measures which one performs better statistically. Amazon, Netflix, and Google run thousands of A/B tests daily to optimize every pixel on your screen.
The $300 Million Button
Tech companies don't guess; they test. A famous e-commerce site realized its "Register" button was stopping people from checking out. They A/B tested replacing it with a "Continue" button that let shoppers check out as guests. Completed purchases rose 45%, reportedly worth an extra $300 million in revenue that year. A single data-driven A/B test changed the company.
How Randomization Powers A/B Tests
The core of an A/B test is random assignment. When you visit a website, an algorithm flips a virtual coin. Heads: you see a blue checkout button. Tails: you see a green one. Because users are divided at random, differences between the two groups (age, location, device, habits) average out. The only systematic difference left is the button color, so any gap in performance can be attributed to it: true cause and effect.
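In practice, the "coin flip" is usually deterministic: the site hashes a stable user ID together with an experiment name, so a returning visitor always lands in the same group. Here is a minimal sketch of that idea; the experiment name, variant labels, and 50/50 split are illustrative assumptions, not any particular company's system.

```python
import hashlib

def assign_variant(user_id: str, experiment: str = "checkout_button_color") -> str:
    """Deterministically assign a user to variant A or B.

    Hashing (experiment, user_id) gives a stable, effectively random
    bucket in 0-99, so the same user always sees the same version.
    """
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    bucket = int(digest, 16) % 100      # map the hash to a bucket 0-99
    return "A_blue" if bucket < 50 else "B_green"  # 50/50 split

# The same visitor always gets the same variant:
print(assign_variant("user-12345"))
print(assign_variant("user-12345"))  # identical to the line above
```

Because the hash output is uniformly distributed, buckets 0-49 and 50-99 each capture about half of all users, giving the random split the test depends on.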
Google's 41 Shades of Blue
Google once famously couldn't decide which shade of blue to use for their search result links. The designers argued. The engineers stepped in and A/B tested 41 different shades of blue on 1% of live traffic. The highest-converting shade was selected, reportedly earning the company an extra $200 million a year in ad clicks.
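Picking the winner among variants comes down to a statistical comparison of conversion rates. A common tool is the two-proportion z-test, sketched below with made-up numbers (the traffic counts and click totals are illustrative, not Google's data):

```python
from math import sqrt, erf

def two_proportion_z_test(clicks_a, views_a, clicks_b, views_b):
    """Compare two click-through rates; return (z-score, two-sided p-value)."""
    p_a, p_b = clicks_a / views_a, clicks_b / views_b
    p_pool = (clicks_a + clicks_b) / (views_a + views_b)  # pooled rate
    se = sqrt(p_pool * (1 - p_pool) * (1 / views_a + 1 / views_b))
    z = (p_b - p_a) / se
    # two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical experiment: variant B's shade gets slightly more ad clicks.
z, p = two_proportion_z_test(clicks_a=1000, views_a=50000,
                             clicks_b=1120, views_b=50000)
print(f"z = {z:.2f}, p = {p:.4f}")
```

A small p-value (conventionally below 0.05) means the difference is unlikely to be a coincidence of the random split, which is what lets a company justify rolling the winning shade out to everyone.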