Every leading website is using A/B testing today. Why? Well, because they know everybody else is doing it too. Oh yeah, and they also read that article on the Internet claiming you can get a +258% lift in conversion rate by changing the blue button on your homepage to a red one. But the truth is a bit different. A lot of digital marketing specialists do A/B testing but get flawed results. Even the almighty Google got flawed results when it started A/B testing 15 years ago. Yes, you heard me. Even Google got it wrong.
Here are some of the most basic mistakes beginners make.
1. You don't have enough traffic
To start using A/B testing, you really need enough traffic. Otherwise, your results will carry too wide a margin of error to reach statistical significance. As a rule of thumb, the higher the traffic and/or the conversion count, the faster you can reach a good confidence level on your test result. In other words, the conversion gain you see in the test result will be closer to the actual gain you observe later, when the winning variation goes into production. In practice, for a B2C website, we recommend a minimum traffic of 100 000 unique monthly visitors. For B2B, 35 000 unique monthly visitors is a reasonable threshold to keep in mind. And to be accurate, your tests should run on your whole traffic... unless you have millions of visitors every month.
You also need at least 50 conversions per variation, and the test should last longer than a week. Your visitors' behavior changes all the time, depending on the weather, weekdays versus weekends, holidays... so make sure your tests last long enough to cover all your users, and that no special event skewed the data while they ran.
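To see why those minimums matter, here is a minimal sketch of the kind of significance check A/B testing tools run under the hood: a two-proportion z-test, written with only Python's standard library. The visitor and conversion numbers are hypothetical, purely for illustration.

```python
# Minimal sketch of a two-proportion z-test for an A/B test result.
# All numbers below are hypothetical, for illustration only.
from math import sqrt, erf

def ab_test_p_value(visitors_a, conv_a, visitors_b, conv_b):
    """Two-sided p-value for the difference between two conversion rates."""
    rate_a = conv_a / visitors_a
    rate_b = conv_b / visitors_b
    # Pooled conversion rate under the null hypothesis (no real difference)
    pooled = (conv_a + conv_b) / (visitors_a + visitors_b)
    se = sqrt(pooled * (1 - pooled) * (1 / visitors_a + 1 / visitors_b))
    z = (rate_b - rate_a) / se
    # Standard normal CDF, expressed via the error function
    phi = lambda x: 0.5 * (1 + erf(x / sqrt(2)))
    return 2 * (1 - phi(abs(z)))

# Control: 10 000 visitors, 500 conversions (5.0%)
# Variant: 10 000 visitors, 560 conversions (5.6%)
p = ab_test_p_value(10_000, 500, 10_000, 560)
print(round(p, 3))  # ~0.058
```

Notice that even with 10 000 visitors per variation, a 12% relative lift still misses the usual p < 0.05 bar. With low traffic, the noise simply drowns out the signal.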
2. You start testing with complicated tests
For your first tests ever, start simple. Being successful at A/B testing is all about process, so it's important that you first go through the motions. See how theory compares with reality, what works for you and what doesn't, and where you run into problems, be it implementing the tests, coming up with ideas, analyzing the results, etc. Think about how you'll scale your testing practice, or whether you'll need new hires, for example. Starting with A/B testing is a lot like taking up weight training seriously: you don't start with maximum weight and complicated exercises.
Exactly the same goes for A/B testing. You start simple, focus all your attention on each step of the process, set up safeguards and adjust as you go, so you don't have to worry about it later. Another benefit is that you'll get quick wins. Getting bummed out when your first tests fail (and most do, even when run by the best experts out there) is often why people give up, or struggle to convince their boss and colleagues that split testing is indeed worth the time and investment. Diving into complicated tests right away, you could get overwhelmed, collect flawed results and get discouraged. Quick wins, on the other hand, create momentum and rally people to the practice inside your team or company. So start with simple tests, like removing distractions on key pages: is this slider really necessary, or all these extra buttons? To find out which elements will have the greatest impact, talk to a consultant with A/B testing experience.
3. You ignore small lifts
"Go big or go home." It's true that you should focus your tests on high-impact changes first. But that doesn't mean you can ignore small effects. Why? Because of maths, that's why.
Let's take a quick example. If a test shows a variation winning with 5% more conversions, and you get a similar result every month, that compounds to roughly an 80% improvement over a year (1.05^12 ≈ 1.8). Small changes add up! Also, the more you test, the more you improve your website, and the less frequently you'll see big results. Don't be sad if you only get small lifts; it can simply mean your website is already good. Big lifts are pretty rare on a "normal" site. Small gains will pay off big over time, so don't be disheartened.
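The compounding arithmetic behind that claim is easy to check for yourself; a quick sketch (the 5% monthly lift is just the example figure from above, and we assume each month's lift applies on top of the previous month's baseline):

```python
# A 5% conversion lift, compounded every month for a year
monthly_lift = 0.05
yearly_gain = (1 + monthly_lift) ** 12 - 1
print(f"{yearly_gain:.0%}")  # prints "80%"
```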
Geber offers a full range of digital marketing and consulting services.