5 Ways to Screw Up Your A/B Test

By Igor Belogolovsky, July 31st, 2011 in A/B Testing

However much they espouse the power and value of A/B testing, many business owners still get all twisted up in their own opinions. And it’s understandable. When a website is your baby, it’s difficult not to.

But, as with a handful of other ailments, admitting you have a problem is the first step to recovery. You test (rather than decree) in order to nail down the most comfortable user experience for prospective customers, to make sure you’re not missing a big opportunity, and ultimately to create an environment most ripe for conversion.

I’ve seen fundamentally excellent A/B tests go awry in a handful of ways, and I’m going to outline some of them for you:

Screw-Up #1: Testing minutia instead of concepts

So you’ve got a brand new homepage. Don’t launch it and test whether the green button converts better than the blue one. You’ll get there – but first – think in broad strokes.

Test big-picture concepts. Test the Yahoo-style homepage (lots of information with a portal feel) against the Google-style homepage (you can do one thing, and one thing only). Test the big pretty picture against an all-text benefits statement. Then, once you’ve optimized the big stuff, work your way down to colors, fonts and photos.

Screw-Up #2: Getting caught up in opinions

Remember: if you feel strongly that A will outperform B – that’s just, like, your opinion, man. And opinions are like belly buttons. Everybody has one.

Your opinion doesn’t matter, so don’t spend time debating which version will perform better.

What matters is that you thoughtfully choose a few concepts that are substantially different from one another, test methodically and let the numbers speak for themselves.

It’s not about the highest paid person’s opinion. That’s what April Fool’s is for.

Screw-Up #3: Not verifying that your results are statistically conclusive

Your test might look like one version is clearly winning, but don’t stop your test until you verify the difference is statistically significant. Put away that TI-83 and try this handy tool.
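If you’d rather see the arithmetic such tools run for you, here’s a minimal sketch. It assumes a standard two-sided, two-proportion z-test; the function name and the visitor/conversion counts are made up for illustration:

```python
import math

def ab_significance(conversions_a, visitors_a, conversions_b, visitors_b):
    """Two-sided, two-proportion z-test for an A/B test (illustrative sketch)."""
    p_a = conversions_a / visitors_a
    p_b = conversions_b / visitors_b
    # Pooled rate under the null hypothesis that A and B convert identically
    p_pool = (conversions_a + conversions_b) / (visitors_a + visitors_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / visitors_a + 1 / visitors_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Made-up numbers: A converts 210 of 4,000 visitors, B converts 252 of 4,000
z, p = ab_significance(210, 4000, 252, 4000)
print(f"z = {z:.2f}, p = {p:.3f}")  # keep testing if p is above your threshold (e.g. 0.05)
```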

Screw-Up #4: Comparing one week against the next

Countless times, I’ve heard CEOs and VPs at various startups suggest this: “We changed up the site on May 1, and the numbers over the last week are better than what we saw in the week leading up to the change.”

This is not an A/B test.

It might be meaningful, but it’s not methodical. If you want to feel good about the validity of your test results, don’t compare one week against the next. Use Google Website Optimizer or a similar tool — there are plenty — and alternate your new version against the control.
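For the curious, “alternating” in these tools usually amounts to randomly but consistently assigning each visitor to a version, so both versions run over the same days and the same traffic mix. Here’s a minimal sketch of one common approach, deterministic hash-based bucketing; the experiment name, visitor IDs and 50/50 split are hypothetical, not anything prescribed by a particular tool:

```python
import hashlib

def assign_variant(visitor_id: str, experiment: str = "homepage-redesign") -> str:
    """Deterministically bucket a visitor into 'control' or 'variant'.

    Hashing the experiment name plus visitor ID means the same person always
    sees the same version, and both versions run concurrently over the same days.
    """
    digest = hashlib.md5(f"{experiment}:{visitor_id}".encode()).hexdigest()
    bucket = int(digest, 16) % 100                    # a number from 0 to 99
    return "control" if bucket < 50 else "variant"    # 50/50 split

# Same visitor, same answer, every time it's called:
print(assign_variant("visitor-8271"))
print(assign_variant("visitor-8271"))
```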

Screw-Up #5: Not measuring the entire funnel

It’s true that your A/B test should have a single “goal” outcome: the user clicks to see the next page, submits a registration form, downloads a white paper, etc.

However: you would be wise to measure the impact of your test on the entire conversion funnel. The experience your users have with the page content you’re testing could very well influence them further down Conversion Drive.

Let’s take a very simple example: you’re testing using the word “free” vs. steering clear of it. Your landing page might get more clicks, suggesting the “free” test is successful — yet your users suddenly appear less likely to convert to a purchase. By using the word “free” you may have created, for the test group, a different expectation than that of the control group.

If you don’t measure the behavior of each user group, test and control, all the way down the conversion funnel, you’re running blind.
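To make that concrete, here’s a small, entirely hypothetical sketch of tallying each funnel stage per variant from an event log. With data shaped like this, you’d see the “free” variant winning clicks and sign-ups while losing purchases:

```python
from collections import defaultdict

# Hypothetical event log: (visitor_id, variant, funnel_stage)
events = [
    ("u1", "free_copy", "landing_click"), ("u1", "free_copy", "signup"),
    ("u2", "free_copy", "landing_click"),
    ("u4", "free_copy", "landing_click"), ("u4", "free_copy", "signup"),
    ("u3", "control",   "landing_click"), ("u3", "control",   "signup"),
    ("u3", "control",   "purchase"),
    ("u5", "control",   "landing_click"),
]

# Unique visitors who reached each stage, per variant
reached = defaultdict(set)
for visitor, variant, stage in events:
    reached[(variant, stage)].add(visitor)

for variant in ("control", "free_copy"):
    print(variant)
    for stage in ("landing_click", "signup", "purchase"):
        print(f"  {stage}: {len(reached[(variant, stage)])} visitors")
```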

Smashing Magazine published a pretty good “Ultimate Guide to A/B Testing” last June with some tool suggestions and a handful of surprising test results. If you’re really tickled by that sort of thing, be sure to follow Which Test Won.

– Igor Belogolovsky

This is a guest post, entered in the 2011 Unbounce Conversion Fest Blogging Contest. All opinions are those of the author.

Igor Belogolovsky is a revenue optimization expert and cofounder of Clever Zebo, a group of web marketing strategy experts dedicated to helping interesting businesses get more customers online. While Igor loves to geek out about search engine marketing, conversion funnel optimization and partnership development, he’s also a pretty fierce snowboarder, musician and beer taster.

Comments

  1. Yomar Lopez says:

    You’re absolutely right about focusing on minuscule changes. It may be easier but it defeats the purpose of split-testing pages aimed at conversion, regardless of what you call them (conversion forms, action pages, landing pages, micropages, etc.). The key thing, as you mentioned, is shaping behaviors and managing expectations, while making it easier and more likely for your “first dates” to ask for more.

    Getting those permissions or opt-ins is tough, so you really have to stick with it and try different things... certainly for longer than a week or even a month. The more data you have, the easier it is to correlate with real results and actionable insight.

    Great A/B testing primer here!

  2. […] This post was Twitted by WordStream […]

  3. Samir says:

    wow… good job with the Split-Test calculator. This will be super handy

  4. Igor B. says:

    Yomar, Samir, thanks for the feedback!

  5. Kristi Hines says:

    Time comparisons usually don’t work, no matter what you’re testing, especially if you have multiple campaigns driving traffic at the same time. Also, opinions can drive you bananas – I saw a forum post asking “what do you think of my landing page” and it ended up a 50/50 split: some people liked one thing while others hated it. I don’t think that guy ever got good feedback on what to fix, and he ended up leaving it alone.

  6. At one time or another, I think we have all been guilty of the above. I’m especially big on #5. I have found that some people will kill themselves over some details that do not increase conversions. Why?
    Thanks for the great post, Igor.

  7. Henrik says:

    Thanks for the tip about the tool to calculate the best page.

  8. […] 5 Ways to Screw Up Your A/B Test – Test the Yahoo-style homepage (lots of information with a portal feel) against the Google-style homepage (you can do one thing, and one thing only). […]

  9. Igor B. says:

    One more good resource to test statistical significance of your a/b test is http://getdatadriven.com/ab-significance-test

  10. Rahul says:

    “that’s just, like, your opinion, man” Nice Big Lebowski reference:) Also, Igor, thank you very much for the advice. I am new to A/B and multivariate testing. As a biochemist I’m drawn to numbers and data so this is right up my alley. I can’t wait to start testing!

  11. Hi

    just found this post, and very useful in many ways, and will now follow you on this blog

    thanks

  12. Arise.io says:

    One way to succeed at your A/B testing is to register at Arise.io :-).