
Double Your Win Rate with Hiten Shah’s Data-Based A/B Testing Process

A/B testing is like sex in high school – everyone talks about it but no one is doing it. Image source.

It’s been said that A/B testing is like sex in high school – everyone talks about it, but no one is doing it.

Excuses range from lack of resources to not having enough traffic, but a lot of procrastination comes down to simply not knowing where to start. As inspiring as they are, run-of-the-mill blog posts like The Top 10 Things You Need to Be Testing NOW (#9 Will Shock You!) don’t exactly convert testing virgins into A/B testing addicts.

In our most recent Unwebinar, Hiten Shah of Crazy Egg and KISSmetrics explained that simply copying A/B testing ideas from others will get you nowhere.

Instead, you need a deliberate process in which you’re testing on a tight schedule, consistently improving conversions and continuously learning from your tests.

Hiten shared a four-step process that will not only help you formulate smarter hypotheses, but develop an intuition for future tests so you can generate bigger wins with less effort.

Sound good to you?

You can watch the full webinar recording here – or you can read on for an overview of his four-step process.

1. Find pages to test

Before you formulate your first hypothesis – before you even start thinking about your first A/B test – stop. Take a step back and identify which of your pages has the greatest potential for improvement.

To do this, Hiten recommended diving into your analytics and identifying which of your landing pages have the highest volume of traffic but a low conversion rate.

Your testing “sweet spot” is represented in the upper left portion of this graph – pages where you have low conversions but high traffic. Image source.
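To make the "sweet spot" concrete, here's a minimal sketch of how you might rank pages after exporting traffic and conversion numbers from your analytics tool. The URLs and figures are purely illustrative:

```python
# Hypothetical analytics export: visitors and conversions per landing page.
pages = [
    {"url": "/pricing",  "visitors": 12000, "conversions": 180},
    {"url": "/features", "visitors": 8500,  "conversions": 510},
    {"url": "/blog",     "visitors": 30000, "conversions": 150},
]

for page in pages:
    page["conv_rate"] = page["conversions"] / page["visitors"]

# The testing "sweet spot": high traffic, low conversion rate.
# Sort by traffic descending, then by conversion rate ascending.
candidates = sorted(pages, key=lambda p: (-p["visitors"], p["conv_rate"]))

for p in candidates:
    print(f'{p["url"]}: {p["visitors"]} visitors, {p["conv_rate"]:.1%} conversion rate')
```

The top of the sorted list is where a test has the most room to move the needle.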

Don’t have enough traffic?

All of that sounds pretty straightforward, but what if you don’t have very much traffic?

A small sample size doesn’t reveal much actionable insight. And that means you can’t get to work at improving your conversion rate, right?

Not quite.

Hiten explained that running A/B tests and watching for the highest number of conversions (quantitative data) isn’t the only way to increase your conversion rate. Collecting qualitative data can also help you move the needle.

Here’s what Hiten recommended:

  1. Engaging in qualitative research such as user testing. Services such as Peek allow you to get a free five-minute video of a person using your landing page. Watching an unbiased person engage with your landing page is an easy way to understand where the issues are and give you an idea of where people are getting confused.
  2. Or… find ways to get more traffic. This probably isn’t what you wanted to hear, but Hiten explained that finding ways to get more traffic is an important part of conversion rate optimization.
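As a back-of-envelope way to judge whether "find more traffic" applies to you (this check is my addition, not from the webinar), Lehr's rule of thumb estimates the visitors needed per variant for roughly 80% power at a 5% significance level:

```python
# Lehr's rule of thumb: about 16 * p * (1 - p) / delta^2 visitors per
# variant, where p is the baseline conversion rate and delta is the
# absolute lift you want to detect. A rough planning estimate only.
def visitors_needed_per_variant(baseline_rate, absolute_lift):
    p = baseline_rate
    return int(16 * p * (1 - p) / absolute_lift ** 2)

# e.g. a 5% baseline conversion rate, hoping to detect a 1-point lift (5% -> 6%)
print(visitors_needed_per_variant(0.05, 0.01))  # 7600 visitors per variant
```

If that number dwarfs your monthly traffic, qualitative research is the better use of your time.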

2. Create a hypothesis

You shouldn’t just pull your hypothesis out of thin air.

At their core, successful A/B tests have one thing in common: a solid hypothesis derived from hours of research.

If that sounds daunting, don’t fret. Hiten shared a pretty straightforward formula you can use to structure your next A/B testing hypothesis:

When in doubt, use this formula to help write out your hypothesis: “If [variable], then [result] due to [rationale].” Image source.

But how exactly do you flesh out the formula above? Hiten explained that taking the time to conduct preliminary research will help you formulate stronger hypotheses that generate bigger wins.

Here are some research methods he recommended:

  • Ask prospects what’s preventing them from converting with services like Qualaroo
  • Ask customers what persuaded them to purchase with surveys
  • Use heat-mapping software such as CrazyEgg to get a visual representation of where people are clicking (without any impact on the user experience)

Collecting insights using those research methods will help you get a feel for what’s working and what’s not. From there, you can identify what needs to change and get to work on your first hypothesis.

Determine which test you should prioritize

On the webinar, Hiten shared a straightforward process for determining which tests you should prioritize. Coined by CRO expert Chris Goward, the PIE Framework helps you rate your hypotheses based on three criteria:

  1. Potential: How much improvement can be made?
  2. Importance: Are you sending PPC traffic to that landing page? Is it high-traffic?
  3. Ease: How easy will implementing the test be?

For each of the criteria, rank your hypothesis a score of 1-10. In the end, Hiten explained, you’re left with an objective number ranking that will help you determine what you should test first.
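The PIE scoring above is simple enough to sketch in a few lines. The hypotheses and scores below are made up for illustration; the averaging follows the framework's "rank each criterion 1-10" rule:

```python
# Score a hypothesis on the PIE framework's three criteria (each 1-10)
# and average them into a single priority number.
def pie_score(potential, importance, ease):
    for score in (potential, importance, ease):
        assert 1 <= score <= 10, "each criterion is scored 1-10"
    return round((potential + importance + ease) / 3, 1)

hypotheses = {
    "Shorten the signup form":    pie_score(8, 9, 6),  # illustrative scores
    "Rewrite the headline":       pie_score(6, 7, 9),
    "Redesign the pricing table": pie_score(9, 8, 3),
}

# Test the highest-scoring hypothesis first.
for name, score in sorted(hypotheses.items(), key=lambda kv: -kv[1]):
    print(f"{score:>4}  {name}")
```

A big idea that's hard to implement (like the pricing-table redesign above) can end up ranked below a modest, easy win.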

3. Start an experiment

Once you’ve done all the prep work, it’s time to don your mad scientist hat and blow some stuff up (ideally, your conversion rate).

But before you get over-excited and declare any one variation a winner, you need to…

Wait for statistical significance

You knew this was coming. If you’ve read any articles about A/B testing, you understand that statistical significance is important. And if you’ve heard that anything between 95% and 99% is acceptable, get this: the data scientists at KISSmetrics don’t settle for anything under 99%.

As Hiten explained, successful A/B tests require diligence and patience. If a test has good improvement early on, let it run as long as it needs to.
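For the curious, here's one common way (not necessarily KISSmetrics' exact method) to compute that significance figure: a two-proportion z-test, using only Python's standard library. The visitor and conversion counts are hypothetical:

```python
import math

def significance(conv_a, n_a, conv_b, n_b):
    """Two-sided confidence that variants A and B truly differ,
    via a pooled two-proportion z-test."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the normal CDF (via the error function),
    # then confidence = 1 - p.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return 1 - p_value

# 500 vs. 570 conversions out of 10,000 visitors each:
conf = significance(500, 10000, 570, 10000)
print(f"{conf:.1%}")
```

In this made-up example the result lands around 97%, which clears the common 95% bar but not the 99% bar Hiten's team holds out for: the test would need to keep running.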

But be willing to cut your losses

If you have tests lined up and you don’t think your test will reach statistical significance, Hiten recommended cutting your losses:

If I see a test that’s having a marginal improvement and I don’t think it’ll hit significance, I’ll turn off the test and go back to the control.

In other words, if a variation just isn’t promising, pull the plug. Return to the control and move on to the next test.

4. Learn from the data

You’ve done your research, you’ve formulated a detailed hypothesis and it paid off. Your winning variation killed.

Nice one! Pat yourself on the back… and then get back to work. As Hiten explained, your job isn’t done.

Write a post-mortem

For every A/B test you run, Hiten recommended writing a post-mortem.

Document everything – your hypothesis, all the data and a description of whether or not you saw positive results. Include your best guess as to why it worked (or didn’t).

For Hiten, having a summary of whether tests lost or won has multiple benefits:

  • You’ll have a report of findings that you can share with your team so they can learn from your experiments too.
  • Over time, you’ll have compiled a comprehensive catalogue of tests to refer to so you don’t repeat your A/B testing mistakes.
  • The mental exercise will stimulate your critical thinking and help you improve your natural intuition about whether future tests will do well.
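Even a lightweight, structured record works for this. Here's one hypothetical shape a post-mortem entry could take, capturing the fields Hiten recommends: the hypothesis, the data, the outcome, and your best guess why:

```python
from dataclasses import dataclass

# One entry in a running test log; field names are illustrative.
@dataclass
class TestPostMortem:
    hypothesis: str
    control_conv_rate: float
    variant_conv_rate: float
    significance: float
    won: bool
    best_guess_why: str

log = [
    TestPostMortem(
        hypothesis="If we shorten the form, signups rise due to less friction",
        control_conv_rate=0.050,
        variant_conv_rate=0.062,
        significance=0.991,
        won=True,
        best_guess_why="Fewer fields reduced perceived effort",
    ),
]

# A catalogue the whole team can learn from:
for t in log:
    print(("WIN " if t.won else "LOSS"), t.hypothesis)
```

The exact format matters less than keeping it consistent, so past tests stay searchable and comparable.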

This last benefit – improving your natural intuition – touches on a really important point that Hiten made about conversion rate optimization.

While looking at your A/B testing data is essential, Hiten explained that drawing from past experiences and trusting your intuition are just as important. CRO is a science, but it’s also an art.

Rinse, lather, repeat

The process that Hiten shared will help make your A/B testing more systematic.

If you haven’t yet started running A/B tests, you no longer have an excuse. The four-step process will help get you started – and it’ll motivate you to keep lining up your next great A/B test. In the words of Hiten:

If you’re not testing, you’re not learning. Always have a test running – no matter how small.

Before you know it, you’ll be running smarter tests that bring you bigger wins with less effort.

Over to you – do you have a structured A/B testing process? How does it differ from Hiten’s?

— Amanda Durepos



About Amanda Durepos
Amanda Durepos is Unbounce’s Blog Editor and an aspiring dog owner. Former gallery director and freelance blogger, she has a love for curating great content. Find her on Twitter: @amandadurepos
  • Hi Amanda,

    Awesome post. I have read a lot about A/B testing. Everyone is saying why you should be doing it on your website and what results you will get. They also give various examples to prove everything.

And your post clearly explains how to do it rather than why to do it. You have covered almost every step. Especially the first step “Find pages to test”!

    I am going to share it.

    Regards,

    • Amanda Durepos

      Thanks for your kind words Ashish. Everyone knows they should be doing it – getting started is the tricky part.

  • Amanda,

In our experience (2+ years of conversion rate optimization), most inexperienced CROs skip data analysis (qual/quant) as well as Hiten’s step #2 (create a hypothesis) entirely and jump straight to “let’s try changing the button color” because it’s easy and exciting.

    We’d be happy to share our CRO strategy creation template as well if you think it’d be valuable for your readers. Give me a shout if you want to talk about it Amanda!

    – Mike

    • Amanda Durepos

      Definitely interested in seeing that! I’ll shoot you an email. :) Thanks, Mike.

  • Thanks for pointing out Peek. As for building hypotheses: this is something little taught at universities or anywhere nowadays. I often hear people coming up with the craziest suggestions. In science, and A/B testing is science, formulating a hypothesis alone does not immediately tell me how to test it. Often tests are ambiguous as they may test something else instead but we never notice it. Too lengthy to get into here in detail: but whoever has a large number of items and pages to test should grab a book on the “Design of Experiments” and “Method of steepest ascent”.

    • Amanda Durepos

      Thank you for the recommendations Darragh. Will definitely have to check those out.

  • Great post, Amanda. I’d agree that the hardest part is always beginning. When we first started web development, it was tough getting our team to conduct simple A/B tests. But after sorting through pages from our analytics, we figured out which ones needed more attention. Although the websites are done, A/B tests have allowed us to continuously improve for our users AND search engines.