How A Single A/B Test Increased Conversions by 336% [Case Study]

336%? No way! (Image Source)

In the post below I am going to share a step-by-step case study in which we increased conversions by 336% (that’s not a typo or a stuck key – it’s a legit 336%) for a client who owns and operates a career college. For this client, leads are their lifeblood.

By walking you through each step – from the hypothesis to the strategy, the goals, the setup, the test, the results and the next steps – I hope to give you a clean, easy-to-follow strategy you can start applying to your own site and pages (or your clients’ sites and pages) immediately after reading this.

At the end, I share a neat analytics nugget that showed up as an “accidental result” in this case study.

Let’s get started.

As mentioned above, this post walks you through a case study in which we increased conversions by 336% (can you tell I love that percentage?). The client runs a college whose business is graduating very talented students. But in order to graduate talented students, they need to start by enrolling students – and that’s where I came in.

The Test

I start all initial testing with an “Apples to Oranges” split test.

An “apples to oranges” split test compares two dramatically different versions of a page. At its simplest: if you have a PPC budget, you want to make the most efficient use possible of each click per dollar. If you test apples against oranges and find your visitors prefer apples, you can stop wasting money on keywords that get you clicks for oranges, and start spending your money on tests that help you identify whether the apple lovers like red apples or green apples. Then, if your tests reveal that your visitors like red apples, you can test further, spending your PPC budget more efficiently to find out whether your visitors like a Red Honeycrisp or a Red Jonathan… and so on and so forth.

The A/B (apples to oranges) test I ran for this client can be seen below:

Apples to Oranges A/B Test

The original (Version A) was converting at 3.12%. That conversion rate, in itself, is not great, but it’s also not too bad considering it was less than 2% when we took the site over and immediately applied as many best practices as we could (as band-aids) in order to “stop the bleeding” from the original site.

The Hypothesis

The hypothesis for this test was simple:

Removing the primary navigation and modifying the form’s layout and placement above the fold will increase (improve) form submissions.

The Test Setup

Mobile traffic was high. Mobile traffic has been increasing in general, but for this target group – potential college students – it was really high: 59.84% of traffic during the last 90 days, including this test’s timeframe, was mobile.

  • Both A & B Versions were mobile responsive
    • We noticed with this client that mobile traffic seems to support the middle of the funnel – but the conversions happen (99.9% of the time) from a desktop browser
  • The B Version removed the primary navigation, used a dramatically different theme with the same colors, and had a single image featuring a larger human figure
  • The A Version had a Quality Score of 8, while the B Version had a Quality Score of 10
  • All traffic for this test came from Google AdWords pay-per-click (PPC) marketing.
    • By making the test URLs available only through AdWords, you can ensure no organic traffic comes through, and your referral traffic will be almost non-existent.
  • For Data and Page analytics, we used Google Analytics
    • It’s important to note that on all tests, we set filters to exclude all internal and development traffic in order to get the most accurate traffic and test results possible
  • For rotating the A and B versions of the pages, we used Google Experiments (see the sketch after this list for how a simple rotation works under the hood)
      • Please note: while Google Experiments is a very decent tool, at times it takes too much control of your testing, often stopping a test and declaring a winner automatically. This is why you will see a lower number of visitors than what we recommend (100 minimum) on the test below.
  • The test ran for 23 days
  • In those 23 days it recorded exactly 120 experiment visits – pure PPC traffic
    • This is nice, because some tests can take a LOT more visits to determine a clear winner.
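
Google Experiments handled the rotation for us (and it actually adjusts traffic weights dynamically), but to make the mechanics concrete, here is a minimal, hypothetical Python sketch of the simplest version of the idea: a fixed 50/50 split. The URLs and visitor IDs are made up for illustration; they are not the client’s actual pages:

    import hashlib

    # Hypothetical landing page URLs for the control and the challenger.
    VARIANTS = {
        "A": "https://example-college.com/lp/original",
        "B": "https://example-college.com/lp/challenger",
    }

    def assign_variant(visitor_id: str) -> str:
        """Deterministically bucket a visitor into A or B (50/50 split).

        Hashing a stable visitor ID (e.g. a cookie value) means a
        returning visitor always sees the same variant, which keeps
        the test data clean.
        """
        digest = hashlib.sha256(visitor_id.encode("utf-8")).hexdigest()
        return "A" if int(digest, 16) % 2 == 0 else "B"

    for vid in ("visitor-001", "visitor-002", "visitor-003"):
        variant = assign_variant(vid)
        print(vid, "->", variant, VARIANTS[variant])

A dedicated tool also handles goal tracking, reporting and traffic weighting for you – the point of the sketch is only that the split itself is simple and deterministic.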

The Results

The winner by a landslide was the Challenger – the B Version.

A/B Test Increased Conversion by 336% – Champion Variant
The probability (confidence) that the champion variant would outperform the original was 95.2%, and conversions increased from 3.12% to 13.64%.
A/B Testing Analytics
*Editor’s note: Unbounce recommends sending at least 100 visitors to a landing page; however, due to the tool used, this was not possible.
So with this hypothesis, test and results, we clearly had a winner.
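
Google Experiments computes that confidence number for you, but if you ever want to sanity-check a reported result, a one-sided two-proportion z-test is a common back-of-the-envelope method. The per-variant visitor and conversion counts below are assumptions for illustration only – the experiment reported 120 total visits, not the exact split:

    import math

    def confidence_b_beats_a(conv_a: int, n_a: int,
                             conv_b: int, n_b: int) -> float:
        """One-sided two-proportion z-test: confidence that B's true
        conversion rate exceeds A's. A textbook approximation; real
        tools (especially Bayesian ones) differ in the details."""
        p_a, p_b = conv_a / n_a, conv_b / n_b
        p_pool = (conv_a + conv_b) / (n_a + n_b)
        se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
        z = (p_b - p_a) / se
        return 0.5 * (1 + math.erf(z / math.sqrt(2)))  # normal CDF at z

    # Hypothetical counts: 120 visits split evenly, rates near those reported.
    print(f"{confidence_b_beats_a(2, 60, 8, 60):.1%}")

With counts in that neighborhood, the test lands in the mid-to-high 90s – which is why a lift this large can reach “significance” on very little traffic (a point several commenters below rightly push on).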

Additional Learning & Results

But wait, there’s more!

While having a winner and validating a hypothesis is rewarding, the real magic sometimes happens when you dive deeper into the metrics and analytics “around the perimeter”.

In this case the “bonus” findings were:

  • Bounce rate improved from 32.50% to 28.70% – pretty cool, but not earth-shattering
  • 88% of the clicks were below the fold. Now that is worth a look, right? Especially since we didn’t have any navigation on this page (other than social links opening in a new tab/window)
A/B Testing Accidental Results

Visitors were scrolling ALL the way to the bottom and clicking through the testimonials! These are very likely middle-of-the-funnel people looking for that coveted social proof: “What are other people saying?” “What are other people doing?” “What are other people buying?” (Psst – don’t know what the middle of the funnel (MOF) is? Read this post.)

Now this is great data to work with, because if 88% of our visitors are clicking around at the bottom, the next test would place a CTA at the bottom of the page, near the scrolling button.

*Please note: While this test was a success and we learned a ton, the information in this case study is very fresh – only a few weeks old. We are currently in the process of rolling out these changes.

Your Turn

The reason I wanted to share this case study so soon after getting the results was that it’s a great example of how to:

  1. Create a very simple hypothesis.
    1. Remember, every hypothesis should be geared to IMPROVE a situation. So when you write your hypothesis, literally spell out what the test is trying to improve: click-throughs, forms filled out, phone numbers called, etc.
  2. Start A/B split testing by comparing two very different versions (“apples to oranges”).
    1. If you still aren’t sure about this – contact me in the comments below.
  3. Use a tool to serve and rotate the Control and Challenger pages.
  4. Review the results against the hypothesis
    1. This will be an easy review – did your B Version (challenger) page improve, weaken or leave unchanged the baseline results you had before you started testing?
  5. Dive deeper into the analytics to ask “what else can I learn about this test?”
    1. Get a cup of coffee and literally wade through all reports and metrics for the test and the timeframe the test ran.
    2. Make notes and observations – these will help you build your NEXT round of testing.
  6. Use the Champion page as the new Control page and create a new Challenger page.
  7. Now get on with juicing more conversions out of your pages, or the pages of your clients.

I want to quickly thank two people I couldn’t have done this case study without: a big shout-out to my associate at COX Media, Alfonso Montemayor, and to developer Kyle Sanders.

Keep testing and never hesitate to reach out via email or the comments below – I reply to everything.

– Dustin Sparks

About The Author


Dustin Sparks is a landing page and A/B conversion optimization expert. He provides his clients with optimized landing page strategies for improving conversion rates. His comprehensive landing page optimization process includes advanced A/B testing to determine which landing page tactics will convert best for your specific campaign goals.

Comments

  1. Myrtha says:

    Question: I’m new at landing pages. I thought a landing page should let people do one thing and one thing only, but in Version B I notice a menu of links in the lower right corner.
    b) I learned that Unbounce has limited stats measurement, but we can use Google Analytics to measure. Can you send me a link to a page explaining how to do this? Thanks, Myrtha

    • Hi Myrtha – A landing page can do as many or as few “things” as you want :) If someone told you it should only do one thing, I am not sure I totally agree. As for “Unbounce has limited stats measurement but we can use Google Analytics to measure” – I am not sure how to reply. If the reporting in Unbounce is not adequate for your tracking needs, you can certainly check out other options, but if you are asking how to integrate Unbounce with GA, have a look at this: http://support.unbounce.com/entries/307637-Integrating-with-Google-Analytics-VIDEO-

      Hope that helps ya.

  2. Ajaz says:

    Wonderful case study and some very helpful tips to gather from this post, thanks Dustin. The only thing I have in mind is: don’t you think 120 visits isn’t sufficient to decide a clear winner? Just curious!

    • Hi Ajaz – GREAT question!

      I FULLY recommend sending at LEAST 100 visitors to each page variant – and to my point in the post about not fully endorsing Google Content Experiments because it takes too much control – this test was simply “stopped” by Google and a winner declared.

      While that is handy, it would be even better if they let the test/experiment keep running to accumulate more traffic.

      So on this experiment – we had to go with what they gave us as we couldn’t “restart” it.

      Again – good spot. Thanks for reading and asking.

      • Ajaz says:

        Google Content Experiments is good, but it isn’t the best tool out there for A/B testing – but heck, it’s FREE. Personally, I’d send many more visitors before announcing a winner.

        Thanks Dustin for responding.

        • Nico says:

          As mentioned above, I really can’t agree with these numbers.
          In my personal experience you need at least 100 conversions, not visitors.
          Of course that will vary depending on your industry, current conversion rate and so on, but 100 visits per variation won’t give you accurate data.
          I predict that in a few months you’ll come to the conclusion that the actual conversion rate is quite different.
          I know I’m playing devil’s advocate here, but I had to share.

  3. Pepe says:

    Amazing! I’m an affiliate and this info inspired me a lot on how to test and why to test variations for my campaigns.
    Congrats Dustin!

  4. It looks like you didn’t give these results a chance to normalize. The sample size is just way too small to draw any clear conclusions. Furthermore, there are some MAJOR changes between the control and the variation, which in its own right isn’t terrible, but it also doesn’t give much insight into what works vs. what doesn’t.

    At the very least, you should aim to get at least 100 conversions per variation before you call a winner (this is merely a heuristic, you should really use a forecasting tool).

    The reason this test had “statistically significant” results is that the lift was just so big. A lift like that will report conclusive results because a big lift requires less traffic. This is an issue for people who rely too heavily on their testing tool. Your tool will call a test, but as marketers we need to be smarter than our tools.

    • Hi Justin – you and Ajaz (comment above) have the same observation, and I agree with you. “At least 100 conversions per variation” is a very standard minimum – in the future, I’ll post only tests that meet that standard. Thanks for the feedback and the comment – I appreciate it.
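
      For anyone curious what the kind of forecasting Justin mentions looks like under the hood, here is a minimal, hypothetical Python sketch of the standard sample-size formula for comparing two proportions. The baseline rate and target lift are assumptions for illustration only:

          import math

          def visitors_per_variant(p_baseline: float, relative_lift: float,
                                   z_alpha: float = 1.96, z_power: float = 0.84) -> int:
              """Rough visitors needed per variant for a two-proportion test.

              Defaults correspond to 95% confidence (two-sided) and 80% power.
              This is the classic textbook formula; dedicated calculators
              give similar numbers.
              """
              p_b = p_baseline * (1 + relative_lift)
              variance = p_baseline * (1 - p_baseline) + p_b * (1 - p_b)
              effect = (p_b - p_baseline) ** 2
              return math.ceil((z_alpha + z_power) ** 2 * variance / effect)

          # Hypothetical: 3.12% baseline, aiming to detect a 50% relative lift.
          print(visitors_per_variant(0.0312, 0.50))  # ~2,400 visitors per variant

      In other words, detecting a modest lift at a roughly 3% baseline rate takes thousands of visitors per variant – which is exactly why a 120-visit test only “called” a winner because the lift happened to be enormous.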

  5. Totally agree with Justin’s comments above. While the time span was a proper length, the number of conversions and the number of impressions were just way too low to make a confident call. While the math might show 95% confidence, the impression/conversion thresholds are just way too low.

    That said, I do think Version B is a better experience, and it could very well pan out to be a big winner, but quoting a 336% gain based on these #’s might not be appropriate.

    • Hey Mike – to your point, and Justin’s and Ajaz’s, I’ll be more disciplined and only post results with at least 100 conversions per variation. I completely agree – and in hindsight, if I were reading this, I’d have the same feedback as you three.

      Thank you for the feedback.

  6. Steve says:

    Would be interesting to see what the conversion to actual enrolments was. Did that go up too?

    Anyway, still a good example of the difference that changes in the design can make to response.

  7. Lavern says:

    Dustin, thanks for this inspiring post. Do you have a tool to recommend that will let me do 100 conversions per page? Or should I manually do this?

    • Hey Lavern – I try to keep up on as many tools as possible, but recommending one over another in a public post might bring me some “heat” – connect with me on LinkedIn or Twitter and message me.

  8. Dustin, thanks for a great article. I have one question which I would really like to ask you.

    Which type of conversion to registered users is better – natural (they filled out the form themselves) or Facebook (they clicked the F-connect button)?

    In the first case (natural) we have less info, but it really takes a user a while to fill out the form. In the second (Facebook registration) we have much more data on the user, but it took them just one click to register, so they might be less interested.

    • Hi Denis – thanks for the comment, but I am not sure I fully understand the question “which type of conversion to registered users is better”.

      For this client a conversion was a completed form. Phone calls were tracked (and spiked – especially from mobile traffic) but this data/case study was tracking FORM conversions.

      Does that help?

  9. GMO says:

    Great example, thanks.

  10. Momoko Price says:

    Very cool! Love this experiment :)

    Wanted to just double-check something: I’m pretty sure your description of the differences between A and B was kept short for brevity’s sake, but from what I can tell there were modifications to “high real-estate” copy elements as well, yes? Like headings, form copy, right-column elements, etc.?

    Might be a good idea to mention that there were significant changes to the content between A & B as well – otherwise people might make the (common) mistake of thinking that you don’t need to think about your messaging/copy, you just test design/layout.

    Just a friendly suggestion from a web-copy nerd :)

    In any case, awesome write-up, v. informative!

    M

    • Hey Momoko – good observation, and that would warrant another follow-up post (or two), as it is too much to cover while staying on topic with a case study “show and tell”. I completely agree it’s important to note the differences, but I simply couldn’t get into each element in this post :(

      PS – “copy-cat.co” – GREAT domain name :)

  11. Julya says:

    Hi,
    Wanted to know how I can set up the form in Unbounce to appear in two columns, like on your winning LP?