12 Surprising A/B Test Results to Stop You Making Assumptions

By James Gardner, September 19th, 2012 in A/B Testing | 62 comments

[The 12 tests below have been used with permission from WhichTestWon]

In my last article, critiquing landing pages for conversion, I finished with a page that worked, but that broke all the rules. You might infer from this that the rules don’t matter, but that’s not the case.

Sometimes the rules don’t matter, but only when you’ve tested a page to make sure. You can bet your bottom dollar that Match.com (because that was the site in question) ran A/B and multivariate tests on that page to ensure they optimized the conversion experience.

In this article we’re going to look at some examples of A/B testing and delve into the reasons why some worked and some didn’t.

We’ll also look at how the counter-intuitive approach is sometimes the one that works. So on we go: some really surprising A/B test results, and some that might make perfect sense to you. In the end, this is why we test, right? What seems obvious to you might not be obvious to your visitors.

All of the tests are taken from Which Test Won?, an online repository of testing results across many different criteria and communications types (including landing pages, email and banner ads).
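
Before we get to the tests themselves, a quick word on what “worked” means. A lift is only meaningful if enough visitors saw each version for the difference to be statistically significant. Here’s a minimal sketch of the standard two-proportion z-test you might run on raw results; the visitor and conversion counts below are hypothetical, since Which Test Won? publishes percentage lifts rather than raw traffic data.

```python
# A minimal sketch: a two-proportion z-test for deciding whether an
# observed lift is statistically meaningful. The counts are hypothetical.
from math import sqrt, erf

def ab_test_result(conv_a, n_a, conv_b, n_b):
    """Return B's relative lift over A and a two-sided p-value."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Normal CDF expressed via erf; two-sided tail probability.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    lift = (p_b - p_a) / p_a
    return lift, p_value

# Hypothetical: 5,000 visitors per variant, 8% vs. ~11% conversion.
lift, p = ab_test_result(conv_a=400, n_a=5000, conv_b=552, n_b=5000)
print(f"lift: {lift:.1%}, p-value: {p:.2g}")  # lift: 38.0%, p-value: 2.2e-07
```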

Test 1: Which Copy Increased Trial Sign-Ups?

In this test, Version B increased sign-ups by 38% – a big rise. However, your gut feeling might initially be that Version A is the better design. It’s got a clear, bold headline and a short piece of supporting copy.

Why does Version B work?

Simply because the copy blocking is better. The headline is shorter, and the sub-heading is designed to pick out some key features in bold. It’s not as pretty, but the information transfer to the user is more efficient because of the emphasis within the copy. The lesson here is that a clean design doesn’t necessarily mean an effective one. As we’ll see again in test #10, landing pages don’t have to be pretty to be effective.

Test 2: Which Landing Page Got 24% More Lead Generation Form Submissions?

Surprisingly, Version A was the page that got the 24% increase in submissions, simply by removing the image from the page. Images can be very effective at communicating information and setting tone, but in this case the image hurt the landing page’s effectiveness in two ways: 1) it pushes the form down the page, limiting the form’s impact and drawing attention away from it; 2) it is distinctly ‘stock art’ in flavor. Be careful when selecting images – they can lessen impact if they are overly corporate or, as in this case, simply bland.

This is a great example of why you should confirm your assumptions with quantitative testing.

Test 3: Which Radically Redesigned Form Increased B2B Leads By 368.5%?

There’s an obvious winner to this test, but it’s not just the obvious elements that make the difference. You’ll look at the bright red button and the additional copy, even the images, but for me it’s the form itself that has an impact on the conversion rate.

There’s a massive difference in the amount of data you have to hand over; Version A keeps things really tight and uses grouping to visually shrink the form – look at the number of control labels. The overall result is that the user feels it is less of a chore to complete the form over version B.

When designing your landing page, don’t overestimate your users’ tolerance, goodwill, and patience.

Test 4: Does Matching Headline & Body Copy to Your Initial Ad Copy Really Matter?

Here’s a great example of why you should test, and not one that’s immediately obvious. In fact, when asked to judge which one was more effective, 100% of respondents (at the time of writing) got this one wrong, including me. Version B looks as if it should be better – the headline copy is snappier, the sub-head clearer – but in tests Version A increased leads by 115%.

Why? Simply because the copy on Version A was designed to tie in with and complement the PPC ads that drive users to the page. There’s a lesson here, even for the experienced landing page builder: the sales funnel consists of many elements, and making them work together is paramount to increasing its efficacy.

Test 5: Which Landing Page Increased Mortgage Application Leads by 64%?

It’s easy to get caught up in trends – especially where video is concerned. Sometimes quite rightly: video can be a very effective tool for communicating lots of information in a compact form. But the presence of video couldn’t save Version B from the rubbish bin; Version A increased leads by 64%.

In this example, it’s the way the elements have been laid out that makes for an ineffective page.

The video is consigned to the bottom of the page, where it can have very little effect. Also, the CTA is at the top of the page, giving it great visibility, but it’s placed before the user has any context – you can’t put the form before the explanatory copy, because users won’t know what they’re being asked to sign up for (and no, ‘Speak to an Aussie Broker first!’ doesn’t count as explanatory copy).

Test 6: Does Adding Testimonials to Your Homepage Increase Sales?

Two pages, very similar in design, but with one difference: Version B has some testimonials included below the fold. It would seem that this would have very little effect, but in practice this small change increased sales by 34% – a big margin.

Why is this? Having ‘social proof’, even in this basic form, humanizes the conversion experience, engendering trust and allowing the user to identify with other consumers. Don’t underestimate the power of this simple tool. That’s not the whole story, though. Interestingly, this test was run again, this time with the testimonials at the very bottom of the page. The result was no real impact on sales. So it still matters where you place your testimonials; they must be in plain sight to be effective.

Test 7: Does Social Proof Increase Conversions? ‘Big Brand Prices’ vs. Consumer Ratings

‘Social proof’ is a powerful tool that can have a demonstrable effect on conversion outcomes. LET’S GO CRAZY! LET’S SOCIALLY CONNECT EVERYTHING! Except no, let’s not. Remember when we talked about confirming assumptions to ensure we get the most optimized approach? Here’s an example of ‘social proof’ that, if implemented, would have cost the company hundreds of thousands of dollars a year.

The difference between the two pages is slight: in Version B, the ‘Big Brand Price Check’ had an additional rating attached, showing the community ratings for each of the suppliers. Yet this small change had a massive impact, surprising the team who designed the page. The reason? By adding an additional element to the ‘Big Brand’ area, they actually confused customers instead of empowering them. During the conversion process in this industry, consumers simply want to know whether one insurer is cheaper than another.

Asked to process an additional piece of information – customer satisfaction – consumers backed away from the conversion because they were unable to make a straight comparison. This is an example of how important it is to understand your consumers and their expectations of the conversion process.

In this case, simple is better.

Test 8: Does an Email Security Seal Help or Hinder Lead Generation Form Completions?

Here’s a really simple example, which beautifully illustrates how our assumptions can be very wrong. Just like example #4, the majority of people got this wrong when asked to judge which was more effective. It’s easy to see why: surely adding an eTrust image would improve form completions – it makes everyone feel safe and secure, doesn’t it?

The answer is no. Although it’s only a modest rise of 12.6%, Version B, without the image, proved to be more effective. Images of this type are usually associated with payments (especially credit cards and the like), so users were put off by seeing one in this context, assuming they were about to pay for something. That might seem strange to the more web-savvy amongst us, but don’t assume you know your user, and don’t underestimate their naivety online – we live and breathe the web; they might not.

Test 9: Which Final Page in a 4 Step Sequence Got 439% More Leads from PPC Traffic?

Two nicely designed pages here, both from the originally California-based California Closets (the clue’s in the name). I’ll bet you went for Version B. Yes? No? It’s got a video centrepiece and well-structured copy; it should convert. Well, if you did, you’d be wrong: Version A produced a staggering 439% uplift in leads.

If you’ve got two pages that are well-designed and both, seemingly, doing a good job, it’s easy to take your eye off the ball. Don’t. A continuous A/B testing regime, in which tweaks and redesigns are checked on a regular basis, can have a big impact. Just because a page is up and running doesn’t mean the work has been completed.
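
As an aside, a continuous testing regime only works if each visitor reliably sees the same variant on every visit. Here’s a minimal sketch of one common way to achieve that – deterministic, hash-based bucketing. The function and the experiment name are illustrative assumptions, not anything taken from the California Closets test.

```python
# A minimal sketch of deterministic variant assignment for an ongoing
# testing regime: hashing the visitor ID and experiment name together
# means each visitor always sees the same variant, and a redesign can be
# tested later simply by starting a new experiment. Names are illustrative.
import hashlib

def assign_variant(visitor_id: str, experiment: str, variants=("A", "B")) -> str:
    digest = hashlib.sha256(f"{experiment}:{visitor_id}".encode()).hexdigest()
    return variants[int(digest, 16) % len(variants)]

# The same visitor gets the same page on every visit:
print(assign_variant("visitor-42", "landing-page-redesign"))  # stable "A" or "B"
```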

If I were to make a call on why Version A worked better, I would point to three elements:

  1. The placement of the text within the image brings the message into focus more quickly and increases impact
  2. The form is that little bit shorter, both in the number of fields and the way it is displayed visually
  3. The addition of the two images gives the page a more authentic and trustworthy feel

Test 10: Which Page Got an 89.8% Lift in Free Account Sign Ups?

Although this one caught a few people out when asked, I think it’s much clearer. A good landing page communicates information quickly and efficiently. It uses good copy and good typography to achieve this. Version B does this much better than Version A: it has three bullet points, each reinforced with a tick, as opposed to words in speech bubbles.

The removal of the tabbed navigation also helps Version B, removing a distraction from the page and reducing the risk of users navigating away from the page during conversion.

Test 11: A/B Button Color Test: Which Page Drove More Clicks to a Lead Generation Form?

Ah! The simple button color test. As we’ve seen, just a single element on the page can have a big effect on conversion rates. None more so than the button you use to submit the form you’ve just filled out. Here’s an example of how not to style your button (in this case your ‘Get it now’ button).

Although green is an affirming color that signifies positive action, in this case it’s been used with white text, which completely washes the button out. It’s hard to know what the button is for at first glance. Version B’s yellow and black button may be ugly (and I mean ugly), but it is clear, and it led to a 14.5% increase in conversions.

Test 12: Which PPC Landing Page Got 98% More Trial Downloads?

For our last example, here’s one that shouldn’t be a surprise to all you seasoned landing page A/B testing experts. A/B testing can help you to optimize your landing page. It can help you squeeze that extra 5% out of your current page to really make it the best it can be. It can also be used to simply make sure you’ve done the basics right. A/B testing can give you the insight to develop your page with confidence.

In Version B we can see a page that does the basics really well – it’s simple and clear of distractions (no extraneous navigation), has good headline copy and a supporting copy block detailing the benefits and features, and offers further supporting information (reviews) for those who want to read more. The result was a 98% increase in trial downloads over its rival.

So, sometimes our assumptions are right, sometimes they are wrong, but two things ring true: First, make sure you get the basics right before you start testing, and second, always be testing, because unless you test, you can never be absolutely sure.
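
‘Always be testing’ also means knowing how long a test has to run before you can trust it. As a rough, hedged sketch, here’s the standard sample-size arithmetic for a two-variant test at 95% confidence and 80% power; the baseline conversion rate and target lift are assumptions chosen purely for illustration.

```python
# A rough sketch of the standard sample-size calculation for a two-variant
# test (95% confidence, 80% power). The 5% baseline conversion rate and
# 20% target relative lift below are assumptions for illustration only.
from math import ceil, sqrt

def visitors_per_variant(baseline, relative_lift, z_alpha=1.96, z_beta=0.84):
    p1 = baseline
    p2 = baseline * (1 + relative_lift)
    p_bar = (p1 + p2) / 2
    n = ((z_alpha * sqrt(2 * p_bar * (1 - p_bar))
          + z_beta * sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
         / (p2 - p1) ** 2)
    return ceil(n)

print(visitors_per_variant(0.05, 0.20))  # ~8,149 visitors per variant
```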


Have you ever been shocked by the results of an A/B test? Share your experiences with us in the comments, and remember, when you assume, you make an ass out of you and me.

– James Gardner


This is a guest post, all opinions are those of the author.

James Gardner is a digital technology strategist. Now working in the pharmaceutical industry, he previously worked at Volume, one of the largest independent B2B digital marketing agencies in the UK.

Comments

  1. Jean says:

    Awesome! Thanks for sharing these insights… Test #8 has been the biggest surprise for me… Note to self: take off security or privacy seals when in doubt!

  2. Chris says:

    “Surprisingly, Version B was the page that got the 24% increase in submissions, simply by removing the image from the page.”

    Version B is the one with the picture…

  3. Always interesting to see. You can learn a lot from this.

    For Test #9 I feel the major element for the higher conversion rate (Version A) is the combination of the much better copy in the headline and the photo – you can easily picture yourself having such a neat-looking closet.

    • I agree, the messaging is much clearer, and its placement gives it real impact.

      And yes, it would be good to have a closet like that, mine’s nowhere near as tidy!

      Cheers,
      James

  4. cfrost says:

    Good article as always buuuuuuuut………..

    Clicking the images (or the magnifying glass) doesn’t bring me a larger version and clicking image source makes me want to sign up for some crap. Gross!

  5. Gagan says:

    Great article! Very useful for a visual designer like me to understand the basics of designing something.

  6. RB says:

    Great article but would have been good to see larger images. It’s hard to test yourself when you can barely see the image. :( Thanks for the amazing insights though, loved it.

  7. Craig says:

    With regards to number 9, I think the video vs image might have an effect too. The image in A clearly shows the proposition and does a good job of making the customer see what they get from the product – a beautiful walk-in closet.

    The video, in particular the snapshot shown, is far less effective in communicating that. There is a little girl painting. What has that got to do with wardrobes? The message loses its consistency there at the end.

    It seems to me that videos work better early in the sales funnel to communicate the entire idea and give an overview.
    By the end of a four part cycle I think it’s too late for video and the strong image works better.

    Furthermore I think you always have to assume (I know you said not to do that!) that the prospect won’t watch the video. So it can’t contain the most important sales messages.

    • Spot on! The snapshot for the video is not optimal for this purpose. When selecting a video service it’s important to ensure that you have control over which snapshot is shown.

  8. This is interesting, but you haven’t said anything at all about where you got the information about *what* makes one version perform better than the other. Is this just your supposition or is there data to back it up?

    • Jessica,

      The site Which Test Won? has details on the results of each test – so you can see quantitatively what happened. They also provide information on what was changed (although it’s possible to pick this up from the A and B version images). The rest is down to the application of best practice and experience. I hope that helps.

      Thanks for your comments,
      James

  9. jery says:

    Can’t see the larger images, unless I become a premium member. As it is, the images are too small to see the differences.

  10. I’d agree with Jessica, and ask where you are getting the information from on ‘why’ one page is better than another?

    If the post is about ‘be careful of your assumptions’, without evidence, aren’t you just making more assumptions?

    • Andrew,

      Please see my answer to Jessica’s comment above. Hope that helps!

      Cheers,
      James

      • Thanks James for taking the time to respond.

        I still question the ‘why’ of some of these results.

        Yes, Which Test Won provides quantifiable measures of change, but the explanations of ‘why’ one page is better than another often appear – IMHO – to be guesswork.

        Indeed, many of the explanations (in the blog post above, elsewhere on other CRO sites across the web, and in meetings I attend!) are similar to what I think psychologists label as ‘post-rationalisation’.

        Or, in other words, we overlay a rational argument after the event when, in fact, we don’t really have a solid explanation (especially when the subconscious is mainly in charge).

        I suppose, unless you accept that you can never have ‘perfect knowledge’, then we just keep moving forward whilst accepting that, to a greater or lesser extent, we are acting on assumptions.

        Thanks :)

  11. Hi James, I did an A/B test for a sign-up page where the results completely surprised me. You can read about it here: http://www.smileycat.com/miaow/archives/001704.php

  12. Hi James,

    This is a brilliant article, but there is a key point that has been missed in these tests. The landing page is your first point of contact, not the full story. What fails here may go on to win big in the sales funnel. What needs to be measured is the impact on the consumer over the entire sales funnel and what led to those changes. For example, though video loses out in many tests, when it comes to recall, video often triumphs.

    • Thanks Kishore, you raise a very good point – that of direct conversion versus message recall. In this article we’re concentrating on direct conversion, but of course there is a bigger picture.

      Great comment,

      James

  13. Great article, thank you for sharing it with us. I really enjoyed reading it; it’s very interesting and helpful to many people, including me.

  14. Steven says:

    James, I like your explanations for the test results. I subscribe to Which Test Won? and what’s really interesting is how hard it is to pick the winner. Week after week it’s rare for visitors to the site to pick the winner; usually there’s around a 50-50 split – which proves to me that you need to test.

  15. Ed says:

    Great article.

    For test 3, I presume that Version A won, so should this sentence:

    “The overall result is that the user feels it is less of a chore to complete the form over version A”

    not read:

    “The overall result is that the user feels it is less of a chore to complete the form over version B”?

  16. java guru says:

    One thing you pointed out was quite interesting to me. Different industries have different customer requirements, and things that work well for one will not work well for another. Like #7, where social proof was distracting from the real seller: the price. Price is not usually the #1 factor, but for some industries like insurance it probably is (insurance is something you don’t want but need to have, and you want to get away with paying as little as possible).

    • James Gardner says:

      Absolutely. One thing is for certain: to create a really good landing page you must understand your business. Selling a commodity product is very different to selling a value-add service; the decision making process is more involved for services and much less impulsive, especially as the cost increases.

      Check out the match.com example in one of my other posts to see how a business has broken all the landing page rules, but only because they have a really good handle on their business.

      Thanks,
      James

  17. scientist says:

    I love these case studies. In the course of running a/b tests at work, we often find that the results are shocking, but in retrospect, it all starts to make sense.

    That’s why my #1 rule is: “Don’t ever count anyone’s ideas out”. There have been times when my idea wasn’t even written down on the whiteboard because everyone thought it didn’t make sense. I was persistent and it made it into a test cell… It won a 4-6% increase in dollars per visit…

  18. Ezrad Lionel says:

    A/B testing is nonsense. It’s the next gen of snake oil. Confirmation bias at its best. Now A/B/C split testing is the real deal.

  19. Laust says:

    Really nice cases that show you shouldn’t trust your own assumptions. Data, testing, and numbers are king.
    Have a nice weekend.

  20. Smriti says:

    The post seems super interesting, James. The examples are great! But it is a little annoying when you are referring to an image and there’s no option to see a larger version to understand what you’re talking about. See #7, for example. Still, thanks for this awesome post. It’s very useful.
