
5 Surprising A/B Test Results from Ridiculously Successful Entrepreneurs

So many marketers find it helpful to learn from the A/B test results of larger, more established businesses that have done the work of running regular tests on large volumes of traffic.

[Image: Lots of traffic isn’t always a bad thing…]

Of course, all audiences are different and your results may vary, but these case studies are often a great place to start. They can provide some serious inspiration and push you to think outside the box.

Today, we’re going to look at some of the less conventional A/B test results from successful companies, and how you can get big conversion boosts from trying unusual things.

I asked five entrepreneurs who have built successful businesses about the test results that have surprised them most along the way. Here’s what they had to say.

Wade Foster and Mike Knoop, Zapier


Wade and Mike are two of the co-founders of Zapier, the tool that helps more than 600,000 businesses integrate their favorite apps without having to write any code.

Zapier’s homepage was pretty simple. So simple, in fact, that they got criticized for it in a UserTesting.com post:

[Screenshot: Zapier’s homepage]

Ouch.

But as Mike shared in the comments, this simple page was the result of a surprising test.

For the homepage you highlighted, we tested several versions (including simply swapping the top orange and bottom explainer sections).

[Screenshot: the Zapier homepage variations tested]

The [version mentioned in the post] converted signups and active users more than the others (by a statistically significant margin).

[Screenshot: the homepage test results]

I was really surprised. My gut reaction agrees with this post but I was wrong in this instance. It’s proof that you should always mix qualitative and quantitative testing and never opt strictly for one or the other.

Mike’s point is an important one: it’s easy to look at a page and judge it “qualitatively” based on how it looks to you.

But that doesn’t tell the whole story.

The aesthetic of a page is one thing. But if a beautiful page doesn’t convert, it’s not useful. An ugly page that does convert, though, still makes money.

Wade adds:

We’ve since redesigned the homepage again with better results, but at the time this was really surprising.

Takeaway for your landing pages

The “best practice” is to make it immediately clear what your business can do for your visitor. But best practices don’t always win out. Zapier found this in their homepage test, but it applies to any landing page you’re working on. Play with your copy and test variations that provoke your visitor, whether they’re directly about your business or not.

David Hauser, Grasshopper


David co-founded Grasshopper, a company that he helped grow to more than $30M in annual revenue.

David’s team wanted to test whether crossing out the $25 activation fee on Grasshopper’s pricing page (to show that the visitor wouldn’t be charged the usual fee) would increase conversions.

Unfortunately, the tool they were using wasn’t perfect.

Our A/B testing tool had a bug that delayed the $25 activation fee from being crossed out until a few seconds after the page loaded.

This error ended up creating a much larger uplift than having the fee already crossed out on load, which we discovered when the bug was fixed.

The result now is that the activation fee shows, and then is crossed out after a few seconds.

[Screenshot: Grasshopper’s pricing page, with the activation fee crossed out]

Takeaway for your landing pages

Try using buggy testing software.

Okay, not really.

Get creative with the pricing on your landing pages, and test dynamic flourishes. If you’re offering a discount, try having the discount appear after a few seconds, once the full price has soaked into the visitor’s mind.
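
If you want to tinker with this idea, here’s a rough TypeScript sketch of a delayed price reveal. The element IDs and the three-second delay are placeholders made up for illustration, not anything Grasshopper actually shipped, and you’d run it as one variation inside whatever testing tool you use.

// Minimal sketch: show the full price first, then cross it out and
// reveal the discounted price after a short pause.
// Assumes hypothetical #full-price and #discount-price elements.
const REVEAL_DELAY_MS = 3000; // give the full price a few seconds to sink in

function revealDiscount(): void {
  const fullPrice = document.getElementById("full-price");
  const discountPrice = document.getElementById("discount-price");
  if (!fullPrice || !discountPrice) {
    return; // pricing elements aren't on this page, so do nothing
  }
  window.setTimeout(() => {
    fullPrice.style.textDecoration = "line-through"; // cross out the original price
    discountPrice.style.display = "inline";          // then show the discounted one
  }, REVEAL_DELAY_MS);
}

document.addEventListener("DOMContentLoaded", revealDiscount);

Test it against a control where the discount is visible on load, and let the conversion numbers decide whether the animation earns its keep.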

Neil Patel, Quick Sprout


Neil is the co-founder of Kissmetrics and Crazy Egg, and the founder of Quick Sprout.

He’s run a lot of tests, but even Neil fell victim to the assumptions of “prevailing wisdom.”

I used to believe that making the checkout process simple by having everything on one page would always boost conversions.

But in one test, I split things up onto two separate pages, and got an increase in conversions by 11%.

I was shocked.

Takeaway for your landing pages

Simple isn’t always better. Sometimes making your landing page visitors work harder to convert can work in your favor.

Experiment with adding additional pages, form fields and steps to your signup process.

Joanna Wiebe, CopyHackers


Joanna is the co-founder of Copy Hackers and Airstory, and one of the web’s top experts on all things landing page-related (get this free ebook, seriously).

Joanna was working with Metageek to help them increase sales of their downloadable software products.

The first page Joanna and her team turned their attention to was this one, one of Metageek’s most highly-trafficked pages:

[Screenshot: Metageek’s download page, with text download links]

Notice the download links in the left column: they’re text!

Joanna’s first question was: if we replace the text with more visually engaging buttons and more urgent messaging, will clicks increase?

And so they tested several variations, each of them incorporating big, colorful buttons instead of Metageek’s boring old text links:

[Screenshot: the button variations tested on Metageek’s page]

The result?

Every single test with a button reduced conversions, anywhere from 6% to a whopping 80%.

Yikes.

Metageek’s Wendy Fox, understanding her audience, offered an explanation for why the buttons were so unpopular:

It’s often a plain text link these days that gets you the clean download [ed. note: meaning that the link isn’t an ad or malware]. We could be experiencing “seasoned internet user” behavior on the download page.

Takeaway for your landing pages

Boring doesn’t necessarily mean bad. Text links on your landing page could be a great way to gain your visitors’ trust in a world full of big, colorful buttons competing for their attention. But you gotta test.

Hiten Shah, Kissmetrics


Hiten (along with Neil Patel) is the co-founder of Kissmetrics and Crazy Egg, and the man behind one of the best weekly newsletters in SaaS.

Like Neil, Hiten assumed that easier was better when it came to conversions. And like Neil, a homepage A/B test showed Hiten that his assumptions aren’t always true.

Rather than the standard email address or name field, the test called on visitors for some pretty unconventional information.

When we added just a single form field on the homepage, versus just the button, our conversions went up 36.5%.

[Screenshot: the Kissmetrics homepage form]

It wasn’t an email address field; instead, it asked people to enter their website URL.

Takeaway for your landing pages

Try asking your landing page visitors for information that’s different from what everyone else is asking for. We’re used to seeing forms that ask for our name and email address, but if you ask for something unconventional like a URL, it may catch your visitors’ attention.

What unconventional tests will you run?

“Best practices” tell us some of the more conventional things that we should all be testing, like our headlines and calls to action.

But thinking outside the box and running unusual tests is worth it too, even if they go against what the experts are telling you to do.

Hopefully these five case studies have given you ideas for your own unconventional tests.

Give them a try… you might just be surprised.

About Len Markidan
Len Markidan is the head of marketing at Groove, makers of simple help desk software for teams. Read his weekly posts on the Groove Customer Service Blog and follow him on Twitter.

Comments:

  1. Phil

    The comment about “seasoned web users” should ring true to anyone involved in A/B testing. The perception and knowledge of the user are always evolving. A failed test today might be a successful test in a year.

  2. Jaap

    My first thought when I saw the CopyHackers page was that those image buttons look horrifically like the dodgy ads you see on torrent sites or software download sites. I don’t think the lesson is necessarily to discard image-based calls to action, just that they should be well designed so they don’t look like they’re going to open 10 popups.

    • Paul Mather

      I have literally just commented the same. They look EXACTLY like the cheap dodgy ads you get on download pages. How was that not apparent to the team working on that? Then to conclude that for them, text links work better than images…

      No, text links work better than badly designed image buttons.

  3. Al Elliott

    Does anyone know what A/B Testing software Zapier are using? (Or do you have a preference yourself?)

    • Mike Fiorillo

      From the screenshot it looks like KissMetrics. So I’m guessing they’re using an in-house tool to randomly serve up different homepage versions to visitors and then using KissMetrics to measure the performance of each one.

      There’s a good reason for this too. Serving up vastly different homepages using JavaScript-based split testing tools like Optimizely, VWO, etc. is tough because you typically need to redirect visitors to different pages. This can slow things down and potentially cause some SEO issues if you’re not careful.

      • Al Elliott

        Wow! Thanks for this awesome reply!

        So, in your opinion, what’s the best ‘beginner’s setup’ for split testing? I’m using Instapage (like LeadPages) and it has variant testing baked in, and I’m about to install Woopra to track visitors. I have Google Analytics too.

        • Amrdeep Athwal

          If you’re going to be doing basic testing, I would use an A/B testing platform with a WYSIWYG editor so that you can change headlines, buttons, etc.

          If you are doing radically different pages, use something like Unbounce, or use an A/B testing platform like Optimizely to run a URL redirect test.

          When you are starting out, I would stick with the WYSIWYG editor; when you do a more advanced redesign, I would go for the URL redirect option.

  4. David

    “But in one test, I split things up onto two separate pages, and got an increase in conversions by 11%.” I like this point because, to me, this is what A/B testing is all about.

  5. Josh Rodriguez

    This is really great! A lot of good insights and reminders to periodically go against the normal run of the mill. Thanks for putting this post together.

  6. Netta

    The mention about Joanna’s test and the buttons really stuck with me. Initially I would have gone for a buttons test too! But when you think about it, it’s true. People now avoid the flashing banners and the big “download now” buttons because they are associated with ads and tricks. It’s so important to test and to keep in mind that you’re testing for a reason! To learn more about the user! A truly great reminder about continual testing and how some things that may work for one place may not work for another. Thanks Len!

  7. Sivakumar

    Seriously, this is a great read about the Zapier A/B tests. Likewise, I’ve wondered how Facebook ran its A/B tests in Japan and Russia. The way the Facebook growth team did its testing was awesome, and it bore fruit: http://www.simplilearn.com/how-to-become-a-growth-hacker-article

  8. Charles Prescott

    Thanks, great post!

  9. Cartesian Creative

    A/B tests should never have surprising results. You are choosing between two carefully considered design choices, not randomly throwing ideas at a wall to make them stick.

    That one of the design patterns was discovered through an accidental bug reinforces the point: you should not be getting good design by accident.

    Design well, then test the output of that good design with A/B testing.

    Don’t punish A/B subjects by experimenting with design guesses.

  10. Paul Mather

    In the example of CopyHackers, those buttons look like images.

    What they, and we as lead generators/conversion optimisers, tend to do is make assumptions, and testing often proves us wrong. But then we make assumptions about the outcome, as seen here in the “Takeaway” section.

    For example; “Simple isn’t always better. Sometimes making your landing page visitors work harder to convert can work in your favour.”

    That is an assumption that the two page process increased conversions because it made users work harder. Perhaps it was less overwhelming? Perhaps it was something else. But the point is, it is important to get out of the habit of jumping to conclusions and making assumptions.

    • Paul Mather

      I meant to say they look like ads… I couldn’t edit my comment to correct it. Doh!
