Check Yourself Before You Wreck Yourself With A/B Testing

[Image: Ice Cube thinks you should check yourself before you wreck yourself (with A/B testing).]

If you’re a marketer, you likely already know the importance of A/B testing. But do you put your money where your mouth is? Are you testing as much as you could be?

You can’t go back in time and start running game-changing tests, but you can start testing right now.

At last year’s Call to Action Conference, Unbounce brought together three conversion rate optimization experts for their panel “Digital Marketing Through A Conversion Lens.” The panelists explained how to gamify testing, avoid assumptions and start testing smarter — and we’re bringing that knowledge directly to you.

What do these experts have to say about good tests, bad tests and everything in between? We’ve distilled their wisdom down to seven applicable tips.

1. Get all your colleagues involved and invested in A/B tests

In her past position as Head of Conversion Rate Optimization at Shopify, Tiffany Da Silva found that she wasn’t working as closely with her coworkers as she would have liked. Because she knew that more hands on deck equaled greater potential, she asked her SEM guy to set up and run a test with her – start to finish. She explained to him:

For you to understand what I do and for me to understand the problems that you’re having right now, we need to pick one keyword and follow it the whole way.

The results were dramatic: “When we actually worked together and did this test for ecommerce and created this one page, we were able to see a 33% increase [in conversions] for just that keyword.”

That was enough to get her coworker on board. Bringing in help from different departments adds fresh perspectives and a different set of experiences.

And having others invested in the process – start to finish – can result in big wins that foster more enthusiasm. As Da Silva puts it:

He’ll tell other people, and before long I have the guy who does Facebook coming up to me, and the guy who does SEO coming up to me, and we’re all working together to create really big tests.

2. Gamify your testing process

Want to get your team interested in testing, fast? Braden Hoeppner, Head of Ecommerce at Kit and Ace, has a solution.

Go find the person who disagrees with you and put cash on the table.

Put weird teams of people together, put $100 on the table, and say whoever designs the winning test wins.

Joanna Lord, VP of Consumer Marketing at Porch and former VP of Growth Marketing at Moz, agrees: “I lost $10 to my CEO last week.”

If you don’t want to use cash bets, you can still find fun ways to gamify the testing process.

Lord’s team uses a Plinko board: “At the bottom of the Plinko board are all of these things you can win, like gift certificates or Amazon gift cards.” People who create winning tests get to play Plinko and claim their prize.

When you gamify, remember that you can only win or lose within the context of the game. The test itself is never a loss, as long as you learn something from it.

“You can’t fail at testing,” Hoeppner says. “Your hypothesis gets proven or disproven, and you learn something.”
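
Under the hood, most testing tools make that “proven or disproven” call with something like a two-proportion z-test. Here’s a minimal sketch of that math in Python, with invented visitor and conversion counts (any real testing tool will run this for you):

```python
import math

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test: is the gap between two conversion rates real or noise?"""
    rate_a, rate_b = conv_a / n_a, conv_b / n_b
    # Pooled rate under the null hypothesis that the variants are identical
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (rate_b - rate_a) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))  # two-sided p-value
    return rate_a, rate_b, p_value

# Invented numbers: 2,400 visitors per variant
rate_a, rate_b, p = two_proportion_z_test(120, 2400, 156, 2400)
print(f"Control: {rate_a:.1%}  Variant: {rate_b:.1%}  p-value: {p:.3f}")
print("Hypothesis supported" if p < 0.05
      else "Hypothesis disproven; you still learned something")
```

With these made-up numbers, the variant’s 6.5% beats the control’s 5.0% at the usual 5% significance threshold; shrink the gap or the sample size and the same code tells you the difference is noise.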

3. Don’t prioritize beautiful design over conversion

Hoeppner described a test where he assumed an image of a beautiful model would improve conversion, but discovered that beauty doesn’t always convert. “Nothing happened,” he explained.

Lord also learned that beauty doesn’t always win.

When she and the Porch CEO were testing a landing page, she wanted to use “big, beautiful, bold imagery” — a strategy that wound up losing the test.

The takeaway?

A beautifully designed page won’t always take the cake.

Design is important, but only if it’s supporting the real star of your page: the copy. Start by writing (and testing) persuasive copy – and then work toward a design that complements it.

4. Don’t assume that testimonials are a magic bullet

Da Silva described an embarrassing moment when she told her boss they didn’t need to test their testimonials because “testimonials always work.”

Her boss asked her to test the testimonials anyway, and Da Silva quickly saw that she was wrong.

“Taking out testimonials was the biggest win I had,” Da Silva said. She would never have discovered that win if she had relied on her assumptions instead of testing.

So, as Oli Gardner puts it, testyourmonials.

A/B testing isn’t about following the rules. It’s about learning as much as you can and then questioning everything. And maybe even unlearning some things in the process.

5. Don’t lose track of your company’s voice

Joanna Lord warned testers not to lose their company’s point of view as they optimize their pages for conversion.

“That balance between conversion and the point of view of your company is essential,” Lord explains. “Make sure that you’re not over-indexing on conversion and losing what’s special and beautiful and rare about your brand or your voice.”

At Porch, Lord starts by putting together a list of winning words that represent the brand, then weighing them against their conversion potential.

Lord described an A/B test that appeared to indicate that the word “connection” led to high conversion rates, until the Porch team realized that the people who converted on “connection” weren’t qualified leads:

Those people were not revisiting. They were not engaging, their time on site was lower, their bounce was higher, their revisit rate was lower.

Understanding what type of language your most loyal customers relate to will help you optimize for the right type of conversions:

We might lose a little of that up-front conversion, but we’re winning in the long haul.
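
One way to put that into practice is to score each variant on a qualified-conversion rate, not just raw conversions. A minimal sketch with invented numbers: “connection” is from Lord’s example, “craftsmanship” is a hypothetical brand word, and “qualified” here is a made-up proxy (a converter who returns within 30 days) for the engagement signals she describes:

```python
# Invented per-variant numbers. "qualified" counts converters who came
# back within 30 days -- a stand-in for revisit rate, time on site and
# bounce rate.
variants = {
    "connection":    {"visitors": 5000, "conversions": 180, "qualified": 60},
    "craftsmanship": {"visitors": 5000, "conversions": 150, "qualified": 100},
}

for word, m in variants.items():
    raw_cvr = m["conversions"] / m["visitors"]
    qualified_cvr = m["qualified"] / m["visitors"]
    print(f"{word:14s} raw CVR: {raw_cvr:.1%}  qualified CVR: {qualified_cvr:.1%}")
```

“connection” wins on raw conversion rate (3.6% vs. 3.0%) but loses on qualified conversions (1.2% vs. 2.0%), which is exactly the trap Lord describes.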

6. Don’t assume that “wins” apply across all customer segments

Even if a test was conclusive for one of your customer segments, the takeaways won’t necessarily apply across the board.

Hoeppner learned this the hard way. When he was testing sliders on his ecommerce site, he found that they lost to a static image.

However, he’d only been testing for his US users. When he ran the test with Canadian users, he got very different results.

Surprise: “Canadians like sliders.”

The bottom line? Best practices can help guide your next A/B test – but err on the side of caution and test to validate assumptions.
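
Validating by segment can be as simple as running the same comparison once per segment before looking at the aggregate. A minimal sketch, with invented counts loosely modeled on Hoeppner’s slider test:

```python
# Invented (conversions, visitors) counts per segment and variant.
results = {
    "US": {"slider": (300, 10000), "static": (380, 10000)},
    "CA": {"slider": (130, 2000),  "static": (90, 2000)},
}

totals = {"slider": [0, 0], "static": [0, 0]}
for segment, variants in results.items():
    rates = {v: conv / n for v, (conv, n) in variants.items()}
    winner = max(rates, key=rates.get)
    print(f"{segment}: slider {rates['slider']:.1%} vs "
          f"static {rates['static']:.1%} -> {winner} wins")
    for v, (conv, n) in variants.items():
        totals[v][0] += conv
        totals[v][1] += n

aggregate = {v: conv / n for v, (conv, n) in totals.items()}
print(f"Aggregate: slider {aggregate['slider']:.1%} "
      f"vs static {aggregate['static']:.1%}")
```

With these made-up numbers, the static image wins in aggregate (3.9% vs. 3.6%), which would completely hide the fact that the slider is the clear winner for Canadian users.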

7. Watch out for downstream impacts

A test that leads to a conversion increase in one area of the funnel may have negative impacts further down the funnel.

“Whenever we change something, there’s some other downstream impact,” Hoeppner explained. “We’ll see click-through rates go up, and then dramatic drops in average order size.”

Newton’s third law tells us that every action has an equal and opposite reaction. It’s true for A/B testing, too. When you make changes based on the results of an A/B test, make sure you track both the positive and negative reactions that occur throughout the rest of your conversion process.

Look for those downstream effects, and make sure they aren’t hurting your conversion rates or sales.
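
In practice, that means pairing your headline metric with at least one metric from further down the funnel. A minimal sketch with invented numbers that echo Hoeppner’s click-through-up, order-size-down scenario:

```python
# Invented per-variant funnel numbers: variant B lifts click-through
# rate but shrinks average order value, so revenue per visitor drops.
funnel = {
    "A": {"visitors": 10000, "clicks": 800,  "orders": 200, "revenue": 24000.0},
    "B": {"visitors": 10000, "clicks": 1100, "orders": 230, "revenue": 20700.0},
}

for variant, m in funnel.items():
    ctr = m["clicks"] / m["visitors"]   # click-through rate
    aov = m["revenue"] / m["orders"]    # average order value
    rpv = m["revenue"] / m["visitors"]  # revenue per visitor
    print(f"{variant}: CTR {ctr:.1%}  AOV ${aov:,.2f}  RPV ${rpv:.2f}")
```

Judged on click-through rate alone, variant B looks like a 37% win; revenue per visitor shows it actually loses money.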

Ready to learn more?

Watch the “Full Stack Panel: Digital Marketing Through A Conversion Lens” video and learn even more tips from our experts.

Then get ready to improve your testing and your teamwork. Remember that it’s never too late to become a smarter A/B tester. As the Chinese proverb puts it:

The best time to plant a tree was 20 years ago. The second best time is now.

How are you going to improve your next A/B test?


About Nicole Dieker
Nicole Dieker is a freelance writer and copywriter. She writes the "How A Freelance Writer Makes A Living" column for The Billfold, and her work has also appeared in The Toast, Yearbook Office, Boing Boing, The Penny Hoarder, and The Freelancer by Contently. Follow her on Twitter @HelloTheFuture.