Usability and A/B Testing – A Special Relationship

By Chris Gibbins, September 14th, 2011 in A/B Testing | 11 comments

Website optimization teams are more effective when UX Consultants and AB/Multivariate Testing experts work closely together.

Why is the relationship so important?

There are few who would dispute the importance of testing new website optimization ideas (hypotheses) in a live environment, with real users, using tools such as A/B testing software, before final implementation on your website. But the optimization ideas need to come from somewhere.
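To make this concrete, here is a minimal sketch (not from the article) of how A/B testing tools typically split live traffic: each visitor is deterministically bucketed into a variant, so a hypothesis can be evaluated with real users before final implementation. The function name and the even split are assumptions for illustration only.

```python
import hashlib

def assign_variant(visitor_id: str, experiment: str, variants=("A", "B")) -> str:
    """Deterministically bucket a visitor into a test variant.

    Hashing the visitor id together with the experiment name means each
    visitor always sees the same variant on repeat visits, while traffic
    splits roughly evenly across variants overall.
    """
    digest = hashlib.sha256(f"{experiment}:{visitor_id}".encode()).hexdigest()
    bucket = int(digest, 16) % len(variants)
    return variants[bucket]
```

Real tools add traffic allocation, exclusion rules and reporting on top, but the core idea is just stable random assignment.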

And who is better qualified and has a better understanding of user behaviour than a User Experience or Usability consultant?

Hundreds of hours spent watching and listening to users, sharing their pain and delight during website interaction, naturally results in a good understanding of what makes users tick. And as optimizing a website is all about influencing the behaviour of users in line with business objectives, their role in the orchestration of test experiments is really important.

A quick counseling session to heal a broken relationship

A few months ago at a London conference, a Multivariate Testing (MVT) salesman said to me:

“We enjoy proving usability consultants wrong!”

Whilst this is quite an extreme viewpoint, it’s clear that not all AB/MVT professionals are sold on the idea of usability. And in a similar way, many of the Usability consultants I have worked with over the years have been just as skeptical about AB/Multivariate Testing.

So in an attempt to build some bridges between these two camps, we’re going to start by airing each camp’s issues and concerns.

Here are the kinds of comments MVT companies make about usability:

  • “Usability testing is not statistically significant due to the small sample sizes.”
  • “Participants do not behave naturally in a usability lab when there are cameras pointing at them and a moderator asking them questions.”
  • “It’s so expensive as you need to compensate participants.”
  • “It’s just a few individuals’ opinion – what do they know?”
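The “statistical significance” objection is worth unpacking: split-testing tools typically judge a winner with something like a two-proportion z-test, which needs traffic volumes far beyond a usability lab’s handful of participants. A minimal standard-library sketch (the figures are illustrative, not taken from any real test):

```python
import math

def ab_significance(conv_a, n_a, conv_b, n_b):
    """Two-sided p-value from a two-proportion z-test: how likely is a
    difference this large between two observed conversion rates if the
    variants actually perform the same?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)           # pooled conversion rate
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    return math.erfc(abs(z) / math.sqrt(2))            # standard normal, two-sided

# The same 10% vs 12% lift, at two sample sizes:
ab_significance(10, 100, 12, 100)      # 100 visitors each: p well above 0.05
ab_significance(500, 5000, 600, 5000)  # 5,000 visitors each: p well below 0.05
```

This is exactly why the two methods answer different questions: five users in a lab can tell you *why* something fails, but only thousands of visitors can tell you *whether* a change reliably moves a conversion rate.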

And here are some comments about AB/Multivariate Testing from the Usability and User Experience camp:

  • “Where do the MVT companies get their optimization ideas (hypotheses)? They don’t seem to be based on evidence or insights.”
  • “How can they create the design variations for optimizing the website if they don’t have User Experience Professionals, Interaction Designers or User Interface Designers? And they don’t do Usability?”
  • “Everything just seems very random; it’s as if they’re just throwing things against a wall and seeing what sticks.”
  • “How could a change to the colour of a button have resulted in a huge conversion rate increase unless there was an underlying issue with the button in the first place?”

It’s just a question of language

One of the reasons why Usability and AB/MVT experts struggle to understand each other as well as they should is their use of language: they often talk about the same things in completely different terms. This language barrier, on top of their different viewpoints, makes it that little bit harder to get along.

Usability (and User Experience) ↔ MVT (and Analytics)

  • Users ↔ Visitors
  • Recommendations ↔ Hypotheses
  • Design solutions ↔ Experiments
  • Issues ↔ Pain-points, blockages, optimization opportunities
  • User journeys, paths ↔ Goal funnels, fall-out reports
  • Personas ↔ Segments
  • Improving the user experience, making the site easier to use ↔ Improving conversion rates
  • Severity of issues ↔ Business impact, confidence

The following Wordle word clouds are snapshots of the Wikipedia page content for Usability and for Multivariate Testing. Try finding the word “user” in the MVT word cloud, or “visitor” in the Usability one! Notice, too, how prominent “design” is in the Usability word cloud, and yet how hard it is to spot in the MVT word cloud.

Usability Wikipedia content wordcloud

Multivariate Testing Wikipedia content wordcloud

The future

UX/Usability guys are wising up to the idea of AB/MVT as a very useful tool for evaluating and fine-tuning all the great recommendations which come out of usability testing sessions and other qualitative research such as surveys, remote usability testing and so on.

Websites such as whichtestwon.com have definitely helped raise the profile of AB and MVT in the user experience arena. And the fact that so many of the winning variants are due to enhancements to the user experience – just as often as they are due to persuasive design patterns and messaging – again reinforces everything the UX and usability consultants believe in, and makes them want to use the tool.

The fact that tools are now so readily available increases their exposure to the wonders of MVT. Visual Website Optimizer (for testing existing site pages) and Unbounce (for building and testing new landing pages) are so accessible, there is no excuse not to have a go and learn all about them, even if you don’t know an “experiment” from a “hypothesis”!

Check out the Econsultancy MVT buyers guide for more examples.

Testing companies are now showing a greater interest in usability, or at least pretending to do usability. This suggests they are realizing its importance, or at least have realized that talking about usability to prospective clients helps allay fears that the MVT house isn’t a user-centric company.

Conclusion

Usability and AB/MVT specialists can and should work together. The disciplines are totally complementary when individuals from either camp accept the limitations of their own tools and learn to recognise the strengths of the other’s.

Finally, working in silos is not the way forward. After all, we’re all working towards the same goal: to make your online business more successful. Please let me know your thoughts and feel free to disagree with anything I have said. I would love to hear from MVT companies who are usability testing fans, and from usability people who like nothing better than split testing.

Chris Gibbins

This is a guest post, all opinions expressed are those of the author.

Chris Gibbins oversees the User Experience services at RedEye. He has extensive hands-on experience in usability testing, UX research, eye tracking and user-centred design; plus, a background in web development, animation, illustration and art.

He has been instrumental in the development and implementation of RedEye’s Conversion Rate Optimization (CRO) strategy combining Usability, Analytics and A/B and Multivariate Testing.

You can follow him on twitter @cjgibbins.

Comments

  1. gbonthron says:

    It’s simple – good research, UX, design and clear goals = better testing and better results. Not being risk averse also helps.

    • Oli Gardner says:

      Agreed. And you raise a good point re: being risk averse.

      You’ll never get very far in the optimization game if you’re not willing to take some risks. Not all tests will be successful, but that’s all part of the learning process.

  2. Nit says:

    Usability: understanding who the target audience is, what it needs, and how it therefore reacts to the system.

    Split testing: delivering various ‘messages’ knowing that one of the presentations (the sum of medium, message, value, action asked, etc.) will leave the audience more positively influenced than the rest.

    Apart from the fact that these two practices evolved from completely different fields, I really don’t see why an expert who optimizes websites would not use a process involving both usability and split testing methods at one or more stages.

    The way I see it: no matter how well you know your partner, you cannot be 100% sure 100% of the time that they will completely enjoy the gift you bought for them. And on the other hand, you cannot keep buying gifts for your partner without ever asking them what they like, waiting until they show that they like one.

    • Great metaphor!
      For some website optimizer’s out there, it’s a case of the fear of the unknown. Especially when the unknown is something outside of their comfort zone – like usability for split-testers or vice-versa. The answer is to just give it a try – and these days it’s so much easier to get started – with accessible MVT software, great landing page tools etc. And even usability doesn’t have to be expensive. Hall testing or Gorilla testing with your potential end-users/customers is a fantastic way to get rich insights into what to test and how to improve your website.


  4. Leo McDevitt says:

    I’m amazed that not all companies are combining usability and A/B multivariate testing. I am usually the cheerleader-in-chief at my company for combining the two.

    We have not seen any benefit in page rank or traffic volume from usability optimization (my term for it) but we have seen a profound effect in conversions. Usability improvements may also be having a positive impact on our backlink portfolio, but we have no metrics to demonstrate this yet.



  7. Glenny says:

    What’s up, just wanted to say, I liked this post. It was practical. Keep on posting!


  9. I follow ‘which test won’ quite a bit. I have to say that I don’t always agree with their ‘findings’. Often the margins are so close that I’d consider the tens of thousands of dollars spent on testing a waste of money – but I do agree that it has to be looked at.