Conversion Heroes is a series of 5-question interviews with experts in the field of conversion. Subjects for discussion include landing pages, copywriting, conversion optimization, social media conversion, email marketing, organic SEO for landing pages and A/B & multivariate testing.
Today’s Conversion Hero is Paras Chopra
Paras Chopra is the founder of Wingify, a company that makes Visual Website Optimizer — the world’s easiest A/B testing and multivariate testing tool. His aim with Visual Website Optimizer is to bring the joy of testing and conversion rate optimization to every marketer in the world by eliminating all IT integration issues and by making it dead simple to use. You can read his personal blog and the I Love Split Testing blog. You can also follow him on Twitter @wingify.
Many people have experienced the classic boardroom argument, where everyone thinks they have the best idea. Testing puts this type of conjecture to rest. Today’s Conversion Hero is Paras Chopra from VisualWebsiteOptimizer.com, who’s going to share some of his insight into the concept of comparative split testing.
1. A/B testing vs. multivariate testing (MVT)
Oli: Let’s start with a question many people seem to ask. What’s your take on when each method is most appropriate?
Paras: The eligibility criterion for each method is, of course, traffic. You should not attempt MVT if you don’t have enough traffic on the site. But assuming traffic isn’t a constraint, MVT works best when you are hyper-optimizing. That is, when your aim is to squeeze the last drop of conversion rate juice from your existing design. On the other hand, A/B testing should be used if you want to test completely different designs and ideas. Ideally, an organization should do lots of MVT tests followed by a few large A/B tests.
Oli: What factors influence the decision when choosing a methodology?
Paras: As I said, traffic, the number of design resources available, and the objectives of testing are the main factors that influence the decision. MVT typically requires fewer design resources than large-scale A/B test changes. Moreover, if the objective is to optimize an existing design, MVT (or a single-element change) is the way to go. But if you want to make radical changes to the page (say, a layout change, theme change, etc.), you will go with A/B testing.
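Paras’s traffic caveat follows from simple arithmetic: a full-factorial multivariate test creates one combination for every possible mix of element variations, and your traffic gets split across all of them. A minimal sketch (the element names and variation counts below are hypothetical, just for illustration):

```python
# Hypothetical full-factorial MVT: the number of combinations is the
# product of the variation counts for each element under test.
variations = {"headline": 3, "image": 3, "call_to_action": 2}

combinations = 1
for element, count in variations.items():
    combinations *= count

print(combinations)  # 3 * 3 * 2 = 18 combinations competing for traffic
```

Even this modest test splits visitors 18 ways, which is why MVT is reserved for high-traffic pages, while an A/B test of a few whole-page designs can reach a conclusion at much lower volumes.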
2. Big bang vs. iteration
Oli: People often struggle when they start testing with whether to change their entire page to test a new experience, or to make small-step changes to a single element. What are the pros and cons of each approach?
Paras: For starters, I always recommend beginning with small-step changes in order to truly appreciate the value of testing. Ideally, people should pick a sweet spot on their page (ideal candidates: call-to-action, headline and image) and optimize it with a simple A/B test. Only once they get the hang of the whole process should they attempt MVT or a large-scale A/B test. That’s because it is easy to get demotivated or draw erroneous inferences if you are running a complex experiment without much previous experience.
3. What should I test on my landing page?
Oli: What page elements should people be testing?
Paras: Though the answer to the question will depend on the specific URL and organization, here are some of my favorites:
- The King: Call-to-Action
- The Queen: Headline
- Others: Copy, image, number of form fields, number of steps in funnel, required vs. optional steps, number of elements on page, amount of text on page, layout (left vs. right kind of tests)
Oli: Are there things that you shouldn’t test, because the results will be insignificant?
- Pricing. It is incredibly risky and almost borders on being illegal. You should never offer the exact same service or product at different price points to different customers. There are subtleties to handling a price test, so one should really think this through before setting up such a test.
- Trivial elements on the page. Theoretically, every part of a page impacts the conversion rate, but many elements only start to matter if you have huge traffic (and patience) while testing. For example, you may discover that changing a copyright notice increases the conversion rate from 1.1% to 1.115%. Such trivial changes aren’t worth testing to me (of course there are exceptions, so go with your judgment if you really think the copyright notice impacts conversion rate).
4. For the love of testing!
Paras: The inspiration behind Visual Website Optimizer was to create the world’s easiest A/B and multivariate testing software, which marketers can use to optimize their websites and landing pages without needing to involve the technology department. Unlike Unbounce, which is a beautiful landing page creation tool, our aim with Visual Website Optimizer is to enable marketers to optimize existing pages. Say you are a Marketing Manager at an e-commerce store and you are looking to increase sales. You would then use our product to optimize pages such as the homepage, product catalog, product details, checkout page and perhaps some landing pages as well.
The motivation for the “I Love Split Testing” blog is to create a community around A/B and multivariate testing that discovers the potential of testing (and of course our product) by means of articles, case studies, advice, resources, etc. We now have 10+ case studies on the blog, which people immensely enjoy reading and discussing.
Oli: I’m going to put you on the spot here. How much of your own dog food do you eat? In other words, do you run experiments on your own website? And what have you learned from this?
Paras: Oh yes, definitely. At any given moment, we are probably running 4-5 experiments on our website. We have had miscellaneous learnings; for example, embedding FREE TRIAL alongside the pricing table converted much better than having it below the table (separately). Another learning for us was that having a ‘Watch a Short Video’ button on the homepage converted much better than a ‘Sign up for Free Trial’ button (probably because visitors want to research the product first – nobody seems to simply sign up right after arriving on the homepage).
5. The surprising results of testing
Oli: Testing can either confirm or refute your design & messaging concepts. What are some of your favourite examples of experiments that produced surprising or unexpected results?
Paras: A recent test was very surprising: it found that removing a secure icon from the page actually increased conversions by 400%. Another surprising result was that simply adding a human photo to a homepage can potentially double the conversion rate.
And as I said, one of the test results on our homepage really goes against the standard advice of having a ‘Signup’ button prominently featured on the homepage. We found that a ‘Signup’ button actually decreased eventual sign-ups, and ‘Watch a short video’ worked much better because, after watching the video, visitors were sure of what they were signing up for. (We had a ‘Signup’ button on the video page, by the way.)
Thanks to Paras for being our latest Conversion Hero and sharing his knowledge with Unbounce blog readers.
More Conversion Heroes
Part 1: Roberta Rosenberg on Copywriting for Landing Pages
Part 2: Dan Martell on Social Media Conversion
Part 3: Paras Chopra on Split Testing
Part 4: John Hossack on PPC
Part 5: Chris Goward on Conversion Rate Optimization
Part 6: Cindy Alvarez on Point-of-Conversion Feedback
Part 7: Tim Ash on Landing Page Optimization
Question to readers
What surprising or successful stories do you have regarding testing?