Are Excessively Positive Customer Reviews Hurting Your Conversion Rates? [Study]


Customer reviews may scare some marketers because they give consumers power, but they are a key component of success on the web.

Thirty percent of users turn to Amazon to research products before buying, in large part because of the customer reviews. Only 13 percent of consumers turn to Google.

Figleaves.com was able to boost conversions by 35 percent by adding customer reviews. eSpares.co.uk increased conversions by 14.2 percent. Walker Sands boosted a client’s conversion rate from 9.5 to 10.5 percent. Need I go on?

Nothing breeds trust quite the way an unfiltered customer review stream does, but that doesn’t mean you should just add a template review form to your site. We need to understand how and why these reviews are influencing sales and we need to organize them in a way that will optimize conversions.

To do that, we need data-driven strategies. The good news? I’ve got some for you.

How customer reviews impact conversions

To investigate how reviews influence conversions, a team of research professors turned to the obvious place: Amazon. The results of the experiment were published in the Journal of Marketing.

The team pulled data on Amazon conversions for 591 books with 18,682 customer reviews. To avoid confusing cause and effect, they collected data on other factors that could influence conversions, such as price, review volume, genre and advertising (which was found to be so insignificant in this case that it was disregarded).

Then they employed text analysis tools to evaluate two properties of the reviews:

  • The positive or negative emotional tone of the review (more than 100 studies have demonstrated that this process results in outcomes similar to those of human raters)
  • The linguistic style of the review. They did this by evaluating how similar each review was to other reviews for books in the same genre, based on how often certain words were used. Previous research has shown that when people share common jargon and slang, they develop a shared sense of identity and trust each other more, no matter how little they actually know about each other.
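For a rough feel of how the second measure works, style similarity can be approximated by comparing a review's word-frequency profile against the average profile of other reviews in the same genre. This is a deliberately simplified sketch (the researchers used dedicated text analysis software; the function names here are my own):

```python
from collections import Counter
import math

def word_freqs(text):
    """Lowercase bag-of-words frequency vector for a piece of text."""
    counts = Counter(text.lower().split())
    total = sum(counts.values())
    return {w: c / total for w, c in counts.items()}

def style_similarity(review, genre_reviews):
    """Cosine similarity between a review's word frequencies and the
    average word frequencies across other reviews in the same genre.
    Returns a value between 0 (no overlap) and 1 (identical profile)."""
    genre = Counter()
    for r in genre_reviews:
        for w, f in word_freqs(r).items():
            genre[w] += f / len(genre_reviews)
    rv = word_freqs(review)
    dot = sum(rv[w] * genre.get(w, 0.0) for w in rv)
    norm_r = math.sqrt(sum(f * f for f in rv.values()))
    norm_g = math.sqrt(sum(f * f for f in genre.values()))
    return dot / (norm_r * norm_g) if norm_r and norm_g else 0.0
```

A review that uses the genre's jargon ("steampunk," "uncanny valley") scores higher against a sci-fi review pool than one that doesn't, which is exactly the shared-vocabulary effect the study measured.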

Not surprisingly, the researchers found that negative reviews had a stronger impact on sales than positive reviews did. But what’s interesting is that the sheer number of reviews, regardless of what they discussed or how favorable they were, had a positive impact on sales.

This effect has been backed up by Reevoo (and many others):

[Chart: impact of customer review volume on conversions]

The study also found that positive star ratings didn’t influence sales. Only the content of the reviews impacted conversions. This might be because everybody has a different definition of what counts as a 5 star or 1 star product, and the review itself contains the information readers need to make that judgment for themselves.

But here’s the thing…

While the star ratings themselves didn’t influence sales, the variability in star ratings positively influenced sales.

In other words, if a visitor sees nothing but 5 star reviews, they get suspicious.

Keep in mind that this is on Amazon, a trusted brand where the default assumption is that the customer reviews are real. This skepticism can only get worse if the user reviews are on a platform they’ve never seen before.

In short, while a plethora of negative reviews is going to sink your product, a collection of excessively happy customer reviews will have your visitors crying “Fake!”

There’s more to it than positive and negative

As we mentioned earlier, the study looked at not just the emotional tone of the reviews, but their linguistic style, which did, in fact, make a difference.

Again, this comes down to jargon and slang. Readers will probably find a sci-fi reviewer more trustworthy if they reference “the uncanny valley” or “steampunk.” A self-help reviewer might be taken more seriously if they mention “affirmations” or “Stephen Covey.”

Younger readers will probably respond better to something that’s “epic,” as opposed to “cool.”

This is where things can start to get a bit more complicated. How do you measure “linguistic style” without advanced text analysis software? Does it even matter to a marketer?

Yes, it does. And there are a surprising number of ways that you can influence it:

  • Including editorial reviews that match the desired linguistic style
  • Calling out customer reviews that match the desired linguistic style
  • Using the appropriate linguistic style within the product descriptions and landing page
  • Including brief reviewer guidelines that can direct the reviewer’s writing style

Identifying the desired linguistic style to begin with is more difficult, and without text analysis tools it’s impossible to pull off quantitatively. Still, simply being aware of its existence should give you an incentive to test with:

  • Heatmap and behavioral data – linked to specific users if possible – that indicates which reviews have the strongest impact on conversion rates
  • Survey forms like Qualaroo, which could ask for feedback on the linguistic style, with questions like “How did you feel about the use of jargon on this page?”

There’s one more major reason you should care about linguistic style. If a review is positive and has the appropriate linguistic style, the impact on conversions is even stronger:

[Chart: conversion rate by review tone and linguistic style matching (LSM)]

This graph comes from the paper. The vertical axis represents the change in conversion rate. The horizontal axis represents the positive or negative emotional tone of the review, and LSM stands for “linguistic style matching.”

As you can see, the dotted line representing “LSM high” rises above the others as the positive tone of the review gets stronger.

In other words, the better the linguistic style match, the more tolerant consumers are of excessively positive reviews.

Sorting reviews by most recent? Bad idea.

Most people aren’t going to read all of the reviews, just the first few of them. If the reviews are organized by something as arbitrary as how recent they are, you’re going to miss out on some serious conversion opportunities.

So, how should you organize the posts? Let’s start with what not to do:

Don’t sort by most positive rating. First off, remember that the star rating itself doesn’t actually influence sales; only the content of the review matters. Second, remember that a wider range of star values actually increases conversions, despite the fact that this means more negative reviews are visible.

Instead, you should consider testing these and other alternatives:

  • Allow users to rate each review, and then sort them by most helpful. The study found that the most helpful reviews on Amazon were also the most likely to use the appropriate linguistic style.
  • Show users a representative sample of the variability in star ratings. In other words, if 50 percent of the reviews have 5 stars, show them a list of 10 reviews, 5 of them with 5 star ratings.
  • Experiment with showing the most positive reviews from each star rating. In other words, show the 1, 2, and 3 star ratings whose actual content reflects the product most positively.
  • Experiment with showing the best linguistic style for each star rating.
  • If you have the resources to code this solution, try presenting the reviews in random order, measuring how each influences conversion rates, and then sorting the reviews by strongest impact on conversions.
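The second option above, showing a representative sample of the star-rating spread, is straightforward to prototype. Here is a minimal sketch, assuming reviews are stored as `(stars, text)` tuples; the function name and quota logic are my own illustration, not a prescription:

```python
import random

def representative_sample(reviews, n=10, seed=None):
    """Pick n reviews whose star-rating mix mirrors the full set.

    `reviews` is a list of (stars, text) tuples. Each star value gets a
    quota proportional to its share of all reviews; any rounding
    shortfall is topped up from the remaining pool.
    """
    rng = random.Random(seed)
    by_star = {}
    for review in reviews:
        by_star.setdefault(review[0], []).append(review)
    total = len(reviews)
    sample = []
    for stars, group in sorted(by_star.items()):
        quota = round(n * len(group) / total)
        sample.extend(rng.sample(group, min(quota, len(group))))
    # Rounding can leave the sample slightly off target: trim or top up.
    if len(sample) > n:
        sample = sample[:n]
    else:
        remaining = [r for r in reviews if r not in sample]
        sample.extend(rng.sample(remaining, n - len(sample)))
    return sample
```

So a catalog where half the reviews are 5 star would surface a 10-review list with five 5 star entries, keeping the visible spread honest. The same skeleton extends to the other options: swap the per-star quota for a "most helpful" sort within each star bucket, for instance.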

If I had to choose an option without running any tests, I’d probably run with some combination of most helpful and most representative.

Here’s what I mean: it’s possible that if you sorted exclusively by most helpful, you’d end up with too many positive reviews and users would become skeptical. Including some negative reviews to show a wider range of opinions may actually boost conversions.

But again, as always, you’ll never know without testing.

– Pratik Dholakiya


About The Author


Pratik Dholakiya is the Co-Founder & VP of Marketing of E2M Solutions and Only Design. E2M Solutions focuses on content marketing and leveraging its potential to generate revenue for clients. Only Design helps companies build websites that convert. Hit him up on Twitter for a quick chat.

Comments

  1. Raquel Hirsch says:

    What a fantastic article! Thank you (I am afraid to sound too positive, in fear of hurting the post’s credibility… but there are so many good ideas and test hypotheses here that I cannot help myself :>)

  2. Jen says:

    Wow, thank you for sharing all this – it’s insanely useful and applicable information.

    I found it fascinating that only the content of the positive reviews, and not the positive star rating itself, impacted conversions. Websites that allow visitors to rate a product without leaving a review are really doing themselves a disservice by missing out on such a valuable and simple conversion booster.

  3. Miles says:

    Cool! … I mean Epic. I’m showing my age.
    Great article. Reviews are now also important from a PPC perspective as Google Adwords allows you to utilize these in your ads.

  4. Scott Graham says:

    Hmmm… The more positive reviews, the more people will buy… and the more people that will undoubtedly review. Negative reviews… fewer people buy… fewer people review. Their study treats variables like they are independent when they are, in fact, interdependent.

    • Not true. They used regression analysis to control for these interactions.

      Why does every post on a study have to have a backseat statistician in the comment section?

  5. Great post! I disagree with the sorting recommendation for reviews, though. Sorting by most recent is extremely important for SEO purposes and can have a significant impact on search rankings and dynamic meta descriptions. Additionally, sorting by “most helpful” generally results in the same “most helpful” reviews perpetually being up-rated by others, minimizing the opportunity for newer, fresher reviews to be rated as helpful. The best way to work around this for both SEO and CRO is to sort by most recent and then feature ~2 most helpful ratings above the standard feed. For folks using Bazaarvoice, that platform has an awesome “review summary” feature that visually depicts this info and dynamically summarizes the top, helpful, and low reviews for easy user access. See The North Face as an example: http://www.thenorthface.com/catalog/sc-gear/womens-jackets-vests/women-39-s-parkway-jacket_2.html

  6. Bilal Ahmad says:

    Some very important points are discussed in this post. I really think that it is important for a product to have mixed reviews. Customers always look for negative reviews before buying a product.

  7. As long as everything is, or looks, natural… The number of overall ratings and the ratio between good and bad reviews is the key here. The balance between these two should, in my opinion, be 8:2 (good vs. bad). The positive ratings should vary, but a POSITIVE rating shouldn’t include very negative points, as that affects the user even more. Basically, it makes the user discount positive reviews and focus their attention a little bit too long on the bad reviews only.

  8. Benoit says:

    Please note that a French company, AFNOR Certification, has published rules for certifying apps and websites that publish reviews, based on the standard NF Z74-501.
    All documents are available in English.

  9. Henrik says:

    Fantastic post!

    This post highlights a very important issue for many major companies around the world – using “freebies” for their reviews. Spending a little money and analytical power can have a huge impact on sales.