Continuing our investigation into what reviews can do for your eCommerce brand, I want to take a look at conversion rates. Just like with SEO (if you missed those articles, you can check them here), there is an unshakable mantra repeated by every review platform under the sun: reviews will skyrocket your conversion rate, so it's the only thing you should focus on, and you should pay for the most expensive plan possible. Sarcasm aside, as with all oft-repeated statements, there is a certain amount of truth in there, but it's always better for you and your budget to know where that truth lies instead of blindly following the "best advice" of vendors who are incentivized to tell you these "truths".
Conventional Wisdom about Review Conversion Rate
Let's take an example claim from an article ranked on the first page of Google that is not affiliated with any review platform, to see the conventional wisdom and the ways we can verify it.

Here, the standard claim is that the conversion rate doubles if you have just a few quality reviews and show them to the right customer. However, this varies heavily from brand to brand, and "review quality" is too ambiguous to verify. Let's look at the next claim.

Before making a decision, every person wants some social proof; it's just human nature, because we usually don't have perfect information about a product, and we trust other people's experience more than the seller's marketing claims. Having said that, this is another claim that, while obviously true, is very difficult to measure.

https://www.powerreviews.com/blog/review-volume-conversion-impact/
Finally, we have a claim from PowerReviews that we can test! They mention in the article that the chart is based on data from their platform. Let's see what our data says, since we collect the same data for a number of our customers.
Testing Conversion Rate by Review Count with Real-Life eCommerce Data
For this test, we use actual (non-sampled) clickstream data from an eCommerce brand, collected using the Snowplow event tracker, together with order data sent directly to StackTome.
The logic is simple. We:
- Count the number of visits to each product page
- Count the number of orders placed for that product
- Take the review count on the given day for the given product
- Group products by review count into 4 cohorts: 0-10, 10-50, 50-100, 100+
- Calculate the conversion rate per day for each product with a given review count
We use one year of data covering 500K unique product-page visits and 150K orders across 200 products, so the sample size is big enough to see whether review count has any effect. A sketch of the computation is shown below.
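As a minimal sketch of this computation (the ProductDay shape and its field names are our own illustration, not StackTome's actual schema), the cohort grouping could look like this:

```typescript
// Minimal sketch of the cohort computation. Field names (productId, date,
// visits, orders, reviewCount) are illustrative, not an actual schema.
interface ProductDay {
  productId: string;
  date: string;        // e.g. "2024-03-15"
  visits: number;      // product-page visits that day
  orders: number;      // orders for that product that day
  reviewCount: number; // reviews live on the product that day
}

// Assign each product-day to a review-count cohort.
function cohortOf(reviewCount: number): string {
  if (reviewCount <= 10) return "0-10";
  if (reviewCount <= 50) return "10-50";
  if (reviewCount <= 100) return "50-100";
  return "100+";
}

// Average the daily conversion rates (orders / visits) per cohort.
function conversionByCohort(rows: ProductDay[]): Map<string, number> {
  const sums = new Map<string, { cr: number; n: number }>();
  for (const row of rows) {
    if (row.visits === 0) continue; // no traffic, no measurable conversion
    const key = cohortOf(row.reviewCount);
    const acc = sums.get(key) ?? { cr: 0, n: 0 };
    acc.cr += row.orders / row.visits;
    acc.n += 1;
    sums.set(key, acc);
  }
  const result = new Map<string, number>();
  for (const [key, { cr, n }] of sums) result.set(key, cr / n);
  return result;
}
```

Averaging daily rates per cohort, rather than pooling all visits together, keeps a few high-traffic days from dominating the comparison.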
Here are our results:

However, the results should still be taken with a grain of salt, because conversion rate is affected by other factors that might skew the results. For example, some products have a higher conversion rate during the holiday season due to their affinity with the occasion, or when they are heavily discounted. To truly isolate the impact of review count on conversion rate, we would need to spend time selecting products without that skew, but that is outside the scope of this article. Having said that, we can see some indication that review count has an impact, so let's see how much, comparing our real-life eCommerce data with the example article's claimed chart.

So, as you can see, the conversion rate improvement is less drastic than the one claimed in the PowerReviews report: in our data, 100+ reviews tend to increase the conversion rate by 40% rather than 250%. Even so, having more reviews on the product page does tend to increase the conversion rate. One thing to note: products with 1-10 reviews converted better than those with 10-50, most likely due to the smaller sample size or seasonal products.
Do customers who read reviews convert better?
Now, for the final part, we want to answer a question: is it worth displaying reviews at all? For example, you might be wondering whether it's worth all the effort of collecting reviews and the extra expense of paying for review platform subscriptions.
The best way to answer this is to analyze visitor data: track visitors who read reviews, then compare their metrics against those who don't. You can set this up for free using Google Tag Manager (GTM), for example as follows:
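As a minimal sketch (assuming your review widget renders inside an element with id review-widget; the reviews_read event name is our own choice, not a standard one), a GTM Custom HTML tag could look like this:

```typescript
// Minimal sketch of a GTM Custom HTML tag (plain JavaScript, valid TypeScript).
// Assumptions: the review widget renders in an element with id "review-widget",
// and "reviews_read" is our own event name.
(function () {
  const widget = document.getElementById("review-widget");
  if (!widget) return;

  let visibleSince: number | null = null;
  let fired = false;

  const observer = new IntersectionObserver(function (entries) {
    for (const entry of entries) {
      if (fired) return;
      if (entry.isIntersecting) {
        visibleSince = Date.now();
        // After 5 seconds of continuous visibility, count the visitor as a reader.
        setTimeout(function () {
          if (visibleSince !== null && Date.now() - visibleSince >= 5000) {
            fired = true;
            (window as any).dataLayer = (window as any).dataLayer || [];
            (window as any).dataLayer.push({ event: "reviews_read" });
            observer.disconnect();
          }
        }, 5000);
      } else {
        visibleSince = null; // widget scrolled out before 5 seconds elapsed
      }
    }
  }, { threshold: 0.5 }); // at least half the widget must be on screen

  observer.observe(widget);
})();
```

You can then create a GTM trigger on the reviews_read dataLayer event and forward it to your analytics tool, so readers and non-readers can be segmented downstream.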

Then, once you have collected enough visitor data, you can either analyze it in a data warehouse like BigQuery (assuming you own your clickstream data), or use the StackTome report that shows this for you:

From the report above, we see that, on average, around 25% of visitors read reviews (spend 5 seconds or more looking at the review widget). Reading reviews doesn't seem to affect the conversion rate; however, the AOV (Average Order Value) of review readers is around 30% higher than that of visitors who haven't read reviews.
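If you go the data warehouse route, here is a minimal sketch of the segment comparison, assuming one row per visitor; the readReviews, converted, and revenue fields are illustrative, not an actual StackTome or BigQuery schema:

```typescript
// Minimal sketch of the readers-vs-non-readers comparison.
// Assumes one row per visitor; field names are illustrative.
interface Visitor {
  readReviews: boolean; // spent 5+ seconds on the review widget
  converted: boolean;   // placed an order during the visit
  revenue: number;      // order value, 0 if no order
}

function segmentMetrics(visitors: Visitor[], readReviews: boolean) {
  const segment = visitors.filter((v) => v.readReviews === readReviews);
  const buyers = segment.filter((v) => v.converted);
  return {
    share: segment.length / visitors.length,      // e.g. ~25% of visitors read reviews
    conversionRate: buyers.length / segment.length,
    aov: buyers.length > 0
      ? buyers.reduce((sum, v) => sum + v.revenue, 0) / buyers.length
      : 0,
  };
}

// Usage: compare the two segments side by side.
// const readers = segmentMetrics(visitors, true);
// const nonReaders = segmentMetrics(visitors, false);
```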
This might be the case because these visitors are more engaged with your brand and are likely to buy more than non-engaged ones. To say that reviews are the main cause of the AOV increase, we would need to run an A/B test and see whether the increase persists.
Summary
Here we have analyzed the review industry's main claims about the impact of reviews on conversion rate. Based on the data, there is likely a positive correlation between more reviews and higher conversion rates, especially on product pages, but the actual impact might not be as drastic as many review platforms would lead you to believe. Also, customers who engage more with your reviews are likely to buy more, even if the conversion rate stays the same. This means showing relevant reviews to customers is one of the tools you can use the next time you want to drive your brand's CRO efforts.
If you want to make similar data-driven decisions for your brand to drive better results with reviews, conversion rate, and more, have a look at the StackTome plans here: https://www.stacktome.com/pricing, which you can try with a 14-day free trial.