April 24, 2019 by Rahul Chadha
Bob’s Red Mill Natural Foods was founded in 1978 by Bob Moore and his wife Charlee, who turned their love of healthy foods and whole grains into a business with the goal of making high-quality natural and organic foods available to as many people as possible.
Today, Bob’s Red Mill is an employee-owned company with globally distributed products and a deeply passionate following—many of the company’s customers rely on it to provide quality gluten-free products due to their sensitivities or allergies.
We spoke with Kevin Irish, Digital Marketing Manager for Bob’s Red Mill, about how the company’s new Customer-Generated Content strategy helped to capture the passion of its most ardent customers for some powerful word-of-mouth marketing.
What was the problem Bob’s Red Mill faced before you revamped your Customer-Generated Content approach?
When I started, Bob’s Red Mill sent me to several food trade shows where thousands of attendees would come to our booth. At every single show a grown person would break down in tears while telling me their story. Bob’s Red Mill had made their life—or the life of someone they loved—manageable for the first time, mostly because they could finally deal with a food allergy.
Then I looked at our website, and none of that emotion was present. None. That was my guiding hypothesis for improving our Customer-Generated Content. If we gave our customers a platform to voice their love for us and our products, we would get lots of good content. And with TurnTo’s platform, I turned out to be right.
How did TurnTo’s products help the company?
Before TurnTo, all we had was a “write a review” button—and a poor user experience for those who clicked on it. There was no review solicitation. I crunched the numbers and found we were getting about 1.3 pieces of Customer-Generated Content per day. After we implemented TurnTo’s Ratings & Reviews and other products, that jumped to more than 115 submissions per day.
Bob’s Red Mill almost didn’t turn on TurnTo’s Checkout Comments at launch, correct?
Yes, that’s true. One of TurnTo’s customer success reps highly recommended that we include Checkout Comments in our implementation, which was the right call. It’s been a huge success for us, and a feature we never would have thought of ourselves.
Basically, Checkout Comments pops up on an order confirmation page and asks, “Why did you choose this?” We’re getting content gold from that, so much so that we rebuilt part of our website around it to include a visual pinboard of products that get responses.
We use a slightly customized API from TurnTo that lets us only show comments we think are worth displaying. It’s just pages and pages of comments raving about our products, all based on that one simple question at checkout.
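A display filter of the kind Kevin describes, showing only the checkout comments deemed worth featuring, might look something like this (a hypothetical sketch; TurnTo's actual API and the merchant's real curation criteria are not described in this interview, so the thresholds and blocklist below are illustrative assumptions):

```python
def displayable_comments(comments, min_length=20, blocked_words=("refund", "broken")):
    """Keep only checkout comments worth featuring on the pinboard.

    min_length and blocked_words are illustrative assumptions standing in
    for whatever editorial criteria the merchant actually applies.
    """
    keep = []
    for c in comments:
        text = c["text"].strip()
        if len(text) < min_length:
            continue  # too short to be interesting
        if any(w in text.lower() for w in blocked_words):
            continue  # complaints belong in support channels, not the pinboard
        keep.append(c)
    return keep

comments = [
    {"sku": "OAT-1", "text": "Love it"},
    {"sku": "OAT-1", "text": "Only gluten-free oats my whole family can eat!"},
    {"sku": "FLR-2", "text": "Last bag arrived broken, hoping this one is fine"},
]
print(len(displayable_comments(comments)))  # 1
```

The point of a filter like this is that the one-question prompt collects everything, and the merchant decides afterward what is "content gold" worth surfacing.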
Recipes are also a huge draw for the Bob’s Red Mill website, right?
Yes, about one-third of our traffic is to our recipe section. About 18 months ago a bug in the recipe section of our website took our entire site down. When we brought the site back online we completely remade the recipe platform. A large part of that was to integrate TurnTo’s reviews and Community Q&A into the recipes.
On our old recipe platform users were leaving reviews for recipes that were actually questions. Things like, “Can I use almond milk instead of cow’s milk?” We weren’t really serving our customers’ needs.
With TurnTo’s Community Q&A, customers didn’t need to leave a question in a review; they could just ask our recipe pros what would work. We soft-launched the feature without telling anybody, and our first organic question came in 30 minutes later. We had 260 questions in the first month.
TurnTo even went back to the old questions that customers left in reviews and paired them with answers. Then they imported that content into the new Community Q&A feature so we wouldn’t lose all of that historical information.
Anything else to add?
I’ve worked with other “top players” in Customer-Generated Content collection in the past and was always really disappointed. They nickel-and-dimed us, lacked support, and weren’t open to any changes to personalize or customize their platforms.
TurnTo met all of those basic needs and then offered us even more features. We love how much of the platform is open via API, and how well documented it is. It’s been a perfect partnership.
To learn more about how TurnTo helped Bob’s Red Mill improve their Customer-Generated Content strategy, see our recent case study with the company. You can also watch a presentation Kevin Irish gave on the benefits of TurnTo at Shop.org 2018.
November 13, 2017 by George Eberstadt
A version of this article was originally published by Total Retail on October 5, 2017.
The Association for Psychological Science recently published an interesting study on consumer shopping behavior, showing that when two comparable products have similar average ratings, shoppers are significantly more likely to choose the product with the larger number of ratings.
This finding won’t surprise e-commerce retailers, but in the psychology world, it’s an illustration of herd mentality leading to irrational decisions. When two comparable items have low ratings, it would be more logical for shoppers to pick the one with FEWER total reviews, as it’s possible that the poor average rating is a fluke — an unrepresentative sample of grumpy reviewers. An item which has a large number of low ratings, on the other hand, is very likely to actually be a dud. Yet even in these cases, where both choices are poorly rated, shoppers prefer the one with the larger number of reviews, because a high review count signals popularity, and people tend to buy what’s perceived as popular.
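The "fluke vs. dud" reasoning above can be made concrete with a simple smoothed-rating calculation, the kind a rational shopper would implicitly perform (a minimal sketch; the prior mean and weight are illustrative assumptions, not figures from the study):

```python
def smoothed_rating(avg, n, prior_mean=4.0, prior_weight=10):
    """Pull an observed average toward a prior expectation.

    With few reviews the pull is strong (a poor average may be a fluke);
    with many reviews the evidence dominates (a poor average means a dud).
    prior_mean and prior_weight are illustrative assumptions.
    """
    return (prior_mean * prior_weight + avg * n) / (prior_weight + n)

# Two items with the same poor 3.0-star average:
few = smoothed_rating(3.0, n=5)     # little evidence: plausibly a fluke
many = smoothed_rating(3.0, n=500)  # lots of evidence: very likely a dud

print(round(few, 2), round(many, 2))  # 3.67 3.02
```

By this logic the rational pick between two poorly rated items is the one whose low average rests on less evidence, which is the opposite of what the study found shoppers actually do.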
The e-commerce implications of this study are clear — retailers need to signal to shoppers that the items they sell are popular. Travel sites do this well by showing an indication of the recency and volume of bookings. Take Hotels.com for example:
And here’s another example from Orbitz:
On retail sites, product review volume is among the most powerful ways to signal popularity. While influencing review volume may feel difficult because it seems there are only so many people that want to write product reviews, there are actually many strategies to increase review collection.
For starters, it’s a mistake to think that the number of product reviews that can be collected has a hard limit based on the willingness of customers to write them. It’s more like drilling for oil — some comes out in a gusher, but there’s a lot more in the ground that you can get out by using clever techniques. To illustrate, we recently had a customer begin sending follow-up “please review your purchase” emails a few days after their first request. The retailer expected that the follow-up would get a much lower response rate than the initial email, believing that most customers motivated to write a review would respond to the first request. Surprisingly, the response rate to the follow-up email was 80 percent of that of the initial email — nearly the same. This showed that many of the customers who didn’t respond to the first email had no aversion to writing a review; they just happened to get the request at the wrong moment.
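The arithmetic behind that follow-up email is worth spelling out (a back-of-the-envelope sketch; the baseline response rate and campaign size are assumed for illustration, and only the 80 percent ratio comes from the example above):

```python
initial_rate = 0.05                  # assumed: 5% of recipients review after email #1
followup_rate = 0.8 * initial_rate   # observed: follow-up performs at 80% of initial

emails_sent = 10_000                 # assumed campaign size
reviews = emails_sent * (initial_rate + followup_rate)

print(int(reviews))  # 900 reviews, vs. 500 from the first email alone
```

Whatever the true baseline rate, a follow-up performing at 80 percent of the initial send adds roughly 80 percent to collection volume for the cost of one extra email.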
Since many customers ignore a request to write a review for reasons of convenience rather than intent, retailers can increase review collection simply by taking the friction out of the collection process. Strategies aimed at motivating review writing (e.g., incentives) can help, too, but they can also have side effects (e.g., reduced trust). Reducing friction is the low-hanging fruit. Technology that enables customers to write and submit reviews from inside an email rather than requiring a clickthrough to a web form can more than double submission rates. A simple change from a button that says “Click to write a review” to a display of five stars with the message, “Start by rating it” can add 50 percent to the response rate. Allowing users to write reviews before requiring authentication, rather than leading with a log-in demand, can double collection rates.
On mobile devices, allowing photos to be submitted without first requiring the user to author a review can multiply visual content collection up to four times. Asking a user who has just submitted a review to review other items they’ve purchased is five times more likely to produce an additional review than the initial email. It’s common for this “Do More” technique to increase total review volume by 50 percent to 100 percent.
The lessons are clear: Increasing review volume can have a major impact on sales by tapping into the popularity effect, and review volume can reliably be increased with the proper tools and techniques.
October 16, 2015 by George Eberstadt
You probably know that sending an email post-purchase to request a product review is critical to getting a healthy volume of reviews. But you may not know that it’s also essential for ensuring that the sentiment of the reviews you collect fairly represents the sentiment of your customer base, overall.
Here’s an example from Jockey.com. After switching to TurnTo for ratings & reviews, there was a period of 6 weeks when they were not sending out review solicitation emails (RSEs); the only reviews they collected were from shoppers who returned to their site, on their own, to submit one. Then Jockey turned on the RSEs. Not surprisingly, the volume of reviews they collected increased by 7X.
But here was the surprise: the average rating also improved – by over half a star, from under 3.8 to over 4.3! That’s a huge improvement, with the critical benefit of accurately signaling to shoppers the high quality of Jockey products.
Why the improvement in average star ratings? It turns out that the people who go through the effort to come back to your site to write a review, without being prompted to do so, are disproportionately the unhappy ones – the ones with a complaint to vent. So if you are only capturing reviews from this group, you are over-representing the negative sentiment in your customer base and under-representing your happy customers. When you reduce the barrier to writing reviews by sending customers an email requesting one, you get a review-writing population that is much more representative of the overall sentiment of your customer base. In the case of Jockey, the before-and-after gain of over a half-star across their full catalog is the kind of improvement you might otherwise have to do a product-line refresh to achieve.
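The half-star swing falls out of a simple weighted average. Using the Jockey figures above (7X total volume after turning on review solicitation emails, unprompted reviews averaging about 3.8, overall average about 4.3), we can back out roughly how positive the email-prompted reviews must have been (a back-of-the-envelope sketch; the 1:6 volume split is implied by the 7X figure, and the star values are the approximate ones quoted above):

```python
unprompted_n, unprompted_avg = 1, 3.8   # pre-RSE baseline (relative volume)
total_n, overall_avg = 7, 4.3           # post-RSE: 7X volume, new overall average

# Solve the weighted average for the prompted reviews' mean rating
prompted_n = total_n - unprompted_n
prompted_avg = (overall_avg * total_n - unprompted_avg * unprompted_n) / prompted_n

print(round(prompted_avg, 2))  # roughly 4.38
```

In other words, the reviews the emails bring in average around 4.4 stars, confirming that the self-selected, unprompted reviewers skew markedly less happy than the customer base as a whole.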
So in case the benefits of a much greater volume of reviews aren’t enough to convince you to send out a review solicitation email, keep in mind that you’ll be more accurately showing the positive sentiment of your customer base, too!
September 8, 2015 by John Swords
… And we mean LOOK different.
- 90% of reviews come in response to emails
- >60% of emails are opened on phones
- Phones are bad for long text (like reviews)
- Phones are great for photos!
The implications are clear:
- Your strategy for collecting customer reviews needs to work on phones
- On phones, the strategy should be “visual first.”
So what is a visual review? It’s a photo (or video) submitted by a customer in response to a request for a review – the proverbial picture that is worth a thousand words. Instead of text stating, “With my new cookware, I was finally able to perfectly brown the crust of my famous chicken pot pie,” it is a photo of that perfect chicken pot pie.
Instead of text stating, “The shirt fit perfectly, with no extra blousing around my waist,” it is a selfie of the customer looking great in her new shirt.
Instead of text stating, “The fabric on the sofa was gorgeous, but the cushions were way too saggy,” it is a photo of the sofa with its gorgeous fabric and saggy cushions.
Far from being yet another “gotta-keep-up-with-changing-platforms chore,” the shift to visual content that the rise of smartphones demands creates a huge opportunity. Simply put, visual content converts better. Few shoppers have the patience to read the full body of customer reviews, and those who read any rarely go past the first couple of entries. So while having lots of reviews is valuable for signaling that an item is popular, most of the text you are collecting has little impact on conversion. On the other hand, shoppers can scan an image gallery in a blink and come away with a powerful, visceral sense of the appeal of a product.
This is not to say that you should abandon collecting text reviews; there is plenty of information in text reviews that images can’t convey. If a customer is on a desktop when they get your request to write a review, you should lead with the request for a standard text review (with an option to attach an image). But when the customer is on a mobile device, don’t try to force a round peg into a square hole by asking for text. Instead, ask the customer to do what comes more naturally on these devices and submit an image.
The applications are broad and go way beyond selfies. Image subjects vary widely, for example:
- Things made with the product (cooking, crafts, do-it-yourself projects)
- The product in use (home furnishings, hobby items)
- Unboxing and explainers (electronics, fashion)
- Travel (hotel rooms, attractions)
- And yes, selfies (fashion, beauty, sporting goods)
Visual reviews are a great complement to imagery you can gather from social media sites, if you’ve taken that approach. But visual reviews also have some important advantages over social media harvesting and may be all the visual content collection you need:
- Images are automatically connected to the relevant SKU (saving a lot of work)
- Usage rights are automatically acquired
- You can collect a lot more images, since there is a big portion of your customer base that is happy to write a review but isn’t going to post your product to their Instagram page.
- The image collection is continuous; there’s no need for special hashtag campaigns
So as we said, the next generation of product reviews is going to look very different.
September 24, 2012 by George Eberstadt
If you know customer reviews, you know that half of the value – maybe more – is in the insights you can extract. So you might think the same is true for Social Q&A, since these are the two main sources of user-generated content on product detail pages. But you’d be mistaken. For Social Q&A, engagement is the key, which means that if your Social Q&A system isn’t delivering massive customer interaction, it’s falling short.
In a recent talk I gave to a gathering of e-commerce execs from major brands and retailers, I asked the audience for a show of hands on this: if they were forced to turn off part of their customer review system, which part would they choose? The options were:
- Turn off the back end. Visitors to their sites and storefronts could see all the reviews, as could search engines, but all the analytics would be gone.
- Turn off the front end. All the analytics would be available, but none of the content would be visible to shoppers or search engines.
The room split exactly in half.
At the Shop.org Summit last week in Denver, the CMO of a fashion brand told me he had just run a rigorous A/B test on their customer reviews. He was new to the brand, and even though they’d had reviews for a while, he didn’t want to just assume it was working. He tested the overall, site-wide effect on conversion (not just whether items with reviews did better than items without, or whether high-scoring items sold better than low-scoring items). His discovery: negative lift! Overall, sales dropped a bit when reviews were turned on. So I asked if he was going to turn reviews off. He said that hadn’t been decided; the insight value they got from reviews was important enough that they would probably keep them after all. (There’s a neat recent story in the Wall Street Journal on how stores are using the insights from customer reviews to steer their businesses.) N.b., fashion brands seem to have a stormier relationship with customer reviews than many other retail segments. Your mileage may vary…
If you have had this sort of experience with customer reviews, you might think that the value equation is about the same for Social Q&A. But it’s not. While Social Q&A can also deliver valuable insights, it is first-and-foremost an engagement tool. You are not going to make up for poor Q&A engagement with analytics.
To put it simply: an unanswered question is a real downer, whereas no one ever knows about the review that was never written. Unanswered questions on your product detail page scream “nobody home”. First, there are the disappointed shoppers who asked questions and never heard back. Then there are the shoppers who come later and see all the unanswered questions stacked up. Sure, you can hide unanswered questions, but that makes it even less likely they get answered, and it doesn’t help the person who asked. You can have your staff answer all the questions, but then you’re probably better off with a live chat approach, and you’re missing out on all the benefits of getting your real customers to interact with your shoppers. In short, if your Social Q&A system doesn’t quickly and reliably get lots of customer answers to shopper questions, you’re probably better off not inviting shoppers to ask. It’s better not to create expectations if you’re not going to be able to fulfill them.
On the other hand, if you get Social Q&A right, the massive customer engagement it generates effectively drives top-line growth. One fashion merchant that uses TurnTo for Social Q&A sees 1100% conversion lift from those who ask questions or read dialog from others. And it’s not an isolated effect – about 25% of their orders come from shoppers who interact with Q&A before purchasing.
Further, there are the SEO benefits; Social Q&A done right produces 2-4 times as much user-generated content (UGC) as customer reviews, which is great for driving organic search traffic. If your Social Q&A system is not delivering enough customer engagement to produce UGC at scale, it’s under-performing.
So the next time someone tells you that engagement isn’t important for Social Q&A – that it’s the analytics that matter, just like for customer reviews – start by asking what sort of customer engagement their Q&A system produces.
August 27, 2012 by George Eberstadt
David Streitfeld of the New York Times has been looking hard at the issue of fake customer reviews. A year ago, he called out freelancers offering to write positive reviews for a few bucks. In January, he wrote up a service called VIP Deals that offered rebates in return for positive reviews. And a couple days ago he published a piece on Todd Rutherford, the founder of gettingbookreviews.com, who sold 4,531 book reviews in 2010 and 2011 at $20-99 each, before backlash from Google and Amazon forced him to shut down the service.
No doubt, there is fakery out there. The question is: how widespread is it? 4,531 seems like a lot of reviews, but it’s a small part of the billions of customer reviews available on the web. The most recent NY Times piece says Bing Liu, a computer scientist at the University of Illinois, Chicago specializing in automated text analysis, “estimates that about one-third of all consumer reviews on the Internet are fake.”
One of the reasons there is so much suspicion of reviews is that so many of them are positive. Liu has estimated that 60% of the reviews on Amazon are 5 stars and another 20% are 4 stars, but, he says, “almost no one wants to write five-star reviews, so many of them have to be created.”
Here’s an alternative explanation: there’s a lot more customer satisfaction out there than you might guess. That’s a conclusion you might come to from reading customer answers to shopper questions gathered through TurnTo. With the TurnTo system, there is almost no possibility for fakery. When a shopper asks a question, the system chooses a group of people who actually bought the item (based on the transaction records of the store) and emails the question to them. While the system allows in-line answers on the product detail page, >90% of the answers come in reply to this question email. So unless there’s a big population of people buying products they don’t need for the purpose of providing artificially positive answers to shopper questions they may never even receive, these answers are legit.
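The routing mechanism described above — matching an incoming shopper question to verified purchasers from the store's transaction records and emailing it to them — can be sketched roughly like this (a hypothetical illustration; the names and structure are assumptions, not TurnTo's actual implementation):

```python
import random

def route_question(question, sku, orders, sample_size=5):
    """Send a shopper's question to a sample of verified buyers of the item.

    Verified purchase is the anti-fakery guarantee: only people the store's
    transaction records show actually bought this SKU can be asked.
    """
    buyers = [o["email"] for o in orders if o["sku"] == sku]
    recipients = random.sample(buyers, min(sample_size, len(buyers)))
    return recipients  # in a real system, each recipient is emailed the question

orders = [
    {"sku": "ESP-100", "email": "a@example.com"},
    {"sku": "ESP-100", "email": "b@example.com"},
    {"sku": "GRN-200", "email": "c@example.com"},
]
picked = route_question("How tall is the drip spout?", "ESP-100", orders)
print(sorted(picked))  # only verified ESP-100 buyers are ever selected
```

Because the candidate pool is built from purchase records rather than open submission, a fake answer would require actually buying the product first — which is why the >90% of answers arriving by email reply can be treated as legitimate.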
And one of the most striking aspects of the answers provided by these real product owners is how effusively positive they often are. For example, here are some customer answers to a shopper question about the height of the drip spout on an espresso machine at SeattleCoffeeGear.com. The question doesn’t ask for any sort of overall evaluation of the machine – it’s just looking for a measurement. Yet many of the respondents (including me) spontaneously volunteered our enthusiasm for the product.
Now I don’t want to be Pollyanna about the problem of fake reviews. I suspect they are much more common on destination review sites like Yelp and TripAdvisor, where anyone can submit, than on e-commerce sites, where the ability to verify purchase is an easy and effective way to police. It’s also harder to believe the uniformly high ratings sometimes found on products which are judged on subjective personal taste, like books and food; personal tastes differ too much. As Streitfeld points out, even The Great Gatsby (which, first published in 1925, is presumably not attracting many fake reviews) has plenty of neutral and negative reviews (>300 reviews are 1, 2, or 3 stars out of 1,400 total at Amazon).
But while the battle goes on between the fakers and those trying to root them out, it’s possible that in many cases when the reviews are positive, customers might just be happy.
July 28, 2010 by George Eberstadt
Amazon has just hooked up with Facebook to add social shopping features powered by the shopper’s Facebook friends list. (NY Times article. WSJ article.) My guess is that this will prove to be the watershed moment for social commerce. Where Amazon leads, others follow. Amazon pioneered customer ratings and reviews, which are now found on commerce sites across the web. Amazon pioneered community cross-sell tools (“customers who looked at this also looked at that”, “customers who bought this also bought that”), which are now provided to online merchants by at least half a dozen vendors. And while Amazon may not have pioneered the integration of 3rd-party social graphs into online stores (we’ve been at this for a few years), the ecommerce world is likely to take its cues from Amazon in this area, too.
Here’s what it looks like on my Amazon profile page:
And this is just their initial feature set; lots more must be just around the corner. Merchants interested in the potential of “on-site social commerce” should check out what Amazon has done here and keep an eye on where they go next.
(For those interested in archeology: before Facebook built the one-social-graph-to-rule-them-all, Amazon had social-graph-building aspirations of their own. They called it “Amazon Friends and Interesting People.” Dig here to learn more.)