Research Brief

To Spot Fake Online Reviews, Target the Reviewers

A test was 93% accurate and more efficient than analyzing the reviews themselves

Anyone who shops online comes to rely on customer reviews to help separate the solid offerings from the duds. Trouble is, too many of those ratings are fake, paid for by the seller. And it isn’t easy to tell which is which. 

Estimates of the share of fake reviews range from 4% to more than 30%. Whatever the number, it can be bad news for e-commerce companies, damaging consumer trust and hurting sales. Amazon and other online sellers try to filter out fakes, and experienced shoppers use a handful of tips for spotting them — lots of five-star reviews in a few days or eerily similar review photos.

Finding a Way to Ferret Out Fake Reviews

A working paper from UCLA Anderson’s Sherry He and Brett Hollenbeck, Rochester Institute of Technology’s Gijs Overgoor and Ali Tosyali, and USC’s Davide Proserpio proposes a solution: Forget the reviews and target the network of products and reviewers most likely to post phony ones.


In a previous study, He, Hollenbeck and Proserpio identified a large and thriving online marketplace where sellers hire people to buy their products and post positive reviews. The researchers singled out more than 20 private Facebook groups used by sellers on Amazon.com, documenting the products that purchased high ratings. (Similar groups exist for Walmart and Wayfair, the authors say.)

The Federal Trade Commission and other regulators are investigating this market, and the firms involved have started to take it more seriously as well. In July, Amazon sued the administrators of more than 10,000 Facebook groups. Earlier in 2022, Facebook removed one of the groups, which had more than 43,000 members.

One feature of these markets is that a relatively small number of sellers and reviewers participate, so products with phony ratings tend to rely heavily on the same reviewers. By zeroing in on this limited product-reviewer network, the new paper suggests, online platforms can identify fake reviews more easily than with methods that analyze the text, images, metadata and other features of the reviews. What’s more, the approach can’t be easily evaded by efforts to make fake reviews seem more realistic.

“While no method can identify fake reviews with 100% accuracy, [the researchers’ method] would allow rating platforms to apply greater scrutiny to these products, add warning flags for customers or otherwise selectively reduce their incentive to manipulate their ratings,” the authors write.

Today, detecting fake reviews relies mainly on machine learning to analyze the reviews themselves — ratings, votes from shoppers who found the reviews helpful, and the language of the text. The problem is that these algorithms have to be trained on reviews that have already been identified as fake. They are also relatively easy to deceive with fake reviews written to be indistinguishable from real ones.
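As a rough illustration of that conventional approach, the sketch below trains a simple text classifier on reviews already labeled genuine or fake. The example reviews, labels and choice of model are assumptions made purely for illustration, not any platform’s actual system.

```python
# Illustrative sketch of the conventional text-based approach: a classifier
# trained on reviews that have already been labeled genuine or fake.
# The reviews, labels and model choice here are hypothetical.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

reviews = [
    "Arrived late and the lid cracked after a week. Disappointed.",
    "Does the job, though battery life is shorter than advertised.",
    "Amazing!!! Best product ever, five stars, buy it now!",
    "Life changing! Perfect! Everyone needs this, 10/10!",
]
labels = [0, 0, 1, 1]  # 0 = presumed genuine, 1 = known fake (requires prior labeling)

model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(reviews, labels)

print(model.predict(["Incredible!!! Best purchase ever, everyone must buy this!"]))
```

The weakness the authors point to is visible here: the model only knows what its labeled examples teach it, so convincingly written fakes slip through.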

Just How Accurate Is the Model at Identifying Fake Reviews?

The authors built on their earlier study of fake-review marketplaces. From the Facebook groups, they gathered data on a random sample of 1,500 products that solicited fake Amazon reviews, collecting ratings, helpful votes and the text and photos in the reviews. They also pulled together similar information on more than 2,700 competing products that don’t show up in the fake-review markets.

They then constructed a network, with products as the nodes and connections between products that shared reviewers. Next, they applied a variety of metrics, including the number of reviewers a product has in common with others in the network and the relative importance of those other products. These metrics showed that buyers of fake reviews are densely clustered in the network.
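As a rough sketch of that construction, the snippet below builds a small product-reviewer network with the open-source networkx library, links products that share reviewers, and computes a couple of simple metrics in the same spirit. The toy purchase data and the specific metrics are assumptions for illustration, not the paper’s exact pipeline.

```python
# Illustrative sketch: build a bipartite product-reviewer graph from
# (product, reviewer) pairs, project it onto products so that two products
# are connected when they share reviewers, then compute simple metrics.
# The data and metric choices are hypothetical.
import networkx as nx
from networkx.algorithms import bipartite

purchases = [  # (product, reviewer) pairs -- hypothetical
    ("blender", "rev1"), ("blender", "rev2"), ("blender", "rev3"),
    ("lamp", "rev1"), ("lamp", "rev2"), ("lamp", "rev3"),
    ("kettle", "rev2"), ("kettle", "rev3"),
    ("toaster", "rev9"),  # shares no reviewers, so it ends up isolated
]

B = nx.Graph()
products = {p for p, _ in purchases}
reviewers = {r for _, r in purchases}
B.add_nodes_from(products, bipartite=0)
B.add_nodes_from(reviewers, bipartite=1)
B.add_edges_from(purchases)

# Project onto products; each edge weight counts the reviewers two products share.
G = bipartite.weighted_projected_graph(B, products)

# Metrics in the spirit of the paper: how many reviewers a product shares with
# its neighbors, and how tightly clustered it is with them.
shared_reviewers = {p: sum(d["weight"] for _, _, d in G.edges(p, data=True)) for p in G}
clustering = nx.clustering(G, weight="weight")

print(shared_reviewers)  # blender, lamp and kettle form a dense cluster
print(clustering)
```

Even in this toy version, the products buying from the same small pool of reviewers stand out as a tight cluster, while the unconnected product sits alone.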

To test the metrics’ ability to spot fake reviews in the general population of Amazon products, the researchers hand-collected data on products and related reviews and reviewers. Their model identified fake reviews with 93% accuracy.
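A hypothetical version of that test might feed network features like those above into an off-the-shelf classifier and score it out of sample. The synthetic features, labels and choice of model below are assumptions for illustration; they are not the paper’s data or exact specification.

```python
# Illustrative sketch: classify products as likely fake-review buyers from
# network features (e.g., shared-reviewer count, clustering, centrality)
# rather than from review text. Features and labels here are synthetic.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Hypothetical feature vectors: [shared reviewers, weighted clustering, centrality]
buyers = rng.normal(loc=[8.0, 0.60, 0.05], scale=[2.0, 0.15, 0.02], size=(200, 3))
others = rng.normal(loc=[1.5, 0.10, 0.01], scale=[1.0, 0.08, 0.01], size=(200, 3))

X = np.vstack([buyers, others])
y = np.array([1] * 200 + [0] * 200)  # 1 = hand-verified fake-review buyer

clf = RandomForestClassifier(n_estimators=200, random_state=0)
print(cross_val_score(clf, X, y, cv=5).mean())  # out-of-sample accuracy
```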

They also looked at the entire home and kitchen category — a department in the Amazon store with 65,000 products, 11 million reviews and 6.1 million reviewers. That wider scan flagged about 1,500 products as likely buyers of fake reviews, based on their close connections and clustering in the network. Using their model, the researchers then confirmed that a large share of these products were likely fake-review buyers.
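One plausible way to run such a category-wide scan, sketched below, is to apply an off-the-shelf community-detection routine to the product network and flag groups whose members share unusually many reviewers. The routine and the threshold are assumptions for illustration, not the paper’s actual procedure.

```python
# Illustrative sketch: find tightly knit communities in a product network
# (built as in the earlier snippet) and flag those whose internal edges
# share unusually many reviewers. Method and threshold are assumptions.
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

def flag_suspicious_clusters(G, min_size=3, min_avg_shared=2.0):
    """Return product communities whose members share many reviewers."""
    flagged = []
    for community in greedy_modularity_communities(G, weight="weight"):
        if len(community) < min_size:
            continue
        sub = G.subgraph(community)
        if sub.number_of_edges() == 0:
            continue
        avg_shared = sum(d["weight"] for _, _, d in sub.edges(data=True)) / sub.number_of_edges()
        if avg_shared >= min_avg_shared:
            flagged.append(set(community))
    return flagged

# Usage with the toy projection G from the earlier snippet:
# print(flag_suspicious_clusters(G))  # e.g., [{'blender', 'lamp', 'kettle'}]
```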

The test suggests that Amazon could run a similar analysis to see what products cluster in the same way, Hollenbeck says in an email. “If so, it would be reasonable to suspect that cluster contains fake review buyers.”

About the Research

He, S., Hollenbeck, B., Overgoor, G., Proserpio, D., & Tosyali, A. (2022). Detecting Fake Review Buyers Using Network Structure: Direct Evidence from Amazon. http://dx.doi.org/10.2139/ssrn.4147920
