

Half of the characteristics we identified focused on the reviewers; the remaining six targeted the language and content of the reviews themselves. For example, fraudulent reviewers were likely to post uniformly extreme reviews designed to artificially enhance or damage a company’s reputation, while honest reviewers adopted a more balanced approach across their posts on different products and services. Phoney reviewers were also more likely to use short-lived accounts and engage in flurries of activity before switching to a new identity, whereas genuine customers posted fewer reviews over a longer period. Other tell-tale signs that helped to expose fraudulent reviews included the following:

Length: Five-star reviews are likely to be longer than one-star reviews, which in turn tend to be longer than two- or three-star reviews. However, fake reviews are likely to be shorter and less detailed. Batches of phoney reviews are also likely to contain a similar number of words.

Sentiment: Some companies hire professionals to write regular reviews. Their posts are often predictably positive or negative and out of keeping with the overall ratings for that particular product.

Language: Fraudulent posts are more likely to repeat particular words or patterns of phrases. They also tend to contain more misspellings, grammatical errors, verbs, and filler words.
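To make those signals concrete, the sketch below shows how characteristics of this kind could be turned into numeric features for a classifier. It is purely illustrative: the column names (text, rating, product_id, account_age_days, reviews_by_account) are assumptions for the example, not the schema used in the study.

```python
import pandas as pd

def engineer_review_features(df: pd.DataFrame) -> pd.DataFrame:
    """Derive simple fraud signals from a table of reviews (illustrative only)."""
    feats = pd.DataFrame(index=df.index)
    # Length: fake reviews tend to be shorter and less detailed
    feats["word_count"] = df["text"].str.split().str.len()
    # Extremity: uniformly one- or five-star ratings are a warning sign
    feats["extreme_rating"] = df["rating"].isin([1, 5]).astype(int)
    # Sentiment: how far the rating sits from the product's average rating
    feats["rating_deviation"] = (
        df["rating"] - df.groupby("product_id")["rating"].transform("mean")
    ).abs()
    # Reviewer behaviour: short-lived accounts posting in bursts
    feats["reviews_per_day"] = (
        df["reviews_by_account"] / df["account_age_days"].clip(lower=1)
    )
    # Language: repeated words as a crude proxy for templated phrasing
    feats["repeat_word_ratio"] = df["text"].str.lower().str.split().map(
        lambda words: 1 - len(set(words)) / max(len(words), 1)
    )
    return feats
```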

Using these characteristics, we trained our algorithm, M-SMOTE (modified synthetic minority over-sampling technique), to detect fake reviews more efficiently and accurately. We repeated this approach on three more datasets: two from Amazon and one from Yelp. Each time, our model outperformed algorithms that did not use our feature engineering approach.
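Fake reviews are typically the minority class in any labelled dataset, which is what the over-sampling step addresses. The sketch below uses standard SMOTE from the imbalanced-learn library as a stand-in for the modified M-SMOTE described here, paired with an arbitrary random forest classifier; it is a minimal illustration of the oversample-then-classify pipeline under those assumptions, not the study's implementation.

```python
from imblearn.over_sampling import SMOTE
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import f1_score
from sklearn.model_selection import train_test_split

def train_fake_review_detector(X, y):
    """Oversample the minority (fake) class, then fit and score a classifier."""
    # Hold out a test split before oversampling so evaluation stays honest
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.2, stratify=y, random_state=0
    )
    # Standard SMOTE here; the study used a modified variant (M-SMOTE)
    X_balanced, y_balanced = SMOTE(random_state=0).fit_resample(X_train, y_train)
    clf = RandomForestClassifier(n_estimators=200, random_state=0)
    clf.fit(X_balanced, y_balanced)
    print("F1 on held-out reviews:", f1_score(y_test, clf.predict(X_test)))
    return clf
```

Feeding this function the engineered features from the earlier sketch, together with labels marking which reviews are known to be fake, reproduces the general shape of the workflow described here.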

Further research is required to ensure our results translate to other major online retailers, such as eBay, Tripadvisor, Walmart, and Alibaba. The characteristics we identified may also need to evolve if fraudulent reviewers change their behaviour to avoid detection. Nonetheless, our findings are significant, especially when you consider how much faith many online shoppers place in reviews. One survey found that 91 per cent of customers were more likely to use a firm after reading a positive review, while 82 per cent would avoid businesses based on negative feedback. Crucially, three quarters said they would trust online reviews just as much as a personal recommendation from a friend, family member, or colleague.


Our model offers the tantalising prospect of algorithms that are more accurate and effective at weeding out phoney feedback, leaving behind genuine reviews that deserve to be trusted. And the benefits need not stop there. Social media platforms have become key conduits for conspiracy theories and fraudulent information, much of it disseminated and shared by bots. Our novel machine learning model could help to redesign the tools used to detect the automated accounts spreading fake news and reviews, improving revenue-generating opportunities and the customer experience for digital platforms and businesses alike. In the spirit of “what we have here is a failure to communicate”, our findings suggest that online e-commerce platforms should encourage social interactions between customers and reviewers. Enabling such communication generates rich evidence of reviewers’ behaviour that can enhance the detection of fake opinions. In an age of post-truth, where misinformation proliferates across the internet at speed, what price do we place on posts we can trust?

Learn more about Digital Innovation and Entrepreneurship at Warwick Business School
