
Could AI Be the Future of Fake News and Product Reviews?

When Hillary Clinton’s new book What Happened debuted on Amazon’s Web site last month, the response was incredible. So incredible that, of the 1,600 reviews posted on the book’s Amazon page in just a few hours, the company soon deleted 900 it suspected of being bogus: written by people who said they loved or hated the book but had neither purchased nor likely even read it.

Fake product reviews—prompted by payola or more nefarious motives—are nothing new, but they are set to become a bigger problem as tricksters find new ways of automating online misinformation campaigns launched to sway public opinion. Amazon has deleted nearly 1,200 reviews of What Happened since it debuted on September 12, according to ReviewMeta, a watchdog site that analyzes consumer feedback for products sold on Amazon.com. ReviewMeta gained some notoriety last year when, after evaluating seven million appraisals across Amazon, it called out the online retailer for allowing “incentivized” reviews by people paid to write five-star product endorsements. Amazon later revised its Community Guidelines to ban incentivized reviews.

How can AI help with fake reviews?

Amazon’s deletions of so many appraisals for Clinton’s book caught ReviewMeta’s attention. The site gathers publicly available data on Amazon, including the number of stars a product receives, whether the writer is a verified buyer of the product and how active that person is on the site. Tommy Noonan, a programmer who founded ReviewMeta in May 2016, refrains from calling these reviews “fake,” given how politically loaded that term has become in the past year. Noonan prefers the term “unnatural.” 

“There is no way to say with 100 percent certainty that a particular review is fake,” he explains. Fortunately, only a handful of items sold on Amazon’s site have had review integrity problems comparable with What Happened. And those items were “mostly Clinton books—although there was also a problem with a [Donald] Trump Christmas ornament” that received an unusually large number of negative critiques, Noonan adds.
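The signals ReviewMeta draws on—star rating, verified-purchase status and reviewer activity—lend themselves to a simple scoring heuristic. The sketch below is purely illustrative and is not ReviewMeta’s actual method; the field names and weights are assumptions chosen to show how such public signals might be combined into an “unnaturalness” score:

```python
# Illustrative sketch only: scoring reviews as "unnatural" from the public
# signals mentioned in the article. Weights and thresholds are invented.
from dataclasses import dataclass

@dataclass
class Review:
    stars: int                   # 1-5 star rating
    verified_purchase: bool      # did the reviewer actually buy the item?
    reviewer_review_count: int   # how active the account is on the site

def unnaturalness_score(r: Review) -> float:
    """Higher score means more suspicious; weights are illustrative guesses."""
    score = 0.0
    if not r.verified_purchase:
        score += 0.5             # reviewer never bought the item
    if r.reviewer_review_count <= 1:
        score += 0.3             # throwaway or single-use account
    if r.stars in (1, 5):
        score += 0.2             # extreme ratings dominate coordinated campaigns
    return score

reviews = [
    Review(stars=5, verified_purchase=True, reviewer_review_count=40),
    Review(stars=1, verified_purchase=False, reviewer_review_count=1),
]
flagged = [r for r in reviews if unnaturalness_score(r) >= 0.7]
```

As the article notes, no single signal proves a review is fake; a heuristic like this can only surface candidates for closer inspection.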

Not an easy task

It is not as easy as it might sound to churn out enough deceptive reviews to influence a product or service’s reputation on Amazon, Yelp or any other commerce site that relies heavily on consumer appraisals. Unlike fake news stories that someone writes and then tries to spread virally through social media, artificial reviews work only if they are manufactured in volume and posted to sites where a particular item is sold or advertised. They also need to be reasonably believable—although proper spelling and punctuation seem to be optional. […]
