TripAdvisor, the travel platform trusted by millions of travelers around the world for its user reviews, today published its first Review Transparency Report, which reveals previously unshared details about its review moderation processes, as well as statistical data on the volume of fake review attempts targeting the platform in 2018.
The report, based on a full year of data on reviews submitted by the global travel community, reveals the overall volume of reviews submitted to TripAdvisor in 2018. It details the multi-layered approach TripAdvisor takes to ensure that reviews published on its pages meet the site's guidelines, and provides a detailed breakdown of the proportion of reviews that were blocked or removed.
"Ensuring that TripAdvisor is a trusted platform for our users and the businesses listed on our site is a top priority. We have continued to advance our industry-leading fraud detection efforts in recent years, but it is a daily battle and we are far from complacent," said Becky Foley, senior director of trust and safety at TripAdvisor. "While we are winning the fight against fake reviews on TripAdvisor, we can only protect our corner of the Internet. As long as other review platforms fail to take aggressive action, scammers will continue to exploit and extort small businesses for cash. It is time for other platforms such as Google and Facebook to step up to the plate and join us in tackling this problem head on."
Key Results of the Report
The global travel community submitted 66 million reviews to TripAdvisor in 2018. All of them were analyzed using advanced fraud detection technology, and 2.7 million were subjected to additional human assessment by content moderators.
4.7% of all review submissions were rejected or removed, either by TripAdvisor's advanced analysis technology or manually by the content moderation team. There are several reasons why TripAdvisor rejects or removes reviews, ranging from guideline violations to cases of review fraud.
Only a small fraction of all review submissions (2.1%) were determined to be fraudulent, and the vast majority of those (73%) were blocked before they were ever published.
That amounted to more than 1 million fake reviews that never appeared on TripAdvisor.
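As a rough consistency check (assuming, as the report implies, that both percentages apply to the full pool of 66 million submissions), the figures line up: 66,000,000 x 2.1% is roughly 1.4 million fraudulent submissions, and 1.4 million x 73% is roughly 1 million fake reviews blocked before publication.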
Less than 1% of reviews were flagged by users or companies for potentially violating TripAdvisor’s guidelines. The TripAdvisor content moderation team reviewed most of these community reports within six hours of submission.
34,643 businesses were subject to a ranking penalty, which is a drop in a property's position within the popularity or traveler rankings. Ranking penalties are applied when a business is caught attempting to post fake reviews.
The report also described TripAdvisor's efforts to catch paid review sellers: individuals or companies that try to sell "user" reviews to businesses listed on the site. Since 2015, TripAdvisor has stopped the activity of more than 75 websites caught trying to sell reviews, including one seller who was sentenced to nine months in prison by the Criminal Court of Lecce, Italy, last year.
"Consumer reviews have become essential for millions of tourism businesses around the world. That is progress, provided that, as recently recommended by the World Committee on Tourism Ethics, these reviews are reliable and unbiased," said Pascal Lamy, Chairman of the World Committee on Tourism Ethics at the UNWTO (World Tourism Organization). "It is therefore encouraging to see platforms like TripAdvisor committed to transparency in the way they monitor, control and manage the reviews they receive, in order to prevent abuse and fraud."