Case Studies for Online Communities
With a daily audience of millions of users, the platform publishes about 150K reviews every day.
Challenge
The platform's popularity attracts all kinds of abuse: spam, fraud, hate speech, vandalism, and more. This puts the platform's moderation system under constant stress.
On the one hand, offensive content drives users away to competing platforms and creates legal risks for the company. On the other hand, overly aggressive rejections make the platform less informative and discourage users from visiting. Spending time writing a legitimate review only to have it rejected is a frustrating user experience.
With all of that in mind, the platform felt it was essential to build a consistent and transparent moderation process, while at the same time making it difficult for malicious users to trick the system.
Solutions used
Moderating images
Deduplication of offers
Validating categories
Making sure customer reviews are friendly and nontoxic
Enrichment of product parameters
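To illustrate the review-toxicity solution above, here is a minimal sketch of an automated moderation gate. The blocklist scorer and threshold are purely illustrative placeholders, not the platform's actual model or rules; in practice the score would come from a trained classifier or human annotators.

```python
# Toy moderation gate: reject a review if its toxicity score
# crosses a threshold. BLOCKLIST and TOXICITY_THRESHOLD are
# hypothetical values for illustration only.

BLOCKLIST = {"scam", "idiot", "fraudulent"}
TOXICITY_THRESHOLD = 0.5

def toxicity_score(text: str) -> float:
    """Toy scorer: fraction of words that appear in the blocklist."""
    words = text.lower().split()
    if not words:
        return 0.0
    hits = sum(1 for w in words if w.strip(".,!?") in BLOCKLIST)
    return hits / len(words)

def moderate_review(text: str) -> str:
    """Return 'publish' or 'reject' based on the toy score."""
    if toxicity_score(text) >= TOXICITY_THRESHOLD:
        return "reject"
    return "publish"
```

A real pipeline would combine such a verdict with image checks, deduplication, and category validation before a review goes live.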
Results
The system scales automatically and handles spam attacks.
More than 30,000 inappropriate reviews are blocked from publication daily.
The cost of moderation per review dropped by 50%.
The number of complaints to support dropped by 35%.
Average moderation speed increased by 70%.
Moderation accuracy stays consistently above 97%.
Thanks to fast and highly accurate moderation, the service maintained a healthy community and provided a safe platform for discussing a variety of social issues. At the same time, it was able to operate in more than 200 cities across 3 culturally diverse countries without offending users or letting anyone misuse the platform.