Case studies
for Online Communities
With an audience of millions of daily users, the platform publishes about 150K reviews every day
Client numbers
3.5M
Monthly active users (MAU)
30M
Visits per month
Challenge
The platform's popularity attracts all kinds of abuse: spam, fraud, hate speech, vandalism, and more. This puts the platform's moderation system under constant stress.
On the one hand, offensive content repels users (leading them to choose other platforms) and creates legal risks for the company. On the other hand, frequent bans make the platform less informative and discourage users from visiting. Spending your time writing a valid review and then having it rejected is obviously a disappointing user experience.
With all of that in mind, the platform felt it was essential to build a consistent and transparent moderation process, while at the same time making it difficult for malicious users to trick the system.
Solutions used
Detection of forbidden products and services
Identify goods and services that violate the law or your platform's rules.
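To illustrate the idea behind this kind of detection, here is a minimal sketch of a rule-based check for forbidden listings. The category names and keyword lists are illustrative assumptions, not the platform's actual rules; a production system would combine rules like these with machine-learning classifiers.

```python
# A minimal sketch of rule-based forbidden-listing detection.
# The categories and keywords below are hypothetical examples,
# not the platform's real rule set.
FORBIDDEN_KEYWORDS = {
    "weapons": {"firearm", "handgun", "ammunition"},
    "drugs": {"narcotic", "opioid"},
}

def detect_forbidden(text: str) -> list[str]:
    """Return the rule categories whose keywords appear in the text."""
    words = set(text.lower().split())
    return sorted(cat for cat, kws in FORBIDDEN_KEYWORDS.items()
                  if words & kws)

# Example: a review offering a weapon trips the "weapons" rule,
# while an ordinary review matches nothing.
print(detect_forbidden("Selling a used handgun with ammunition"))
print(detect_forbidden("Great coffee shop, friendly staff"))
```

Keyword rules alone are easy for malicious users to evade (misspellings, synonyms), which is why the description above pairs rule checks with a broader moderation pipeline.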