What is Azure Content Moderator?
In this article, we will cover:
- Where is it used?
- What does it include?
- Moderation API
- Review API
- Review Tool
- Data Privacy
Azure Content Moderator is a service that flags text, image, or video content that may be offensive or risky. Your application can then handle the flagged content in order to comply with regulations or to maintain the intended environment for its users.
Where is it used?
Azure Content Moderator is used in a variety of scenarios, including:
- K-12 education solutions
- Enterprise media companies
- Online marketplaces
- Gaming companies
- Social messaging platforms
What does it include?
Azure Content Moderator service consists of several web service APIs available through both REST calls and a .NET SDK.
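As a sketch of what a REST call to the service looks like, the snippet below assembles (but does not send) a request to the text-screening operation. The endpoint host and subscription key are placeholders, and the query parameters follow the service's documented `ProcessText/Screen` operation; verify them against the current API reference before use.

```python
from urllib.parse import urlencode

# Placeholder values -- substitute your own resource endpoint and key.
ENDPOINT = "https://my-moderator.cognitiveservices.azure.com"
SUBSCRIPTION_KEY = "<your-subscription-key>"

def build_text_screen_request(text: str, language: str = "eng") -> dict:
    """Assemble the pieces of a ProcessText/Screen REST call.

    Returns a dict describing the request; actually sending it
    (for example with the `requests` library) is left to the caller.
    """
    params = urlencode({
        "language": language,
        "classify": "True",      # run the text classification models
        "autocorrect": "True",   # autocorrect the text before screening
        "PII": "True",           # detect personally identifiable information
    })
    return {
        "method": "POST",
        "url": f"{ENDPOINT}/contentmoderator/moderate/v1.0/ProcessText/Screen?{params}",
        "headers": {
            "Content-Type": "text/plain",
            "Ocp-Apim-Subscription-Key": SUBSCRIPTION_KEY,
        },
        "body": text,
    }

request = build_text_screen_request("Is this a crass sentence?")
print(request["url"])
```

The subscription key travels in the `Ocp-Apim-Subscription-Key` header, which is the standard authentication scheme for Azure Cognitive Services REST calls.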
Moderation API
The Azure Content Moderator service includes the Moderation API, which checks content for material that is potentially inappropriate or objectionable.
| Feature | Description |
| --- | --- |
| Image moderation | Scans images for adult or racy content, detects text in images with OCR, and detects faces. |
| Custom image lists | Scans images against a custom list of images. |
| Video moderation | Scans videos for adult or racy content and returns time markers for that content. |
| Custom term lists | Scans text against a custom list of terms along with the built-in terms. |
| Text moderation | Scans text for offensive content, sexually explicit or suggestive content, and profanity. |
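To make the text-moderation output concrete, the snippet below parses a trimmed response in the JSON shape the text-screening operation returns and decides whether to flag the content. The scores and matched term are illustrative sample data, not real service output, and the 0.5 threshold is an arbitrary choice for the sketch.

```python
import json

# A trimmed example of the JSON shape the text-screening operation
# returns (values are illustrative, not real output).
SAMPLE_RESPONSE = """{
  "OriginalText": "This is crap!",
  "Classification": {
    "Category1": {"Score": 0.92},
    "Category2": {"Score": 0.11},
    "Category3": {"Score": 0.98},
    "ReviewRecommended": true
  },
  "Terms": [
    {"Index": 8, "OriginalIndex": 8, "ListId": 0, "Term": "crap"}
  ]
}"""

def should_flag(response_json: str, threshold: float = 0.5) -> bool:
    """Flag content if the service recommends human review, any
    category score exceeds the threshold, or a term was matched."""
    data = json.loads(response_json)
    classification = data.get("Classification", {})
    if classification.get("ReviewRecommended"):
        return True
    scores = (
        classification.get(cat, {}).get("Score", 0.0)
        for cat in ("Category1", "Category2", "Category3")
    )
    if any(score > threshold for score in scores):
        return True
    return bool(data.get("Terms"))

print(should_flag(SAMPLE_RESPONSE))  # True
```

Note that the service only raises flags; the decision about what to do with flagged content (block it, queue it for human review, and so on) remains with your application.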
Review Tool
Azure Content Moderator also includes the Review tool, which hosts reviews for human moderators to process. The Review tool also provides a user-friendly front end for several Content Moderator resources.
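Reviews in the Review tool are typically created programmatically through the Review API. The sketch below builds a review-creation payload; the field names follow the Content Moderator documentation, but the team name, callback URL, and content values are placeholders, so verify the body schema against the current Review API reference before use.

```python
import json

# Placeholder team name -- created when you set up the Review tool.
TEAM_NAME = "myreviewteam"

def build_create_review_body(content_url: str, content_id: str) -> str:
    """Sketch of a review-creation payload for the Review API.

    The returned JSON array would be POSTed to the review-creation
    endpoint for the given team.
    """
    review_items = [{
        "Type": "Image",            # or "Text"
        "Content": content_url,     # URL of the content to review
        "ContentId": content_id,    # your own identifier for the item
        # Hypothetical callback URL invoked when the review completes.
        "CallbackEndpoint": "https://example.com/moderation-callback",
        "Metadata": [
            {"Key": "sc", "Value": "true"}  # arbitrary key/value tags
        ],
    }]
    return json.dumps(review_items)

body = build_create_review_body("https://example.com/img.jpg", "img-001")
print(body)
```

When a human moderator completes a review in the tool, the service can notify your application via the callback endpoint, closing the loop between automated flagging and human judgment.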
Data privacy and security
As with all of the Azure Cognitive Services, developers using the Content Moderator service should be aware of Microsoft's policies on customer data.