Analyze Toxicity with Perspective API

Author: Jigsaw (https://jigsaw.google.com)

Description: We’ve partnered with the Jigsaw team to build the Analyze Toxicity extension, which uses machine learning to classify the level of toxicity, threat, and profanity of user comments in Cloud Firestore. The extension is powered by Perspective API (https://perspectiveapi.com), which is trusted by platforms like the New York Times and Reddit to promote healthy dialogue online.
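For context, here is a minimal sketch of the kind of request the extension makes under the hood: a direct call to Perspective API's public `comments:analyze` endpoint, requesting the three attributes this extension surfaces. The endpoint, request body, and response shape follow the public Perspective API documentation; the `analyzeComment` helper and the API-key handling are illustrative, not part of the extension's code.

```ts
// Sketch: calling Perspective API's comments:analyze endpoint directly.
// Requires Node 18+ (global fetch) and a Perspective API key.
const PERSPECTIVE_URL =
  "https://commentanalyzer.googleapis.com/v1alpha1/comments:analyze";

interface AttributeScores {
  [attribute: string]: { summaryScore: { value: number } };
}

async function analyzeComment(
  text: string,
  apiKey: string
): Promise<AttributeScores> {
  const response = await fetch(`${PERSPECTIVE_URL}?key=${apiKey}`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      comment: { text },
      // The three attributes the extension classifies.
      requestedAttributes: { TOXICITY: {}, THREAT: {}, PROFANITY: {} },
    }),
  });
  const data = (await response.json()) as { attributeScores: AttributeScores };
  return data.attributeScores;
}

// Each summary score is a probability in [0, 1]; higher means more likely
// that the comment exhibits the attribute.
analyzeComment("hello world", process.env.PERSPECTIVE_API_KEY!).then(
  (scores) => console.log(scores.TOXICITY.summaryScore.value)
);
```

The installed extension performs this analysis automatically for documents written to a configured Firestore collection, so you would not normally call the API yourself.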


🧩 Install this extension

To install this extension, visit the repository conversationai/firestore-perspective-toxicity.