🤝 Using large language models to seamlessly help content moderators make better decisions, faster.
Updated Mar 29, 2023 - TypeScript
Social media application that uses Google's Perspective API to filter out toxic posts. Connect with friends around the world on OctoVerse: share your thoughts as a post or a message, and follow your friends to see what they are up to.
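A minimal sketch of the kind of toxicity check such an app might run. The request shape follows Google's Perspective (Comment Analyzer) API; the threshold value and the `shouldHide` helper are illustrative assumptions, not part of any listed project, and `API_KEY` is a placeholder.

```typescript
// Sketch: build a Perspective API (Comment Analyzer) request body for a
// TOXICITY score, and apply a simple hide/show threshold to the result.
// The endpoint would be:
//   https://commentanalyzer.googleapis.com/v1alpha1/comments:analyze?key=API_KEY
// API_KEY and the 0.8 threshold are placeholders/assumptions.

interface AnalyzeRequest {
  comment: { text: string };
  languages: string[];
  requestedAttributes: { TOXICITY: Record<string, never> };
}

function buildToxicityRequest(text: string): AnalyzeRequest {
  return {
    comment: { text },
    languages: ["en"],
    requestedAttributes: { TOXICITY: {} },
  };
}

// Hide a post when its returned TOXICITY score exceeds the chosen threshold.
function shouldHide(toxicityScore: number, threshold = 0.8): boolean {
  return toxicityScore > threshold;
}

const req = buildToxicityRequest("you are wonderful");
console.log(JSON.stringify(req.requestedAttributes)); // {"TOXICITY":{}}
```

In a real client the request body would be POSTed to the endpoint above and the score read from the `attributeScores.TOXICITY.summaryScore.value` field of the response.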
A full-stack web application for creating and sharing short-form video content, inspired by TikTok. Features include user authentication, video upload, feed recommendations, and content moderation.
JS Client for Safelyx's API. 🛡️🛜
CLI Client for Safelyx's API. 💻🛜
Content Moderation using Reality.Eth with Kleros arbitration
AI Content Moderation Tool to detect and flag NSFW images and text.
🛡️ Profanity detection for Sinhala, English & Singlish with real-time processing, OCR support, and adjustable confidence thresholds. FastAPI + Next.js + scikit-learn.
JavaScript/TypeScript SDK for RAIL Score
Open-source, modern AI content moderation
Skywatch Automod is the public release of the automoderation software used by skywatch.blue on the Bluesky network.
Humane is a real-time, privacy-first social platform built with React and TypeScript, featuring chat, video calls, and AI-powered content moderation.
Open-source ML-powered profanity filter with TensorFlow.js toxicity detection, leetspeak & Unicode obfuscation resistance. 21M+ ops/sec, 23 languages, React hooks, LRU caching. npm & PyPI.
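A rough sketch of the obfuscation-resistance technique this blurb describes (leetspeak substitution plus Unicode normalization). The substitution table and word list below are illustrative assumptions, not the library's actual data.

```typescript
// Sketch: normalize leetspeak and common Unicode obfuscations before
// matching against a blocklist. LEET_MAP and BLOCKLIST are illustrative.

const LEET_MAP: Record<string, string> = {
  "4": "a", "@": "a", "3": "e", "1": "i", "!": "i",
  "0": "o", "5": "s", "$": "s", "7": "t",
};

function normalize(text: string): string {
  return text
    .normalize("NFKD")               // fold compatibility forms (e.g. fullwidth chars)
    .replace(/[\u0300-\u036f]/g, "") // strip combining diacritics
    .toLowerCase()
    .split("")
    .map((ch) => LEET_MAP[ch] ?? ch)
    .join("");
}

const BLOCKLIST = ["idiot"]; // illustrative only

function isProfane(text: string): boolean {
  const norm = normalize(text);
  return BLOCKLIST.some((w) => norm.includes(w));
}

console.log(isProfane("1d10t")); // true — "1d10t" normalizes to "idiot"
```

A production filter would pair this kind of normalization with an ML classifier (as the blurb notes, a TensorFlow.js toxicity model) rather than relying on a blocklist alone, and would cache normalized results (e.g. in an LRU cache) for throughput.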
Event-driven microservices backend for Humane, a behavior-rewarding social platform. Implements CQRS with Kafka for eventual consistency, Elasticsearch for global search, and polyglot persistence, with ML-powered moderation pipelines for content safety.
Official Node.js/TypeScript SDK for SafeNest - AI-powered child safety API for detecting bullying, grooming, and unsafe content
Official React Native SDK for SafeNest - AI-powered child safety API for detecting bullying, grooming, and unsafe content