56 results for “topic:nsfw-classifier”
A collection of scripts to aggregate image data for training an NSFW image classifier
A free, open-source, privacy-focused browser extension to block "not safe for work" content, built using TypeScript and TensorFlow.js.
Keras implementation of the Yahoo Open-NSFW model
A .NET image and video classifier, written in C#, used to identify explicit/pornographic content.
NudeNet: NSFW Object Detection for TFJS and NodeJS
✅ CODAR is a framework built using PyTorch to analyze posts (text + media) and predict cyberbullying and offensive content.
A lightweight NSFW detection solution supporting private deployment and HTTP API calls.
An anti-spam/NSFW Telegram bot written in Python with Pyrogram.
An NSFW Image Classification REST API for effortless Content Moderation built with Node.js, Tensorflow, and Parse Server
A REST API written in Python to classify NSFW images.
[Android] NSFW(Nude Content) Detector using Firebase AutoML and TensorFlow Lite
A JavaScript image classifier, written in TypeScript, used to identify explicit/pornographic content.
This repo contains a deep learning implementation for identifying NSFW images.
SafeVision is a professional Python script designed to detect and blur nudity in both videos and images.
This repository is dedicated to building a classifier to detect NSFW images and videos.
An NSFW Image Classifier including an Automation Engine for fast deletion & moderation built with Node.js, TensorFlow, and Parse Server
Group Guardian is a Telegram bot for admins to maintain a safe community.
A tool for detecting viruses and NSFW material in WARC files
A sample for running an NSFW image detection model (open_nsfw_android) on Colaboratory.
Many common vision tasks solved using on-device machine learning.
A simple drop-in API to determine whether an image is NSFW using TensorFlow.
Containerized self-hosted REST API for vision classification, utilizing Hugging Face transformers.
This project is a TypeScript application that uses the OpenAI Moderation API for image classification. It's designed to detect and handle NSFW (Not Safe For Work) images uploaded to an S3 bucket.
Python package to apply the Safety Checker from Stable Diffusion.
A fully modular NSFW text classification pipeline built with Python and scikit-learn. Designed to scrape, clean, label, train, and evaluate NSFW/SFW text datasets using TF-IDF and Logistic Regression. Built as a learning project with real-world structure and scalability in mind.
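The TF-IDF + Logistic Regression stages named in the entry above can be sketched with scikit-learn. This is a minimal illustration, not the project's actual code: the toy corpus and its 0/1 labels are hypothetical stand-ins for the output of the scrape/clean/label stages.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import Pipeline

# Hypothetical labeled corpus (0 = SFW, 1 = NSFW); a real dataset
# would come from the pipeline's scrape/clean/label stages.
texts = [
    "family friendly cooking recipe",
    "explicit adult content warning",
    "weather forecast for tomorrow",
    "graphic nsfw material ahead",
]
labels = [0, 1, 0, 1]

# TF-IDF features feeding a logistic regression classifier,
# mirroring the vectorize -> train stages described above.
clf = Pipeline([
    ("tfidf", TfidfVectorizer(ngram_range=(1, 2))),
    ("lr", LogisticRegression(max_iter=1000)),
])
clf.fit(texts, labels)

print(clf.predict(["explicit nsfw content"])[0])
```

Wrapping both stages in a `Pipeline` keeps vectorization and classification fitted together, so evaluation on held-out text cannot leak vocabulary statistics from the test set.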
An anti-NSFW project in Python using a pre-trained model.
nsfw-image-detection is a vision-language encoder model fine-tuned from siglip2-base-patch16-256 for multi-class image classification. Built on the SiglipForImageClassification architecture, the model is trained to identify and categorize content types in images, especially for explicit, suggestive, or safe media filtering.
An application that reads dump files from PornHub to extract files and tags for creating AI models.
An NSFW.js implementation for images, GIFs, and video. Client-side NSFW detection via TensorFlow.js.
An internal hackathon for Kavach 23