A robust tool for detecting deepnudes, deepfakes, NSFW, and NSFL content with image protection capabilities.
- Deepnude Detection: Identifies AI-generated nude imagery
- Deepfake Detection: Detects face manipulation and synthetic media
- NSFW/NSFL Detection: Classifies inappropriate content
- Multi-Model Analysis: Combines multiple AI models for robust detection
- Image Perturbation: Adds adversarial noise to protect images against deepfake generation
- Privacy Protection: Fawkes-style cloaking to prevent facial recognition abuse
Input sources:
- File upload
- Image URL analysis
- Folder scanning
- Removable drive scanning

Detection backends:
- AIorNot API
- OpenAI Vision API
- Google Gemini Vision
- Custom trained models
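The multi-model analysis described above can be sketched as a weighted combination of per-model scores. This is an illustrative sketch, not the project's actual aggregation logic; the model names, default weights, and 0.5 flag threshold are all assumptions.

```python
def combine_scores(scores, weights=None):
    """Combine per-model probabilities (0.0-1.0) into one weighted-average score.

    `scores` maps model name -> probability; models missing from `weights`
    default to weight 1.0. The weighting scheme here is a placeholder.
    """
    if not scores:
        raise ValueError("no model scores to combine")
    weights = weights or {}
    total = sum(weights.get(name, 1.0) for name in scores)
    return sum(p * weights.get(name, 1.0) for name, p in scores.items()) / total


def verdict(score, flag_at=0.5):
    """Map a combined score to a coarse label (threshold is an assumption)."""
    return "flagged" if score >= flag_at else "clean"
```

For example, `combine_scores({"nudenet": 0.9, "clip": 0.2, "custom_cnn": 0.7})` averages to 0.6, which `verdict` would flag under the assumed threshold.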
```
DeepNudeDetect/
├── backend/            # FastAPI backend with ML models
├── frontend/           # React web interface
├── chrome-extension/   # Browser extension
├── models/             # Pre-trained ML models
└── data/               # Upload and processing directories
```
Backend:
- REST API for image analysis
- Multiple detection models
- Image perturbation algorithms
- API integrations

Web frontend:
- Upload interface
- Folder/drive scanning
- Real-time analysis results
- Image protection tools

Chrome extension:
- Automatic image detection
- Real-time alerts
- Privacy protection warnings
Backend setup:

```bash
cd backend
python -m venv venv
source venv/bin/activate  # On Windows: venv\Scripts\activate
pip install -r requirements.txt
python download_models.py
```

Frontend setup:

```bash
cd frontend
npm install
npm run dev
```

Chrome extension:
- Open Chrome and go to `chrome://extensions/`
- Enable "Developer mode"
- Click "Load unpacked"
- Select the `chrome-extension` directory
Start the backend:

```bash
cd backend
python main.py
```

The backend runs on http://localhost:8000.

Start the frontend:

```bash
cd frontend
npm run dev
```

The frontend runs on http://localhost:3000.
- `POST /api/analyze` - Analyze a single image
- `POST /api/analyze-url` - Analyze an image from a URL
- `POST /api/analyze-folder` - Scan an entire folder
- `POST /api/protect-image` - Add an adversarial perturbation
- `GET /api/health` - Health check
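A minimal stdlib client for the URL-analysis endpoint might look like the following. The JSON request shape (an `image_url` field) is an assumption, since the API schema is not documented here; adjust it to match the backend's actual request model.

```python
import json
import urllib.request


def build_analyze_url_request(base_url, image_url):
    """Build a POST request for /api/analyze-url with a JSON body.

    The `image_url` field name is an assumed schema, not confirmed by the docs.
    """
    body = json.dumps({"image_url": image_url}).encode("utf-8")
    return urllib.request.Request(
        f"{base_url.rstrip('/')}/api/analyze-url",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )


def analyze_url(base_url, image_url, timeout=30.0):
    """Send the request and decode the JSON verdict from the backend."""
    req = build_analyze_url_request(base_url, image_url)
    with urllib.request.urlopen(req, timeout=timeout) as resp:
        return json.load(resp)
```

Usage: `analyze_url("http://localhost:8000", "https://example.com/photo.jpg")` against a running backend.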
Create a `.env` file in the `backend` directory:

```
OPENAI_API_KEY=your_key_here
GOOGLE_API_KEY=your_key_here
AIORNOT_API_KEY=your_key_here
```

Models and techniques used:
- NudeNet - NSFW detection
- DeepFake Detection Model - Face manipulation detection
- CLIP - Content understanding
- Custom CNN - Deepnude specific detection
- Adversarial Perturbation - Fawkes-style protection
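The adversarial-perturbation step can be illustrated with a toy version that adds small, seeded noise to 8-bit pixel values. This is only a sketch of the idea: real Fawkes-style cloaking optimizes the perturbation against a face-recognition model's gradients, which this random-noise stand-in does not do.

```python
import random


def perturb_pixels(pixels, epsilon=4, seed=0):
    """Add bounded noise (+/- epsilon) to 8-bit pixel values, clamped to [0, 255].

    A toy stand-in for adversarial perturbation: real cloaking derives the
    noise from a target model rather than from a seeded RNG.
    """
    rng = random.Random(seed)
    return [
        max(0, min(255, p + rng.randint(-epsilon, epsilon)))
        for p in pixels
    ]
```

Seeding makes the perturbation reproducible, so the same input always yields the same protected output in this sketch.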
- All processing happens locally by default
- API keys are optional for enhanced detection
- Images are not stored unless explicitly saved
- Privacy-first approach
MIT License - Use responsibly for legitimate safety and security purposes only.
This tool is designed for legitimate security research, content moderation, and personal protection. Misuse for harassment or malicious purposes is strictly prohibited.