NSFWJS

Indecent content checker with TensorFlow.js

Established · Open Source · Low lock-in

Pricing: See website (flat rate)

Adoption: Stable

License: Open Source
Overview

What is NSFWJS?

NSFWJS is a JavaScript library that uses a pre-trained TensorFlow.js model to detect inappropriate or indecent content in images. It helps developers keep their applications family-friendly by providing real-time, client-side image analysis.

Key differentiator

NSFWJS stands out as a lightweight, client-side solution for filtering inappropriate images using TensorFlow.js, making it ideal for web applications that need real-time analysis without the overhead of server-side processing.
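The core workflow is load-once, classify-per-image. The sketch below assumes the `nsfwjs` package is installed; `model.classify()` returns an array of `{className, probability}` pairs, and `isSafe` is a hypothetical helper (not part of the library) applying an illustrative threshold over the flagged categories.

```javascript
// Minimal sketch of content checking with NSFWJS.
// Assumes `nsfwjs` (and, in Node, `@tensorflow/tfjs-node`) is installed;
// the require is guarded so the pure helper below stands on its own.
let nsfwjs;
try { nsfwjs = require('nsfwjs'); } catch { /* library not installed */ }

// NSFWJS classifies images into: Drawing, Hentai, Neutral, Porn, Sexy.
const FLAGGED = ['Porn', 'Hentai', 'Sexy'];

// Hypothetical helper: true if no flagged class meets the threshold.
function isSafe(predictions, threshold = 0.7) {
  return !predictions.some(
    p => FLAGGED.includes(p.className) && p.probability >= threshold
  );
}

// One-time model load, then per-image classification.
async function checkImage(img) {
  const model = await nsfwjs.load();             // downloads the default model
  const predictions = await model.classify(img); // img element, canvas, or tensor
  return isSafe(predictions);
}

module.exports = { isSafe, checkImage };
```

In a web app you would call `checkImage` once per uploaded or rendered image, reusing the loaded model across calls since loading dominates the cost.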

Honest assessment

Strengths & Weaknesses

↑ Strengths

Real-time image analysis using TensorFlow.js

Detects inappropriate content in images

Can be integrated into web applications for on-the-fly filtering
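For on-the-fly filtering, the predictions need to be mapped to a UI decision. The sketch below is a hypothetical policy layer (the function names and thresholds are illustrative assumptions, not library defaults) that turns a prediction list into a block/blur/allow action.

```javascript
// Hypothetical moderation policy over NSFWJS predictions: decide whether
// to block, blur, or allow an image. Thresholds here are illustrative.
function moderationAction(predictions) {
  // predictions: [{className, probability}] as returned by model.classify()
  const score = name =>
    predictions.find(p => p.className === name)?.probability ?? 0;

  if (score('Porn') >= 0.8 || score('Hentai') >= 0.8) return 'block';
  if (score('Porn') >= 0.3 || score('Sexy') >= 0.6) return 'blur';
  return 'allow';
}

// Example wiring: apply the chosen action to an <img> element in the page.
function applyAction(imgElement, action) {
  if (action === 'block') imgElement.remove();
  else if (action === 'blur') imgElement.style.filter = 'blur(12px)';
}

module.exports = { moderationAction, applyAction };
```

Separating the policy from the classifier keeps thresholds tunable per application without touching the model code.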

Fit analysis

Who is it for?

✓ Best for

Web developers who need to filter user-generated images for inappropriate content

Developers building social media platforms where community guidelines are critical

Teams working on applications that require real-time image analysis and filtering

✕ Not a fit for

Projects that need fine-grained detection of specific content types beyond the model's general categories

Applications that must process large volumes of images server-side, since classification is designed to run locally on each client

Cost structure

Pricing

Free Tier: None

Starts at: See website

Model: Flat rate

Enterprise: None

Next step

Get Started with NSFWJS

Step-by-step setup guide with code examples and common gotchas.

View Setup Guide →