
Not Safe for Work Detector Using TensorFlow.js

From the official repository:

A simple JavaScript library to help you quickly identify unseemly images; all in the client’s browser. NSFWJS isn’t perfect, but it’s pretty accurate (~90% from our test set of 15,000 test images)… and it’s getting more accurate all the time.

The library returns a probability for each image across the following five categories (a minimal usage sketch follows the list):

  • Drawing – safe for work drawings (including anime)
  • Hentai – hentai and pornographic drawings
  • Neutral – safe for work neutral images
  • Porn – pornographic images, sexual acts
  • Sexy – sexually explicit images, not pornography
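Here is a minimal sketch of how classification might look in the browser, based on the library's documented load/classify API. It assumes the nsfwjs package is installed from npm and that the page contains an <img> element; the element id "unknown-image" is only an illustrative placeholder.

```javascript
import * as nsfwjs from 'nsfwjs'

async function checkImage() {
  // Grab an <img> element already present in the page
  // (the id "unknown-image" is just an example).
  const img = document.getElementById('unknown-image')

  // Load the pre-trained model; the weights are fetched on first call.
  const model = await nsfwjs.load()

  // classify() resolves to an array of { className, probability } pairs,
  // one entry for each of the five categories listed above.
  const predictions = await model.classify(img)
  console.log(predictions)
}

checkImage()
```

Everything runs client-side, so the image never has to leave the user's browser.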

The demo site is continuously deployed from the repository – give it a go: http://nsfwjs.com/
