Not Safe for Work Detector Using TensorFlow.js

From the official repository:

A simple JavaScript library to help you quickly identify unseemly images; all in the client’s browser. NSFWJS isn’t perfect, but it’s pretty accurate (~90% from our test set of 15,000 test images)… and it’s getting more accurate all the time.

The library reports a probability for each image across the following five classes:

  • _Drawing_ - safe for work drawings (including anime)
  • _Hentai_ - hentai and pornographic drawings
  • _Neutral_ - safe for work neutral images
  • _Porn_ - pornographic images, sexual acts
  • _Sexy_ - sexually explicit images, not pornography
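
In the browser, you load the model with `nsfwjs.load()` and call `model.classify(img)` on an image element; it resolves to an array of `{ className, probability }` predictions over the five classes above. As a sketch of how you might act on that output, here is a small helper (the `isSafe` name, the `0.6` threshold, and the choice of which classes count as unsafe are my own assumptions, not part of the library):

```javascript
// Treat 'Hentai', 'Porn' and 'Sexy' as unsafe.
// This is a policy choice for the example, not something NSFWJS dictates.
const UNSAFE = new Set(['Hentai', 'Porn', 'Sexy'])

// predictions: the array returned by model.classify(), e.g.
// [{ className: 'Neutral', probability: 0.9 }, ...]
function isSafe(predictions, threshold = 0.6) {
  return !predictions.some(
    (p) => UNSAFE.has(p.className) && p.probability >= threshold
  )
}

// Example with a mocked prediction array:
console.log(isSafe([
  { className: 'Neutral', probability: 0.85 },
  { className: 'Drawing', probability: 0.1 },
  { className: 'Porn', probability: 0.03 },
])) // true: no unsafe class crosses the threshold
```

Because NSFWJS isn't perfect, gating on a probability threshold like this (rather than just taking the top class) lets you tune how conservative the filter is.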

The demo is continuously deployed from the repository - give it a go: