Not Safe for Work Detector using TensorFlow.js
2019 Apr 23

A simple JavaScript library to help you quickly identify unseemly images, all in the client's browser. NSFWJS isn't perfect, but it's pretty accurate (~90% on our test set of 15,000 images), and it's getting more accurate all the time.
The library returns image probabilities across the following 5 classes:
- _Drawing_ - safe for work drawings (including anime)
- _Hentai_ - hentai and pornographic drawings
- _Neutral_ - safe for work neutral images
- _Porn_ - pornographic images, sexual acts
- _Sexy_ - sexually explicit images, not pornography
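To give a feel for how those class probabilities surface in code, here is a minimal sketch of classifying an image element in the browser. It assumes the `nsfwjs` npm package's `load()`/`classify()` calls and a placeholder `<img>` with id `"img"`; adapt it to your own page.

```js
// Minimal sketch, assuming the nsfwjs package and its load()/classify() API.
import * as nsfwjs from 'nsfwjs';

async function checkImage() {
  // An <img> element already present and loaded on the page (placeholder id).
  const img = document.getElementById('img');

  // Load the model (downloaded on first call), then classify the image.
  const model = await nsfwjs.load();
  const predictions = await model.classify(img);

  // predictions is an array of { className, probability } entries,
  // one per class: Drawing, Hentai, Neutral, Porn, Sexy.
  console.log(predictions);
}

checkImage();
```

Because everything runs client-side via TensorFlow.js, the image never has to leave the user's browser to be checked.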
The demo is continuously deployed from source. Give it a go: http://nsfwjs.com/
