infinitered / nsfwjs

NSFW detection on the client-side via TensorFlow.js
https://nsfwjs.com/
MIT License

Confused with the models - false positives #324

Closed ghnp5 closed 1 year ago

ghnp5 commented 4 years ago

Hello,

I'm using these model files: https://github.com/infinitered/nsfwjs/tree/master/example/nsfw_demo/public/model (I think it's the same as https://s3.amazonaws.com/nsfwdetector/min_nsfwjs.zip)

and I'm getting several false positives.

Completely non-adult face pictures get flagged as 90%+ porn.

I tested also with the non-min model: https://s3.amazonaws.com/nsfwdetector/nsfwjs.zip

But from the few examples I tried, it didn't change much, and some results were even worse.

I see that in issue https://github.com/infinitered/nsfwjs/issues/276 you added a new "DAG 93" model.

Is this one supposed to have better accuracy?

Which model is best, and which should I use for the best accuracy, please?

What is the difference between using the model provided in the README (first link above) and using the DAG one with { type: "graph" }?

Are there pros and cons?
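For reference, the two load styles look like this. This is only a sketch: the model URLs and the element id are placeholders, not official locations.

```js
// Sketch only: the model URLs below are placeholders for wherever you
// host the files, not official download locations.
import * as nsfwjs from "nsfwjs";

async function demo() {
  // Default (layers) model: point load() at the folder containing model.json.
  const layersModel = await nsfwjs.load("/model/");

  // Graph-converted models (like the DAG builds) need { type: "graph" } so
  // nsfwjs loads them with tf.loadGraphModel instead of tf.loadLayersModel.
  const graphModel = await nsfwjs.load("/dag-model/", { type: "graph" });

  const img = document.getElementById("img"); // hypothetical element id
  console.log(await layersModel.classify(img));
  console.log(await graphModel.classify(img));
}

demo();
```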

Thank you very much.

GantMan commented 4 years ago

The new one is supposed to have better accuracy.

Use the DAG 93. Can you provide some false positives here? I can elaborate on the pros/cons, but I want to see if it fixes your issue first.

And sorry, I think that S3 link is outdated.

ghnp5 commented 4 years ago

Hello, Thank you.

I've just updated to the latest version of NSFWJS, and then loaded the DAG 93 model.

I tested it against a number of my false positives, and the "porn" percentage decreased to much more acceptable values, for sure.

However, I also tested a few images that are unambiguously porn. The older model gave them 99%, but DAG 93 gives 72.95% for one, 56.37% for another, and 34.13% for the third, which doesn't make sense. A fourth image I tested is not porn, just a semi-naked lady, and that one gave 89.48%.

With my examples, the new model just doesn't seem accurate at all.

Happy to give you examples; please give me your email so I can send them to you privately.

GantMan commented 4 years ago

Our latest model was provided by @TechnikEmpire, so I'm tagging them here.

My email is Gant and the domain is infinite.red

TechnikEmpire commented 4 years ago

NSFW Post!!!

My post is NSFW because I'm going to use some real talk here to get into the nitty-gritty details of what I did differently.

Alright so with the very latest model that I trained, there was a significant change.

Changes to Training Data

I used the same training data that @GantMan used in older models, but I decided to clean it up. I loaded the Yahoo Open NSFW model and deleted all images from the porn category that it scored at 10% or less.

I then used the Yahoo model to move every image with a score of 60% or more out of the sexy category into the porn folder.
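As a rough Node illustration of that cleanup idea (the real pass used Yahoo Open NSFW as the scorer, not nsfwjs, and the folder paths here are hypothetical):

```js
// Illustrative sketch: the real cleanup used Yahoo Open NSFW, not nsfwjs,
// and the folder names here are hypothetical.
const fs = require("fs");
const path = require("path");
const tf = require("@tensorflow/tfjs-node");
const nsfwjs = require("nsfwjs");

async function cleanup() {
  const model = await nsfwjs.load();

  const pornScore = async (file) => {
    const image = tf.node.decodeImage(fs.readFileSync(file), 3);
    const predictions = await model.classify(image);
    image.dispose();
    const porn = predictions.find((p) => p.className === "Porn");
    return porn ? porn.probability : 0;
  };

  // 1) Delete "porn" images the scorer barely considers porn (<= 10%).
  for (const f of fs.readdirSync("data/porn")) {
    const file = path.join("data/porn", f);
    if ((await pornScore(file)) <= 0.1) fs.unlinkSync(file);
  }

  // 2) Move "sexy" images the scorer calls porn (>= 60%) into the porn folder.
  for (const f of fs.readdirSync("data/sexy")) {
    const file = path.join("data/sexy", f);
    if ((await pornScore(file)) >= 0.6) {
      fs.renameSync(file, path.join("data/porn", f));
    }
  }
}

cleanup();
```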

This affected several thousand files in each category. There were thousands of files in the sexy category that were full-blown pornography: images of sexual penetration, plain snapshots from porn movies, full-body nudity, close-ups of genitalia. That's not sexy; that's porn.

There were probably a couple thousand images in the porn folder of bukkake. As humans we recognize what this is, but really it's just a closeup of a face. To a neural network it's just a face, and Yahoo's Open NSFW model confirmed this by classifying them as non-pornographic. Those images were blown away, so I'm really surprised that plain faces are coming up as porn.

There were also a couple thousand images of extremely obscure pornography. I took a peek, and for some of them I thought "this is a completely benign image" until I spotted a 3x3-pixel area of a penis hanging out of someone's pants. That throws off the neural network, at least for training purposes. Those images were blown away too.

The sexy category should now only trigger on provocative images, e.g. someone rolling around on a boat in a thong and bra. Nudity should land in the pornography category.

Remarks About Changing Accuracy

Regarding the category scores changing, this isn't necessarily a bad thing. The model is meant to be used so that the highest-scoring class wins, which is also how neural networks like this are evaluated (top-1 and top-5 accuracy). Even if a porn image is split up like so:

Porn: 21%
Sexy: 20%
Neutral: 20%
Drawing: 20%
Hentai: 19%

The model is still accurate: porn is still the top-1 class, and a close split like that isn't necessarily indicative of a problem in the neural network.
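To make "highest class wins" concrete, here's a tiny sketch of top-1 selection over the example distribution above (plain JavaScript, nothing library-specific):

```js
// Top-1 scoring: the class with the highest probability wins,
// even when the margin is as thin as in the example above.
const prediction = {
  Porn: 0.21,
  Sexy: 0.20,
  Neutral: 0.20,
  Drawing: 0.20,
  Hentai: 0.19,
};

const top1 = Object.entries(prediction)
  .reduce((best, cur) => (cur[1] > best[1] ? cur : best))[0];

console.log(top1); // "Porn", counted as a correct top-1 prediction
```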

Conclusion

What I believe happened here is twofold. First, I overfit the model by over-training it. I didn't notice this before, but your remarks made me take a second look, and it does appear a bit overfit.

If you look at the TensorFlow output posted on this issue, you can see that training loss is decreasing while validation loss is increasing in the final training iteration.

Second, one-byte quantization would exacerbate this issue.

I think I'll run a new training session and publish a new model with one less iteration, because it seems that last iteration is what pushed the model over the edge.
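Incidentally, that divergence (training loss falling while validation loss rises) is exactly what early stopping guards against. A schematic tfjs sketch, with the model and data assumed to already exist; the actual training ran in Python/TensorFlow:

```js
// Schematic only: the actual model was trained in Python/TensorFlow, and
// model/trainData/valData here are assumed to already exist.
const tf = require("@tensorflow/tfjs-node");

async function train(model, trainData, valData) {
  // earlyStopping halts fit() once val_loss stops improving, which would
  // have cut training before the overfit final iteration described above.
  return model.fit(trainData.xs, trainData.ys, {
    epochs: 50,
    validationData: [valData.xs, valData.ys],
    callbacks: tf.callbacks.earlyStopping({ monitor: "val_loss", patience: 2 }),
  });
}
```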

Thanks for bringing this up!

TechnikEmpire commented 4 years ago

Oh yeah, I forgot: I also started this training session by training just the final softmax layer, then fine-tuning for 5 consecutive sessions, then 2 more iterations with a greatly reduced LR. I'll just train it the way I did the previous model and report back.
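For readers wondering what that schedule looks like in code, here is a schematic tfjs translation; the real training was done in Python, and the learning rates below are illustrative guesses, not the actual values used:

```js
// Schematic tfjs version of the schedule described above; the real training
// ran in Python, and the learning rates are illustrative guesses.
const tf = require("@tensorflow/tfjs-node");

async function schedule(model, xs, ys, valData /* [valXs, valYs] */) {
  const fit = (epochs) => model.fit(xs, ys, { epochs, validationData: valData });

  // Phase 1: freeze everything except the final softmax layer.
  model.layers.forEach((layer) => (layer.trainable = false));
  model.layers[model.layers.length - 1].trainable = true;
  model.compile({ optimizer: tf.train.adam(1e-3), loss: "categoricalCrossentropy" });
  await fit(1);

  // Phase 2: unfreeze and fine-tune for several consecutive sessions.
  model.layers.forEach((layer) => (layer.trainable = true));
  model.compile({ optimizer: tf.train.adam(1e-4), loss: "categoricalCrossentropy" });
  await fit(5);

  // Phase 3: two more iterations with a greatly reduced learning rate.
  model.compile({ optimizer: tf.train.adam(1e-6), loss: "categoricalCrossentropy" });
  await fit(2);
}
```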

ghnp5 commented 4 years ago

@GantMan - just sent the email. Please let me know if you didn't receive.

Thanks @TechnikEmpire - please keep me updated!

TechnikEmpire commented 4 years ago

I've started training, so sometime this evening I'll open a PR on the model repo.

GantMan commented 4 years ago

I got the email @ghnp5, thanks so much.

@TechnikEmpire - thanks for checking for overfitting! I'll look forward to your update.

ghnp5 commented 4 years ago

Hey - any news? :) Thanks!

TechnikEmpire commented 4 years ago

Yeah, I've retrained. I'll check your submissions against the new model before I publish.

I also got sidetracked because I'm not happy with the 93% accuracy ceiling from fine-tuning, so I'm training MobileNet V3 Large from scratch.

peerwaya commented 4 years ago

Great work guys!!!

> Porn: 21%
> Sexy: 20%
> Neutral: 20%
> Drawing: 20%
> Hentai: 19%

I am fairly new at this, and I don't know what a good confidence score for each category is. I have used the saved_model.tflite found at https://github.com/GantMan/nsfw_model/releases/tag/1.1.0 on Android/iOS with impressive results. However, I still don't know how to choose a good confidence score.

ghnp5 commented 4 years ago

Hello,

Any news about this? :)

Thanks!

TechnikEmpire commented 4 years ago

@ghnp5 Yeah, sorry for going MIA; I got tied up with a bunch of stuff. I have trained new models, and I'm just arranging to review your submissions against them.

TechnikEmpire commented 4 years ago

@ghnp5 If it's urgent, you can simply revert to this model:

https://github.com/GantMan/nsfw_model/releases/download/1.1.0/nsfw_mobilenet_v2_140_224.zip

GantMan commented 4 years ago

I'm super excited to see your new model @TechnikEmpire! Lots of people use NSFWJS, and I'd love for your advanced model to be the de facto standard.

TechnikEmpire commented 4 years ago

> I am fairly new at this and I don't know what a good confidence score for each of the categories is. ... However, I still don't know how to choose a good confidence score.

Those scores were just an example. You just take the highest-valued class and accept it. Don't get into thresholding the values; simply take the neural network's prediction at face value.
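In nsfwjs terms, that just means taking the first entry of what classify() returns, since predictions come back ordered by probability. A minimal sketch, assuming an img element with a hypothetical id of "photo":

```js
// Minimal top-1 usage: no thresholding, just take the best class.
import * as nsfwjs from "nsfwjs";

async function topClass() {
  const model = await nsfwjs.load();
  const img = document.getElementById("photo"); // hypothetical element id
  const predictions = await model.classify(img);

  // classify() returns predictions sorted by probability, highest first.
  const top = predictions[0];
  console.log(`${top.className} (${(top.probability * 100).toFixed(1)}%)`);
}
```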

TechnikEmpire commented 4 years ago

Here are the confidences from a new model on your submissions, @ghnp5:

For unsafe submissions:

BAD:  F: 80.39.jpg Confidence: 0.95885
BAD:  F: 93.20.jpg Confidence: 0.998286
BAD:  F: 93.79.jpg Confidence: 0.993065
BAD:  F: 98.26.jpg Confidence: 0.671015
BAD:  F: 99.07.jpg Confidence: 0.718538
BAD:  F: 99.23.jpg Confidence: 0.998276
BAD:  F: 99.24.jpg Confidence: 0.596281

For safe submissions:

BAD:  F: 70.99.jpg Confidence: 0.428323
GOOD:  F: 96.52.jpg Confidence: 0.908221

As you know, the file 70.99.jpg is black and white. That will throw off any neural network, which is why colorization before inference is a whole topic of its own, with various techniques.

Here's what happens to the scoring when I colorize that image:

BAD:  F: 70.99.jpg Class: 1 Confidence: 0.428323
GOOD:  F: 96.52.jpg Class: 2 Confidence: 0.908221
GOOD:  F: 70.99-Color.png Class: 2 Confidence: 0.827742
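nsfwjs doesn't ship a colorizer, but at the cheap end one can at least detect (near-)grayscale inputs up front and treat their scores more cautiously. A hypothetical browser-side check; the helper name and tolerance are made up:

```js
// Hypothetical helper: detect (near-)grayscale images before inference so
// their scores can be treated with extra caution. Tolerance is arbitrary.
function isGrayscale(img, tolerance = 2) {
  const canvas = document.createElement("canvas");
  canvas.width = img.naturalWidth;
  canvas.height = img.naturalHeight;
  const ctx = canvas.getContext("2d");
  ctx.drawImage(img, 0, 0);
  const { data } = ctx.getImageData(0, 0, canvas.width, canvas.height);
  for (let i = 0; i < data.length; i += 4) {
    const [r, g, b] = [data[i], data[i + 1], data[i + 2]];
    if (Math.abs(r - g) > tolerance || Math.abs(g - b) > tolerance) {
      return false; // found a genuinely colored pixel
    }
  }
  return true;
}
```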

For future reference, it's very difficult to gauge any network with a handful of images. This is why we split out 10% or more of the total data set for validation. On that split (tens of thousands of images), TensorFlow reports that the latest model is ~92% accurate. While that's a really good accuracy, note that the 8% gap between where we are and perfection (100%) is quite a chasm, so you're definitely going to see false positives and false negatives.

I'll be uploading the newly trained model for @GantMan to my attached PR here:

https://github.com/GantMan/nsfw_model/pull/60

I just need to convert it to tfjs first.

TechnikEmpire commented 4 years ago

The new model is attached to that linked PR. For greater clarity, my experimentation (the scores given above) was based on an FP16 version run through OpenCV's DNN module, not the web or quantized web versions of the model.

Acorn221 commented 4 years ago

Hi, I'm fairly new to neural nets and this library. I tried using your model and I keep getting the same error from nsfwjs.min.js: "Uncaught (in promise) Error: layer: Improper config format:". Do you know what I'm doing wrong? I tried loading it without the {type: "graph"} parameter, and the other models have worked. Thanks! Here's how I'm trying to load the model:

this.nsfwjs.load(this.modelURL, {type: "graph"})

EDIT: I've just realised I was using an outdated version of NSFWJS. Thanks for the model!

evdama commented 3 years ago

I'm also new to the entire subject of preventing NSFW images uploaded by users from entering my app without any sort of check for porn or violence. I've read the blog post at https://shift.infinite.red/avoid-nightmares-nsfw-js-ab7b176978b1, this GitHub README, and https://github.com/GantMan/nsfw_model.

However, I'm still in the fog about which model file is right for me and where to get it. What's the difference among the various model files, e.g. normal vs. graph? Could someone maybe add two or three lines to the README page? I reckon that's something people new to the subject struggle to understand in general! 😃

ghnp5 commented 3 years ago

@evdama I'm still using this one:

https://github.com/infinitered/nsfwjs/tree/master/example/nsfw_demo/public/model

In my experience, this one works the best.

The thing is that you cannot rely 100% on any of the models. What we do is: if it detects 70%+, we show a popup asking the user to confirm the image is not inappropriate, and otherwise we don't proceed.
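A minimal sketch of that gate; confirmImage() is a hypothetical app-specific dialog, and only the nsfwjs call is the library's real API:

```js
// Sketch of the 70% gate described above. confirmImage() is a hypothetical
// app-specific dialog; only model.classify() is the library's real API.
const THRESHOLD = 0.7;
const FLAGGED = ["Porn", "Hentai", "Sexy"];

async function checkUpload(model, imgElement) {
  const predictions = await model.classify(imgElement);
  const suspicious = predictions.some(
    (p) => FLAGGED.includes(p.className) && p.probability >= THRESHOLD
  );
  if (!suspicious) return true; // proceed with the upload
  // Ask the user to confirm the image is not inappropriate.
  return await confirmImage("This image may be inappropriate. Continue?");
}
```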

evdama commented 3 years ago

ha! I was just afk making a tasty ☕️ and already got two great answers 👍 (those guys must be in lockdown as well, so... 😂)

@TechnikEmpire Please, as you'd speak to a very motivated but inexperienced puppy: where do I get those files, and how do I use them? Is it the entire .zip I found, just one of the contained files, or the entire collection of shards 🤔?

And once I have the right file/model, I'd just put it inside my Sapper app's public/ folder and reference it with .load(), right?
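Roughly, yes. A sketch under that assumption: the whole model folder (model.json plus its shard files) is copied to public/model/ so it's served at /model/ (the path is illustrative):

```js
// Assuming model.json and its *.bin shard files live together in
// public/model/, they're served at /model/, and load() takes that folder URL.
import * as nsfwjs from "nsfwjs";

const model = await nsfwjs.load("/model/");
```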

@GantMan Can you add a line or two to the README so that puppies know what to do with the model files (and which one is right for a given use case, e.g. images vs. videos)? 🤓

ghnp5 commented 3 years ago

Someone answered here, but it seems the reply is gone. They were saying that the model I'm using is very outdated and that there's a new one with 98% accuracy.

Where can I find it?

The README of this repository points to the model I'm using, in the "Host your own model" section: https://github.com/infinitered/nsfwjs#host-your-own-model

evdama commented 3 years ago

> Someone answered here, but seems the reply is gone. They were saying that the model I'm using is very outdated and that there's a new one with 98% accuracy.

Yup, it was @TechnikEmpire so I assume he'll come back with an even better answer...

Acorn221 commented 3 years ago

> @evdama I'm still using this one:
>
> https://github.com/infinitered/nsfwjs/tree/master/example/nsfw_demo/public/model
>
> In my experience, this one works the best.

I use this model in my Chrome extension. From what I remember it was the best, but it looks like it's trained on professional porn rather than more homemade content. My extension is made for Omegle, where many people have bad lighting or blurry cameras, and sadly it doesn't detect them very well.

ghnp5 commented 3 years ago

The site at https://nsfwjs.com/ uses the 93% accuracy model, which I'm pretty sure I tried in the past, and I got worse results than with the model I'm currently using.

TechnikEmpire commented 3 years ago

I posted something and then thought better of it. I'm the author of a closed-source program that uses models like these, and there are dirtbags who follow me on GitHub and steal my OSS, so I'm doubly inclined not to help anyone.

However, I was the guy making big overhauls to this repo, and for some reason @GantMan stopped merging my PRs.

https://github.com/TechnikEmpire/nsfw_model

That repo has stats for every kind of model I trained. I don't know whether my site links are still up, but basically you just need to manually clean up the categories, then use the new TF2 API (which leans on TF Hub) that I integrated, and you'll hit much better numbers.

TechnikEmpire commented 3 years ago

Sorry, not "this" repo; I meant the model repo that drives this project.