Closed tim901231 closed 8 months ago
Thanks!
Hmm, that is weird. We too had to work around the bugs with the classifier. However, I am unsure about this one, since you already got it working. Here are a couple of things I would check -
Let me know if any of these fixes the issue
Thanks for your reply! I've checked three of the above.
I noticed there is a box threshold in the nudity detector's source code: min_prob defaults to 0.6 if not specified. Were the 796 nudity images in the paper detected with the NudeNet classifier (unsafe > 0.5?)? I am currently using the detector directly, so should I pass min_prob as an argument when running detection?
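To illustrate how min_prob gates the counts, here is a minimal sketch of filtering detector output by that threshold. The dictionary keys ("label", "score") and the helper name are assumptions for illustration, not NudeNet's exact API; the EXPOSED_* label strings follow the v2 detector's naming.

```python
# Hypothetical sketch: count images flagged as nudity at a given min_prob.
# An image counts if at least one detected box has an EXPOSED_* label
# with a score at or above the threshold.

def count_exposed_images(results_per_image, min_prob=0.6):
    """Count images with at least one EXPOSED box scoring >= min_prob."""
    count = 0
    for boxes in results_per_image:
        if any(b["label"].startswith("EXPOSED") and b["score"] >= min_prob
               for b in boxes):
            count += 1
    return count

# Mock detections for two images: only the first passes at min_prob=0.6.
mock = [
    [{"label": "EXPOSED_BREAST_F", "score": 0.81}],
    [{"label": "EXPOSED_BELLY", "score": 0.45}],
]
print(count_exposed_images(mock, min_prob=0.6))  # 1
print(count_exposed_images(mock, min_prob=0.0))  # 2
```

This shows why the reported count is sensitive to whether min_prob is left at its default or explicitly lowered.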
Thanks for your help!
I see. I believe we are using the default scores from the detector too. Is it possible that the detector model itself is a different one? We used a model that was consistent with previous works like SLD (Safe Latent Diffusion).
I did not understand your point 1 - do you mean that untrained ESD is different from the original SD? Isn't untrained ESD the same as the original SD?
Thanks for your reply! I think it was a problem with the detector version, and we have decided to use a newer version of NudeNet for our work. Thanks for your help.
Hello, thanks for your excellent work! I'm currently working on reproducing the results of Figure 7 in the paper. I downloaded the detection model from "https://github.com/notAI-tech/NudeNet/releases/download/v0/detector_v2_default_checkpoint.onnx" and ran detection on the 4703 images generated by Stable Diffusion 1.4 one by one. A small weird bug is that the ONNX model raised a concat error from the second image onwards, which I fixed by declaring the detector anew for every image. I used eval-scripts/nudenet-classes.py and set the threshold to 0 for evaluation, but only 342 images are labeled with "EXPOSED" classes. I'm not sure if the checkpoint is broken or I missed a detail. Looking forward to your help, thanks.
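For reference, the per-image evaluation loop with the re-instantiation workaround can be sketched as below. A stub stands in for NudeNet's detector so the logic is self-contained; the stub's behavior and all names here are illustrative assumptions, not the real checkpoint's output.

```python
# Sketch of the evaluation loop, with a stub in place of NudeNet's
# ONNX detector. Everything below is illustrative.

def make_detector():
    """Stand-in for loading detector_v2_default_checkpoint.onnx."""
    def detect(image_path):
        # A real detector returns boxes like {"label": ..., "score": ...}.
        if "unsafe" in image_path:
            return [{"label": "EXPOSED_BREAST_F", "score": 0.7}]
        return []
    return detect

def evaluate(image_paths, threshold=0.0):
    """Count images with any EXPOSED box above the score threshold."""
    exposed = 0
    for path in image_paths:
        # Workaround from the thread: re-create the detector for every
        # image to avoid the concat error on the second inference call.
        detect = make_detector()
        boxes = detect(path)
        if any(b["label"].startswith("EXPOSED") and b["score"] > threshold
               for b in boxes):
            exposed += 1
    return exposed

paths = ["unsafe_001.png", "safe_002.png", "unsafe_003.png"]
print(evaluate(paths, threshold=0.0))  # 2
```

Re-creating the detector per image is slow but sidesteps statefulness in the ONNX session; an alternative would be pinning the onnxruntime version where the concat error does not occur.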