GantMan / nsfw_model

Keras model of NSFW detector

The same pic, but the result is different #41

Open onexuan opened 5 years ago

onexuan commented 5 years ago

I tested the NSFW models:

nsfw.299x299.h5

```
{'826ea252-8595-415f-a4f4-6e867b794b1e.png': {'hentai': 0.023223665, 'sexy': 0.049387105, 'drawings': 0.064280376, 'porn': 0.35258076, 'neutral': 0.5105281}}
```

nsfw_mobilenet224.h5

```
{'826ea252-8595-415f-a4f4-6e867b794b1e.png': {'drawings': 0.007675596, 'hentai': 0.01800691, 'sexy': 0.05218548, 'neutral': 0.32376507, 'porn': 0.598367}}
```

The same pic, but the results are different. My pic is neutral.
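A comparison like the one above might be run with a sketch along these lines. Note this is a hypothetical reconstruction, not the repo's actual inference code: the label order, the simple 0–1 pixel scaling, and the nearest-neighbour resize are all assumptions (the real models may expect different preprocessing, which by itself can shift the probabilities).

```python
import os
import numpy as np

# Assumed alphabetical label order -- verify against the model's training setup.
LABELS = ["drawings", "hentai", "neutral", "porn", "sexy"]


def preprocess(image, size):
    """Nearest-neighbour resize (numpy only) and scale pixels to [0, 1].

    image: HxWx3 uint8 array; size: target side length
    (299 for the Inception-style model, 224 for the MobileNet one).
    """
    h, w = image.shape[:2]
    rows = np.arange(size) * h // size
    cols = np.arange(size) * w // size
    resized = image[rows][:, cols]
    # Add a batch dimension: (1, size, size, 3)
    return (resized.astype("float32") / 255.0)[None, ...]


def classify(model, image, size):
    """Return a {label: probability} dict for one image."""
    probs = model.predict(preprocess(image, size))[0]
    return dict(zip(LABELS, probs.tolist()))


if __name__ == "__main__":
    # Only runs if the .h5 weight files are present locally.
    from tensorflow.keras.models import load_model

    img = np.random.randint(0, 256, (600, 400, 3), dtype=np.uint8)
    for path, size in [("nsfw.299x299.h5", 299), ("nsfw_mobilenet224.h5", 224)]:
        if os.path.exists(path):
            print(path, classify(load_model(path), img, size))
```

Since the two files hold entirely separate networks with separately trained weights, some disagreement on borderline images is expected even with identical preprocessing.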

GantMan commented 5 years ago

Those two models are based on different architectures, so they have different trained weights.

I'm sorry they both think the image is porn. Can you share the neutral image? I can add it to the training set so future versions of the models can be smarter.

onexuan commented 5 years ago

http://www.sorpack.com/uploadface/92742_201310322423331623.jpg

But I don't think one pic can improve the MobileNet model. Maybe you should improve the accuracy of MobileNet instead.