LogicHuu opened this issue 5 years ago
Thank you for your report. I just fixed it. I did not notice this at all because my face is always classified as "male" in the buggy version...
Are you sure about this? At the moment it seems exactly inverted since 9417f048fff4bc2061fabb720c1c869d3c4b3b36. Example:
Thanks for the great repo by the way!
The training data's labels are inverted, so the results come out wrong. You can either train a new model or flip the labels in your data.
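(For anyone retraining: a minimal sketch of what flipping binary labels could look like, assuming the genders are stored as a 0/1 NumPy array as in the IMDB-WIKI meta data; the variable name is illustrative.)

```python
import numpy as np

# Hypothetical gender array: 0/1 labels loaded from the dataset meta data.
genders = np.array([0, 1, 1, 0, 1])

# Flip every label: 0 becomes 1 and 1 becomes 0.
flipped = 1 - genders

print(flipped)  # -> [1 0 0 1 0]
```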
@jjhbw I'm not sure...
Before that commit, I checked the (very noisy) IMDB training dataset, and at the time I concluded that its description — gender: 0 for female and 1 for male, NaN if unknown — was correct.
(But the demo.py results before the male/female flip correction seem to be correct...)
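One way to double-check the convention is to load the meta file and spot-check a few entries by name. A rough sketch, assuming the standard `imdb.mat` layout from the IMDB-WIKI download (the nested field access may vary slightly with your scipy version, and the path is an assumption):

```python
import numpy as np
from scipy.io import loadmat

# Load the IMDB meta data (adjust the path to your download).
meta = loadmat("data/imdb.mat")
gender = meta["imdb"][0, 0]["gender"][0]
name = meta["imdb"][0, 0]["name"][0]

# Count how the 0/1 labels are distributed (NaN = unknown).
valid = ~np.isnan(gender)
print("label 0:", int(np.sum(gender[valid] == 0)))
print("label 1:", int(np.sum(gender[valid] == 1)))

# Spot-check a few names against their labels to see which value means "male".
for i in np.where(valid)[0][:5]:
    print(name[i][0], int(gender[i]))
```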
The gender labels in the training data are inverted. The pre-trained model's gender prediction is wrong: for example, male is predicted as 1 and female as 0, but the result should be male = 0 and female = 1. Please check this bug. You can verify it like this: flip the gender label and check demo.py at line 127.
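For context, here is a minimal, self-contained sketch of the kind of check being suggested, assuming the gender head outputs a two-element softmax per face where index 0 is the "female" probability (this is an assumption; the repo's actual demo.py may differ, and these variable names are illustrative):

```python
import numpy as np

# Dummy predictions standing in for the model output: each row is a
# two-element softmax, assumed to be [p_female, p_male].
predicted_genders = np.array([[0.9, 0.1], [0.2, 0.8]])
predicted_ages = np.array([31.4, 47.9])

for i in range(len(predicted_ages)):
    gender = "F" if predicted_genders[i][0] > 0.5 else "M"
    print("{}, {}".format(int(predicted_ages[i]), gender))
    # If labels were flipped at training time, inverting this condition
    # ("M" if ... > 0.5 else "F") should make the demo output correct.
```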