av-savchenko / face-emotion-recognition

Efficient face emotion recognition in photos and videos
Apache License 2.0

Provide the validation script/notebook. #7

Closed ltkhang closed 2 years ago

ltkhang commented 2 years ago

Hi,

I am a fan of your work and paper, but I cannot find any validation script to reproduce your results, especially the best result with EfficientNet-B2 (8 classes, AffectNet).

Alternatively, could you please provide a separate script to pre-process the input images, so that we can validate the weights provided in your GitHub repository?

Thank you,

av-savchenko commented 2 years ago

Hello! The validation is done inside the notebook https://github.com/HSE-asavchenko/face-emotion-recognition/blob/main/src/train_emotions-pytorch.ipynb. You do not need to run all the cells: just verify that PATH is equal to '../models/affectnet_emotions/enet_b2_8.pt', run the cell that loads the model, and test it on the validation part of AffectNet. The preprocessing is implemented in the cell that starts with

print(test_dir)                 # directory with the AffectNet validation images
y_val, y_scores_val = [], []    # ground-truth labels and predicted scores
model.eval()                    # switch the model to inference mode

In fact, the saved notebook already contains the outputs for the best EfficientNet-B2 model, so I hope you will get the same results.
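
For reference, a minimal standalone sketch of that validation loop could look like the following. This is not the notebook's exact code: the test_dir path, batch size, 260x260 input size, and ImageNet normalization constants are assumptions based on standard EfficientNet-B2 preprocessing.

import torch
from torch.utils.data import DataLoader
from torchvision import datasets, transforms

IMG_SIZE = 260  # typical input resolution for EfficientNet-B2 (assumption)
test_transforms = transforms.Compose([
    transforms.Resize((IMG_SIZE, IMG_SIZE)),
    transforms.ToTensor(),
    # standard ImageNet normalization (assumed to match the training setup)
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

PATH = '../models/affectnet_emotions/enet_b2_8.pt'
device = 'cuda' if torch.cuda.is_available() else 'cpu'
# the checkpoint stores the whole model object, so torch.load returns it directly
model = torch.load(PATH, map_location=device)
model.eval()

# test_dir is assumed to contain one sub-folder per emotion class,
# as produced by the dataset-preparation notebook
test_dir = '../data/affectnet/val'
dataset = datasets.ImageFolder(test_dir, test_transforms)
loader = DataLoader(dataset, batch_size=64, shuffle=False)

correct = total = 0
with torch.no_grad():
    for images, labels in loader:
        scores = model(images.to(device))
        correct += (scores.argmax(dim=1).cpu() == labels).sum().item()
        total += labels.size(0)
print(f'Accuracy: {100.0 * correct / total:.3f}%')

With the enet_b2_8.pt weights, the accuracy printed at the end should match the value reported in the saved notebook.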

ltkhang commented 2 years ago

I had checked your notebook before asking, but the output of your notebook here (https://github.com/HSE-asavchenko/face-emotion-recognition/blob/main/src/train_emotions.ipynb) is rather messy, and I am not sure where to find the EfficientNet-B2 result in it.

av-savchenko commented 2 years ago

train_emotions.ipynb covers the initial preparation of the AffectNet dataset and the training of the TensorFlow models. You need to look at train_emotions-pytorch.ipynb instead. You can easily find the accuracy 63.025 in the output of one of its cells.

ltkhang commented 2 years ago

I saw it, thank you.