Closed issue — opened by AmaiaBiomedicalEngineer, closed 2 years ago
Hello! Please take a look at the section "Multi-task: FER+Valence-Arousal" of train_emotions-pytorch.ipynb. Starting from the line PATH='../models/affectnet_emotions/enet_b0_8_va_mtl.pt', you can load the model and run it. The first 8 outputs correspond to the logits for facial expressions, and the last two outputs stand for valence and arousal. The metrics on the validation set of AffectNet are computed in the last two lines of this section, right before "Example usage". BTW, in that example you can see the predicted valence and arousal in the titles of the photos of my children. However, I should say that my estimates of valence and arousal are not the best of the best; I used them only for multi-task learning and to improve the facial embeddings learned by the model.
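To illustrate how the 10 outputs split into expression logits and valence/arousal, here is a minimal sketch (not the notebook's exact code). It assumes the checkpoint is a full serialized model that can be restored with torch.load, and that the input is an aligned 224x224 RGB face crop with ImageNet normalization; the file name face_crop.jpg is a hypothetical placeholder.

```python
# Minimal sketch: load the multi-task model and split its outputs into
# 8 facial-expression logits plus valence and arousal.
# Assumptions (not taken from the notebook): torch.load restores the full model,
# and inputs are 224x224 RGB face crops normalized with ImageNet statistics.
import torch
from PIL import Image
from torchvision import transforms

PATH = '../models/affectnet_emotions/enet_b0_8_va_mtl.pt'
device = 'cuda' if torch.cuda.is_available() else 'cpu'

model = torch.load(PATH, map_location=device)
model.eval()

preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

img = Image.open('face_crop.jpg')   # hypothetical aligned face crop
x = preprocess(img).unsqueeze(0).to(device)

with torch.no_grad():
    out = model(x)[0]               # 10 values for this face

expression_logits = out[:8]         # logits for the 8 AffectNet expression classes
valence, arousal = out[8].item(), out[9].item()
print('predicted expression index:', expression_logits.argmax().item())
print('valence:', valence, 'arousal:', arousal)
```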
Closing due to inactivity
Hello again! I've read your paper and I've seen that you use the circumplex model's variables, arousal and valence. How do those variables appear in the code? I can't find them :( Thank you, Amaia