wagner-niklas / CAGE_expression_inference

Project to infer emotional expressions and benchmark datasets, by Niklas Wagner, Felix Mätzler, Samed R. Vossberg, Helen Schneider and Svetlana Pavlitska.
MIT License

Video inference script #2

Closed Varun-GP closed 4 months ago

Varun-GP commented 4 months ago

Thanks a lot for this wonderful work! I wanted to test the model on new videos/images. Can you please provide any live video expression inference?

wagner-niklas commented 4 months ago

Hi @Varun-GP, I uploaded a script called "inference_on_webcam.py" that runs a trained model (e.g. AffectNet8Combined) on your webcam. If you want to use the model on videos/images, just apply it directly to an image or to each video frame instead of reading from the webcam.

Remember to update the model path, your torch device (cuda, mps, ...) and the number of output neurons to match the model you use (e.g. the AffectNet8Combined model has 10 output neurons: 8 for the discrete classes Angry, Sad, etc., and two neurons for Valence and Arousal).
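To make that concrete, here is a rough sketch of per-frame video inference along those lines. It is not the repo's script: the model filename, the 224x224 input size, the ImageNet normalization, the class-label order, and the assumption that the 8 class neurons come before valence/arousal are all my guesses; adapt them to the model you actually trained.

```python
# Hedged sketch: run a trained 10-output model (8 classes + valence/arousal)
# on each frame of a video file instead of the webcam.
# Assumed AffectNet8 class order -- check against your own label map.
LABELS = ["Angry", "Contempt", "Disgust", "Fear",
          "Happy", "Neutral", "Sad", "Surprise"]

def split_output(out):
    """Split 10 output values into (discrete label, valence, arousal).
    Assumes the 8 class logits come first, then valence, then arousal."""
    logits = list(out[:8])
    valence, arousal = float(out[8]), float(out[9])
    label = LABELS[logits.index(max(logits))]
    return label, valence, arousal

if __name__ == "__main__":
    # Heavy dependencies kept inside the guard:
    # pip install torch torchvision opencv-python
    import cv2
    import torch
    from torchvision import transforms

    device = "cuda" if torch.cuda.is_available() else "cpu"
    # Hypothetical filename -- point this at your trained checkpoint.
    model = torch.load("affectnet8combined.pt", map_location=device)
    model.eval()

    preprocess = transforms.Compose([
        transforms.ToPILImage(),
        transforms.Resize((224, 224)),               # assumed input size
        transforms.ToTensor(),
        transforms.Normalize([0.485, 0.456, 0.406],  # assumed (ImageNet)
                             [0.229, 0.224, 0.225]), # normalization
    ])

    cap = cv2.VideoCapture("input_video.mp4")  # a file instead of the webcam
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)  # OpenCV loads BGR
        with torch.no_grad():
            out = model(preprocess(rgb).unsqueeze(0).to(device)).squeeze(0)
        label, val, aro = split_output(out.tolist())
        print(f"{label}  valence={val:+.2f}  arousal={aro:+.2f}")
    cap.release()
```

For a single image, the loop collapses to one `cv2.imread` plus the same preprocess/forward/split steps.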

Best regards, Niklas

Varun-GP commented 4 months ago

Thanks a ton!