UttaranB127 / speech2affective_gestures

This is the official implementation of the paper "Speech2AffectiveGestures: Synthesizing Co-Speech Gestures with Generative Adversarial Affective Expression Learning".
https://gamma.umd.edu/s2ag/
MIT License

Running on a custom dataset #22

Open harshita19244 opened 1 year ago

harshita19244 commented 1 year ago

Hi, how can I run the script on a custom dataset? I have been going through the code, but many parts look tailored to the TED dataset. If you could point me to the relevant code, I would be grateful.

UttaranB127 commented 1 year ago

Hi, yes, you can train and test on a custom dataset. You need to either match the pose representation that the network expects (10 upper-body joints) or change the network parameters to fit the pose representation you have. Start by looking at this script, which handles the pose representation: https://github.com/UttaranB127/speech2affective_gestures/blob/master/utils/ted_db_utils.py. If you plan to use a different pose representation, you would also need to change the network parameters in this script: https://github.com/UttaranB127/speech2affective_gestures/blob/master/net/multimodal_context_net_v2.py. In either case, you may need to tune the hyperparameters if you are re-training the network on a different dataset.
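
For concreteness, here is a minimal sketch of the first step: re-mapping a custom skeleton onto a 10-joint upper-body representation. The joint names, their ordering, and the root-centering step are all illustrative assumptions, not the repo's actual conventions; check utils/ted_db_utils.py for the joint set and preprocessing it really uses.

```python
import numpy as np

# Hypothetical mapping from a custom skeleton's joint names to the
# 10 upper-body joints the network expects. Both the names and the
# ordering here are assumptions -- verify against utils/ted_db_utils.py.
CUSTOM_TO_S2AG = {
    "spine":          0,
    "neck":           1,
    "head":           2,
    "left_shoulder":  3,
    "left_elbow":     4,
    "left_wrist":     5,
    "right_shoulder": 6,
    "right_elbow":    7,
    "right_wrist":    8,
    "nose":           9,
}

def to_s2ag_pose(frames, custom_joint_names):
    """Select and reorder joints from a custom capture.

    frames: (T, J, 3) array of 3-D joint positions from your dataset.
    custom_joint_names: list of length J naming each joint in `frames`.
    Returns a (T, 10, 3) array in the (assumed) expected joint order.
    """
    name_to_idx = {n: i for i, n in enumerate(custom_joint_names)}
    out = np.zeros((frames.shape[0], len(CUSTOM_TO_S2AG), 3), dtype=np.float32)
    for name, dst in CUSTOM_TO_S2AG.items():
        out[:, dst] = frames[:, name_to_idx[name]]
    # Root-center the poses so they are translation-invariant, a common
    # step in co-speech gesture pipelines (confirm against the repo's
    # own preprocessing before relying on it).
    out -= out[:, 0:1, :]
    return out
```

If you end up with a different joint count, the pose dimensionality consumed by the generator in net/multimodal_context_net_v2.py has to be updated to match; the exact parameter names depend on the repo version, so they are not shown here.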