c-he / NeMF

[NeurIPS 2022] Official implementation of "NeMF: Neural Motion Fields for Kinematic Animation"
MIT License

questions about the up axis #8

Status: Open · opened by xiaoxiongweimi 1 year ago

xiaoxiongweimi commented 1 year ago

Sorry, I have a question: why is the up axis of the SMPL skeleton z-up? When I print out the offsets of the SMPL skeleton, it looks y-up. Is there something you considered that I missed?

c-he commented 1 year ago

Hi! Though the skeleton is y-up, the AMASS dataset we use targets z-up. If you use its Blender add-on, you can see a special rotation is applied to adjust the orientation of AMASS data (https://gitlab.tuebingen.mpg.de/jtesch/smplx_blender_addon/-/blob/master/__init__.py#L1116).
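The y-up/z-up conversion mentioned above can be sketched as a 90° rotation about the x-axis. This is a minimal illustration, not the add-on's actual code:

```python
import numpy as np

# Rotation taking y-up coordinates into the z-up convention used by AMASS:
# a +90° rotation about the x-axis (a sketch, not the Blender add-on's code).
R_yup_to_zup = np.array([
    [1.0, 0.0, 0.0],
    [0.0, 0.0, -1.0],  # new y = -old z
    [0.0, 1.0, 0.0],   # new z =  old y
])

def to_z_up(points):
    """Rotate an (N, 3) array of y-up points into the z-up frame."""
    return points @ R_yup_to_zup.T

# A point along +y (the old "up" axis) maps to +z (the new "up" axis).
print(to_z_up(np.array([[0.0, 1.0, 0.0]])))
```

The matrix is a proper rotation (determinant +1), so bone lengths and joint angles are unchanged; only the world frame differs.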

xiaoxiongweimi commented 1 year ago

Thank you for your reply. If I only want to train the generative model (without the GMP model), should I set `output_trans` to `True` and the corresponding `global_output` to 18?

c-he commented 1 year ago

Yes. Also make sure to leave `pretrained_gmp` blank.
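Putting the settings from this exchange together, the relevant part of the training config might look like the following. This is a hypothetical sketch: only `output_trans`, `global_output`, and `pretrained_gmp` come from the thread; the flat layout and comments are assumptions, so match them against `configs/amass.yaml` in the repository.

```yaml
output_trans: true   # predict global translation (per the discussion above)
global_output: 18    # corresponding output dimension mentioned above
pretrained_gmp: ''   # leave blank to train the generative model without a GMP
```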

xiaoxiongweimi commented 1 year ago

I trained the model on our dance data and can see a gradual reduction in all losses except the trans loss, whose curve is very abnormal. According to the scripts, the trans loss is computed from the output root velocity and root height rather than a directly output `trans`, so could that explain this curve?
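The trans reconstruction described above (root velocity plus root height instead of a direct `trans` output) can be sketched as follows. The shapes, the xy-velocity layout, and the `fps` argument are assumptions, not NeMF's exact code:

```python
import torch

def integrate_root_trajectory(root_vel, root_height, fps=30.0):
    """Recover global translation from root velocity and root height (a sketch).

    root_vel:    (T, 2) planar root velocity in units per second (assumed layout)
    root_height: (T, 1) root height above the ground plane
    returns:     (T, 3) global translation in a y-up convention (assumption)
    """
    dt = 1.0 / fps
    xz = torch.cumsum(root_vel * dt, dim=0)  # integrate planar velocity over time
    # Height is taken directly from the network; only the planar part is integrated.
    return torch.stack([xz[:, 0], root_height[:, 0], xz[:, 1]], dim=1)
```

Because the planar translation is integrated from velocity, any systematic velocity error accumulates over the sequence, which is one reason a trans loss curve can look much worse than the per-frame losses.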

c-he commented 1 year ago

That looks a bit weird. You said you trained the model on your own dance data; could you check whether that data is y-up or z-up?

xiaoxiongweimi commented 1 year ago

Our data are y-up, and I modified the config in the YAML file accordingly. I think something must have gone wrong when I preprocessed our data. Sigh~~~

xiaoxiongweimi commented 1 year ago

I solved the trans loss problem: it was caused by the parameter dimensions I passed when calling `estimate_angular_velocity` and `estimate_linear_velocity`. Could you also tell me the difference between the GMP dataset and the generative dataset? They seem related but not exactly the same. I currently use the same dataset to train both the GMP and the generative model, but the result drifts too far when the GMP model predicts `trans`.
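For reference, the kind of dimension-sensitive velocity estimation mentioned above can be sketched with finite differences. The `(B, T, J, 3)` layout and the `dt` argument are assumptions, so check them against the actual `estimate_linear_velocity` signature in the repository:

```python
import torch

def linear_velocity(positions, dt):
    """Estimate per-joint linear velocity by finite differences (a sketch).

    positions: (B, T, J, 3) joint positions; the time axis must be dim 1,
               which is exactly the kind of layout mistake that can silently
               corrupt the trans loss.
    returns:   (B, T, J, 3) velocities.
    """
    v = torch.zeros_like(positions)
    v[:, 1:-1] = (positions[:, 2:] - positions[:, :-2]) / (2.0 * dt)  # central diff
    v[:, 0] = (positions[:, 1] - positions[:, 0]) / dt                # forward diff
    v[:, -1] = (positions[:, -1] - positions[:, -2]) / dt             # backward diff
    return v
```

If the time axis is accidentally passed in another position, the differences are taken across joints or coordinates instead of frames, producing a loss curve like the one described.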

c-he commented 1 year ago

Glad to know you solved your problem! Yes, the processing steps for the GMP and generative datasets are a bit different. When processing AMASS data for our generative setup, we used the configuration provided in https://github.com/c-he/NeMF/blob/main/configs/amass.yaml, while for training our GMP we set `unified_orientation` to `False`, since we need the global orientation to correctly predict global translations. Therefore, when combining these two trained models, you can see we insert the root orientation to obtain joint rotations and positions, and then compute the remaining GMP inputs from them: https://github.com/c-he/NeMF/blob/146a1eade5dd7eb77db8380c7f03adf99bfb09a2/src/nemf/generative.py#L233-L251
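The "insert the root orientation" step above can be sketched roughly as composing a predicted global root rotation onto the root joint before forward kinematics. The rotation-matrix layout `(B, T, J, 3, 3)` and the composition order are assumptions; the linked `generative.py` lines are the authoritative version:

```python
import torch

def apply_root_orientation(root_orient, local_rot):
    """Compose a global root orientation onto per-joint rotations (a sketch).

    root_orient: (B, T, 3, 3) global root rotation (e.g. from the GMP setup)
    local_rot:   (B, T, J, 3, 3) joint rotations whose root is identity after
                 unified-orientation processing (assumption)
    """
    out = local_rot.clone()
    # Only the root joint changes; child joints are relative to their parents.
    out[:, :, 0] = torch.matmul(root_orient, local_rot[:, :, 0])
    return out
```

After this composition, joint positions from forward kinematics carry the global orientation, which is what the GMP needs to predict a consistent global translation.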

xiaoxiongweimi commented 1 year ago

Thank you~ When combining these two models, wouldn't it be more reasonable to denormalize the generative output first and then compute the GMP input from the denormalized data, since the GMP normalizes its input at prediction time just as it does during training?

c-he commented 1 year ago

In our setup, we only normalize the input data and let the network output unnormalized results, so no denormalization step is needed when combining the two models.
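The input-only normalization scheme described above can be sketched as follows: statistics come from the training inputs and are applied to inputs only, while outputs are consumed as-is. The class and attribute names here are illustrative, not NeMF's:

```python
import torch

class InputNormalizer:
    """Normalize network inputs with training-set statistics (a sketch).

    The network is trained to emit targets in their original scale, so its
    outputs need no corresponding denormalization step.
    """

    def __init__(self, train_inputs):      # train_inputs: (N, D)
        self.mean = train_inputs.mean(dim=0)
        self.std = train_inputs.std(dim=0).clamp_min(1e-8)  # avoid divide-by-zero

    def __call__(self, x):
        return (x - self.mean) / self.std
```

Because only the input side is standardized, chaining the two models just means feeding one model's raw output through the next model's input normalizer, exactly as at training time.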