Closed: Aman-Goel1 closed this issue 1 year ago.
We just double the number of epochs. You can find it here: https://github.com/c-he/NeMF/blob/146a1eade5dd7eb77db8380c7f03adf99bfb09a2/src/train_basic.py#L43
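As a rough illustration of that scaling rule (this helper is hypothetical, not code from the NeMF repository), the iteration count grows proportionally with sequence length, starting from 500 iterations for 32 frames:

```python
# Hypothetical sketch: scale the number of training iterations
# proportionally with sequence length, per the paper's description
# (500 iterations for a 32-frame sequence, doubled when the length doubles).
def n_iterations(seq_len, base_len=32, base_iters=500):
    """Return the number of training iterations for a given sequence length."""
    return base_iters * seq_len // base_len

print(n_iterations(64))   # twice the base length -> twice the iterations
```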
Thanks a lot for the prompt reply!
Hey, I had another question and didn't want to open another issue. For each sequence length beyond 32, say 64, do you also double the frame rate used to compute dt? Also, how is the step size used during training and testing? I was unclear about these parameters.
Hi, we fixed the frame rate at 30 fps during training for all sequence lengths. It is changed during testing to validate the smoothness of our results, since we learned a continuous representation. The step size is the parameter used to change the frame rate: a step size of 1 corresponds to the original frame rate of 30 fps, a step size of 0.5 corresponds to a frame rate of 30/0.5 = 60 fps, etc. You can find the corresponding code here:
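The relationship between step size and effective frame rate can be sketched as follows (a minimal illustration assuming a training rate of 30 fps; the function name is hypothetical, not from the NeMF codebase):

```python
# Hypothetical sketch: how a step size maps to the time stamps at which a
# continuous motion representation (trained at 30 fps) is queried.
TRAIN_FPS = 30

def sample_times(n_frames, step_size=1.0):
    """Return time stamps (seconds) for querying the continuous representation.

    step_size = 1.0 -> the original 30 fps; step_size = 0.5 -> 60 fps, etc.
    """
    dt = step_size / TRAIN_FPS          # time between consecutive queried frames
    return [i * dt for i in range(n_frames)]

# A step size of 0.5 halves dt, doubling the effective frame rate to 60 fps.
effective_fps = 1.0 / sample_times(2, step_size=0.5)[1]
print(effective_fps)
```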
Hi, thanks for answering these questions! I was wondering whether you had also tried training on different frame rates, to provide a better continuous representation at test time.
No, we didn't try this experiment. Since we pre-normalized all sequences to 30 fps, we don't have ground-truth data at other frame rates for training. But if you don't pre-process the AMASS data, there are sequences with different frame rates, and you can try training on those to see if it gives better results.
Hi, thanks for the detailed answers. I might look into this in the future for better up-sampling representations.
Hi, thanks for the amazing work! I was wondering how many epochs you trained for each sequence length. The supplementary material for NeMF says: "We train our single-motion NeMF for 500 iterations to fit a 32-frame sequence, and scale the number of iterations proportionally as the sequence length increases to make sure that our model is sufficiently trained for each length of sequences." From my understanding, you trained for 500 epochs on a sequence length of 32, and used a proportional number of epochs for each longer sequence length. What proportional number of epochs was used in this case?