PyTorch implementation of our ICCV 2019 paper: Liquid Warping GAN: A Unified Framework for Human Motion Imitation, Appearance Transfer and Novel View Synthesis
Hi, thanks for your interesting work.
In demo_imitator.py, you transfer motions from Mixamo SMPLs (e.g. ./assets/samples/refs/mixamo) to images. I guess these motions were obtained from Adobe Mixamo https://www.mixamo.com/#/?page=1&type=Motion%2CMotionPack. I notice that the result.pkl file contains 3D camera parameters for each frame (see the sketch below). Do you have any idea where these parameters come from?
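For reference, this is roughly how I am inspecting the per-frame camera parameters. It is only a minimal sketch: the exact path and the key names ('cams', 'pose', 'shape') are my assumptions about the pickle's layout, not something confirmed from the repo.

```python
import pickle
import numpy as np

# Assumed path: a motion pickle under ./assets/samples/refs/mixamo
PKL_PATH = "./assets/samples/refs/mixamo/result.pkl"

with open(PKL_PATH, "rb") as f:
    result = pickle.load(f)

# List the stored entries (key names below are assumptions)
print(result.keys())

# Assumed per-frame camera parameters, e.g. a (num_frames, 3) array
# of weak-perspective parameters (scale, tx, ty)
cams = np.asarray(result["cams"])
print(cams.shape)
print(cams[0])  # camera parameters of the first frame
```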
Thanks in advance.