eth-siplab / AvatarPoser

Official Code for ECCV 2022 paper "AvatarPoser: Articulated Full-Body Pose Tracking from Sparse Motion Sensing"
https://siplab.org/projects/AvatarPoser
MIT License

Test on a Commercial VR System #20

Open Combo1 opened 1 year ago

Combo1 commented 1 year ago

Hello,

Fantastic work! I have a question about the live recordings with the HTC Vive HMD and the subsequent pose prediction.

So you trained a Transformer network + MLP and implemented an IK module and an FK module. To get a pose out of the Transformer you need to feed it an input vector. How did you generate that vector from the HMD/controller signals so that the network could predict a pose? And is there a file in your project I could use to try to replicate your results?
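From the paper, my understanding is that each input frame is assembled directly from the tracked devices' signals rather than from a learned latent code: for the head and the two controllers, the global orientation (6D rotation representation), the rotational velocity, the position, and the positional velocity. Below is a minimal sketch of how such a per-frame feature could be built under that assumption; the function names (`build_input_window`, etc.) are my own, not identifiers from this repo.

```python
# Sketch only: my guess at the per-frame input features, not the authors' code.
# Per device: 6D orientation + 6D rotational velocity + 3D position + 3D velocity
# = 18 values; head + two controllers = 54 values per frame.

import numpy as np

def rotmat_to_6d(R):
    """First two columns of a 3x3 rotation matrix, flattened column-major (6D rep.)."""
    return R[:, :2].reshape(-1, order="F")

def device_features(R_t, R_prev, p_t, p_prev):
    """18-D feature for one tracked device at one frame."""
    rot6d = rotmat_to_6d(R_t)                 # global orientation
    rot_vel6d = rotmat_to_6d(R_t @ R_prev.T)  # frame-to-frame relative rotation
    vel = p_t - p_prev                        # positional velocity
    return np.concatenate([rot6d, rot_vel6d, p_t, vel])

def build_input_window(rotations, positions):
    """
    rotations: (T, 3, 3, 3) rotation matrices for [head, left, right] over T frames
    positions: (T, 3, 3)    positions for [head, left, right] over T frames
    Returns a (T-1, 54) array of per-frame features.
    """
    frames = []
    for t in range(1, rotations.shape[0]):
        feats = [device_features(rotations[t, d], rotations[t - 1, d],
                                 positions[t, d], positions[t - 1, d])
                 for d in range(3)]
        frames.append(np.concatenate(feats))
    return np.stack(frames)
```

Is this roughly what you feed to the Transformer, i.e. a sliding window of such frames, and if so, which script in the repo does this preprocessing for live tracker data?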

Thanks!

Recialhot commented 6 months ago

Hello, I would like to ask whether you have found a solution yet. Thank you!