eth-siplab / AvatarPoser

Official Code for ECCV 2022 paper "AvatarPoser: Articulated Full-Body Pose Tracking from Sparse Motion Sensing"
https://siplab.org/projects/AvatarPoser
MIT License

Baseline implementations #3

Open sadegh-aa opened 1 year ago

sadegh-aa commented 1 year ago

Hi authors,

This is a great paper! Congratulations! Since you have put effort into implementing the existing methods whose code is not publicly available, it would greatly benefit the community if you could also share those implementations, allowing others to make (qualitative) comparisons against the other baselines as well.

Do you have any plan to release at least the checkpoint, the model definition, and the generation script for other baselines?

Thanks

jiaxi-jiang commented 1 year ago

Hi, thanks for your interest in our work! For sure, we are still sorting out the code and will release the baseline models asap.

sadeghaa commented 1 year ago

Great! Looking forward to it then :)

goatchurchprime commented 1 year ago

The paper https://arxiv.org/pdf/2207.13784.pdf doesn't say which engine you implemented these experiments in.

Have you looked at the Godot engine for this? It has a full-time VR core developer and is fully open source, so you can hook in and instrument your algorithms with more replicability than against a closed-source behemoth.