sanweiliti / RoHM

The official PyTorch code for RoHM: Robust Human Motion Reconstruction via Diffusion.
https://sanweiliti.github.io/ROHM/ROHM.html

Sharing initial motions #16

Closed enesduran closed 1 week ago

enesduran commented 4 weeks ago

Thank you for the great work!

When are you intending to share the initial motions from the regressors? The LEMO-based initial motion data requires permission for access; I sent a request but have not received a response.

Thanks

enesduran commented 4 weeks ago

Thank you for updating the initial motions. When I run the following command for evaluation with your initial motion values, `python test_prox_egobody.py --config=cfg_files/test_cfg/prox_rgb.yaml --recording_name=RECORDING_NAME`, I get the results below. The keypoints seem fine, but the initial motion is off; maybe it needs a global alignment. Is it a problem on my end? Could you check?

Best

[Screenshots: s001_frame_00172]

sanweiliti commented 4 weeks ago

Thanks for the feedback, I will look into it. Could you specify which sequence you tested on?

enesduran commented 4 weeks ago

It was MPH1Library_00034_01.

sanweiliti commented 3 weeks ago

Hi, I tested it on my side and the results seem to be correct. I'm not sure what caused the case in your image. For debugging purposes, I would suggest visualizing the init motions (they are in the same format as the original PROX pseudo ground-truth sequences) with the original PROX dataset repo's visualization script, to see if they align with the image.
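As a first debugging step, it can help to just inspect the per-frame fitting files before rendering anything. The sketch below is a minimal, hypothetical example of round-tripping a PROX-style pickled parameter dict; the exact key names (`transl`, `global_orient`, `body_pose`, `betas`) and shapes are assumptions based on common SMPL-X fitting layouts, so check them against the actual files.

```python
import os
import pickle
import tempfile

# Hypothetical PROX-style fitting: a pickled dict of SMPL-X parameters.
# Key names and dimensions here are assumptions, not the repo's spec.
dummy_fitting = {
    "transl": [[0.1, 0.2, 2.5]],         # root translation in camera coords (m)
    "global_orient": [[0.0, 3.1, 0.0]],  # axis-angle root rotation
    "body_pose": [[0.0] * 63],           # 21 body joints x 3 (axis-angle)
    "betas": [[0.0] * 10],               # shape coefficients
}

# Write and reload one frame, mimicking how an init-motion pkl is inspected.
path = os.path.join(tempfile.gettempdir(), "000001.pkl")
with open(path, "wb") as f:
    pickle.dump(dummy_fitting, f)

with open(path, "rb") as f:
    params = pickle.load(f)

# Print each parameter's per-frame dimensionality as a quick sanity check.
for key, value in params.items():
    print(key, len(value[0]))
```

If the loaded translations are wildly different from the keypoint depths, that points to a coordinate-frame (world vs. camera) mismatch rather than a rendering problem.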

enesduran commented 3 weeks ago

Okay, thank you, I will try to visualize the init motions with the PROX vis code. One thing that comes to mind: I used the EGL backend for pyrender. Could that be the reason?

sanweiliti commented 2 weeks ago

Hmm, I'm not quite sure. You can disable the rendering and use the `visualize` flag to visualize via Open3D instead, to see if the mesh/skeleton looks okay from the camera view.
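Independent of Open3D, a quick numerical sanity check for "does the body sit in front of the camera" is to project a joint with a pinhole model and see whether it lands inside the image. The helper and the intrinsics below are purely illustrative (roughly Kinect-style values); the real calibration lives in the dataset's camera files.

```python
def project_point(point_cam, fx, fy, cx, cy):
    """Project a 3D point in camera coordinates to pixel coordinates
    with a simple pinhole model (no distortion)."""
    x, y, z = point_cam
    if z <= 0:
        return None  # behind the camera, not visible
    return (fx * x / z + cx, fy * y / z + cy)

# Hypothetical intrinsics and image size, not the dataset's actual calibration.
fx, fy, cx, cy = 1060.0, 1060.0, 960.0, 540.0
width, height = 1920, 1080

# A pelvis joint ~2.5 m in front of the camera should land inside the frame.
uv = project_point((0.1, 0.2, 2.5), fx, fy, cx, cy)
inside = uv is not None and 0 <= uv[0] < width and 0 <= uv[1] < height
print(uv, inside)
```

If the projected root joint falls far outside the frame (or behind the camera), the init motion is in the wrong coordinate frame, which matches the "needs a global alignment" symptom.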