Closed: nehc0 closed this issue 2 months ago.
Thanks for your question. Maybe you can test with my visualization script, since I also cannot reproduce the issue here.
Did you crop the images or apply distortion correction before visualization for the allocentric views?
Distortion is not a major factor in the allocentric views. My visualization script first loads the data and then applies the visualization; maybe you can just try that data loader.
Solved. I made a stupid mistake of using the 'neutral' rather than the gendered SMPL-X model for all meshes. After fixing it (see the loading sketch after the images below), the rendered results look much more acceptable:
[s01/box_use_02/6-fix]
[s01/espressomachine_grab_01/7-fix]
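
For anyone hitting the same issue, here is a minimal sketch of building per-gender body models with the `smplx` package. It assumes the standard SMPL-X model files are available under a local `models` folder; the variable names are illustrative, not this repo's exact loader.

```python
import smplx

# Folder containing smplx/SMPLX_{MALE,FEMALE,NEUTRAL}.npz (assumed layout).
model_path = "models"

def build_smplx(gender: str):
    # Use the subject's annotated gender instead of 'neutral' for every mesh;
    # the neutral shape space produces noticeably different body shapes.
    return smplx.create(
        model_path,
        model_type="smplx",
        gender=gender,   # "male" / "female", not "neutral"
        use_pca=False,
        batch_size=1,
    )

body_model = build_smplx("female")
```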
Hello. I use my own visualization pipeline for the allocentric views, transforming the SMPL-X mesh vertices from world coordinates to pixel coordinates with a pinhole camera model, using the extrinsic and intrinsic matrices you provide in `meta/misc.json` (a projection sketch is included after the examples below). However, I observe that in some cases the projected meshes do not overlay well with the humans in the images. Is this a problem on my end? Some examples:
[s01/box_use_02/6]
[s01/espressomachine_grab_01/7]
[s08/capsulemachine_use_01/5]
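
For reference, this is roughly the projection I am doing, as a minimal sketch under a pinhole model with no lens distortion. It assumes the extrinsic matrix maps world coordinates to camera coordinates (if `meta/misc.json` stores camera-to-world instead, it must be inverted first); the function and variable names here are illustrative, not the dataset's exact schema.

```python
import numpy as np

def project_points(verts_world: np.ndarray,  # (N, 3) SMPL-X vertices in world coords
                   K: np.ndarray,            # (3, 3) intrinsic matrix
                   extrinsic: np.ndarray     # (4, 4) world-to-camera transform (assumed)
                   ) -> np.ndarray:
    # Homogenize the vertices and move them into the camera frame.
    verts_h = np.hstack([verts_world, np.ones((verts_world.shape[0], 1))])  # (N, 4)
    verts_cam = (extrinsic @ verts_h.T).T[:, :3]                            # (N, 3)

    # Apply intrinsics and perform the perspective divide (no distortion applied).
    uv = (K @ verts_cam.T).T                 # (N, 3)
    uv = uv[:, :2] / uv[:, 2:3]              # (N, 2) pixel coordinates
    return uv
```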