hongsukchoi / Pose2Mesh_RELEASE

Official Pytorch implementation of "Pose2Mesh: Graph Convolutional Network for 3D Human Pose and Mesh Recovery from a 2D Human Pose", ECCV 2020

How can I get the *.npy file for the demo #41

Closed AndyVerne closed 2 years ago

AndyVerne commented 2 years ago

Hello, much appreciate your amazing work!

I have one question: how can I generate the *.npy file that the demo uses to produce the mesh *.obj file?

Also, suppose I have 3D poses given in a Cartesian coordinate system (x, y, z axes), like the image below. How can I convert such an original 3D pose into the *.npy file?

[image: 3D pose in xyz coordinates]

hongsukchoi commented 2 years ago

Hi!

I am not sure which answer would fit your case best, but here are three suggestions.

1) To get the input 2D pose for Pose2Mesh, use an off-the-shelf 2D pose estimator like this.

2) You can save your own 2D/3D pose as a .npy file with

np.save('file_name.npy', pose)

(see the short sketch after these suggestions).

3) Make sure that the input pose (joints) follows the joint topology here.
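
For concreteness, here is a minimal sketch of suggestion 2, assuming a 17-joint 2D pose in image (pixel) coordinates; the array values and the file name my_pose_input.npy are only placeholders.

```python
import numpy as np

# Hypothetical 2D pose: 17 joints, each an (x, y) pixel coordinate.
# The joint order has to follow the topology linked in suggestion 3;
# the values below are placeholders only.
pose_2d = np.zeros((17, 2), dtype=np.float32)
pose_2d[0] = [512.0, 384.0]  # e.g. the root/pelvis joint in image coordinates

# Save it as the .npy file that the demo takes as input.
np.save('my_pose_input.npy', pose_2d)

# Sanity check: reload and confirm the shape.
loaded = np.load('my_pose_input.npy')
print(loaded.shape)  # (17, 2)
```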

AndyVerne commented 2 years ago

@hongsukchoi Thanks for the suggestions, I will try them later.

BTW, does demo.py actually expect a 2D skeleton pose? After loading h36m_joint_input.npy, I found that its shape is (17, 2) instead of (17, 3).

[EDITED]: Sorry to have bothered you; I just saw the comment 'path of input 2D pose' in demo/run.py.
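
For anyone else who lands here: a quick way to see what the demo expects is to inspect the sample input directly. This is just a sketch; the path below assumes the file sits in the demo folder, so adjust it as needed.

```python
import numpy as np

# Assumed location of the sample input shipped with the repo; change the
# path if the file lives elsewhere.
joints = np.load('demo/h36m_joint_input.npy')

# Reported in this thread as (17, 2): 17 joints with (x, y) image
# coordinates, i.e. a 2D input pose rather than a 3D one.
print(joints.shape, joints.dtype)
```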