zengwang430521 / DecoMR

Repository for the paper "3D Human Mesh Regression with Dense Correspondence"
Apache License 2.0
168 stars 23 forks

Some questions about SMPL parameter of Human36m #19

Closed onepiece666 closed 3 years ago

onepiece666 commented 3 years ago

1. If I use the SMPL parameters generated by SMPLify-X, should I apply the 'trans' matrix generated by SMPLify-X to get the 'sampled_vertices', or can I use SMPL(pose, betas, trans) to get the 'sampled_vertices'?
2. How do we get the camera-coordinate joints from the 'sampled_vertices' generated by SMPL()? Should I use 'self.smpl.get_train_joints(sampled_vertices)'? But that never uses 'J_regressor_h36m.npy'.

I've only recently started working in this research direction, and I'm looking forward to your reply. Thank you.
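For reference, regressing a joint set from mesh vertices with a precomputed regressor matrix (which is what files like 'J_regressor_h36m.npy' are for) can be sketched as below. The function name, shapes, and the toy regressor are illustrative assumptions, not DecoMR's actual helpers:

```python
import numpy as np

def regress_joints(vertices, regressor):
    """Regress joints from mesh vertices.

    vertices:  (B, 6890, 3) SMPL vertex batch
    regressor: (J, 6890) sparse linear regressor (e.g. 17 H36M joints)
    returns:   (B, J, 3) joint locations
    """
    return np.einsum('jv,bvc->bjc', regressor, vertices)

# Toy check: a one-hot "regressor" where each joint copies one vertex.
verts = np.random.rand(2, 6890, 3)
reg = np.zeros((17, 6890))
reg[np.arange(17), np.arange(17)] = 1.0
joints = regress_joints(verts, reg)
assert joints.shape == (2, 17, 3)
```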

onepiece666 commented 3 years ago

(screenshot attached) When I train the network to epoch 8, I see problems at the knees and ankles. Is this a problem with my dataset? I use the SMPL parameters generated by SMPLify-X (ignoring 'trans'), use 'self.smpl.get_train_joints(sampled_vertices)' to get 'sampled_joints_3d', and compute the loss between 'sampled_joints_3d' and 'gt_keypoints_3d' in the camera coordinate system.

zengwang430521 commented 3 years ago

Hi, to get the 'gt_vertices', you do not need to use the 'trans' matrix. In our framework, the translation is realized by the camera parameters.

When calculating the loss on 3D joints, we use coordinates relative to the pelvis, so you can omit the translation between the SMPL model and the camera coordinate frame.
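A minimal sketch of this pelvis-relative alignment before the 3D joint loss (assuming here, for illustration, that the pelvis is joint index 0; the actual index depends on the joint convention in use):

```python
import numpy as np

def align_to_pelvis(joints_3d, pelvis_idx=0):
    """Subtract the pelvis joint so predicted and GT joints share an origin.

    joints_3d: (B, J, 3) -> (B, J, 3) pelvis-centered
    """
    return joints_3d - joints_3d[:, pelvis_idx:pelvis_idx + 1, :]

# A pure translation between prediction and ground truth cancels out:
pred = np.random.rand(4, 24, 3)
gt = pred + np.array([0.1, -0.2, 0.3])  # same pose, globally translated
loss = np.abs(align_to_pelvis(pred) - align_to_pelvis(gt)).mean()
# after alignment the translation is gone, so the loss is ~0
```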

According to your image, it seems that the left and right 2D leg joints (hips, knees and ankles) are swapped, which causes the error.
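One hypothetical way to fix such annotations is to swap each left/right joint pair. The index pairs below are illustrative only; the real mapping depends on your keypoint format:

```python
import numpy as np

# Illustrative (right, left) index pairs for hips, knees, ankles.
LR_PAIRS = [(1, 4), (2, 5), (3, 6)]

def swap_left_right(keypoints):
    """Swap each left/right joint pair in a (J, D) keypoint array."""
    fixed = keypoints.copy()
    for r, l in LR_PAIRS:
        fixed[[r, l]] = fixed[[l, r]]
    return fixed

kp = np.arange(14 * 2, dtype=float).reshape(14, 2)
swapped = swap_left_right(kp)
# swapping is an involution: applying it twice restores the original
assert np.allclose(swap_left_right(swapped), kp)
```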

onepiece666 commented 3 years ago

Thank you for your quick reply. I will try your suggestion. Thank you.