Closed LeCongThuong closed 4 weeks ago
The hand pose is part of the output of the network here. You can access it through `out['pred_mano_params']['hand_pose']` and export/save it.
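For example, the parameters could be dumped to a pickle file right after inference (a minimal sketch; `out` here is a mock with random tensors standing in for the real output dict of `demo.py`, so the snippet is self-contained):

```python
import pickle
import torch

# Mock of the network output from demo.py; in practice use the real `out`.
out = {"pred_mano_params": {
    "hand_pose": torch.rand(1, 15, 3, 3),     # per-finger joint rotations
    "global_orient": torch.rand(1, 1, 3, 3),  # wrist orientation
    "betas": torch.rand(1, 10),               # shape parameters
}}

# Detach, move to CPU, and convert to numpy before pickling.
params = {k: v.detach().cpu().numpy() for k, v in out["pred_mano_params"].items()}

with open("hand_params.pkl", "wb") as f:
    pickle.dump(params, f)

with open("hand_params.pkl", "rb") as f:
    loaded = pickle.load(f)
print(loaded["hand_pose"].shape)  # (1, 15, 3, 3)
```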
Sorry for replying late!
With your guidance, I can extract the left and right hand poses in rotation-matrix format (15x3x3). However, I ran into another problem; I hope you can help or give advice.
Context: I used SMPLer-X (link) to reconstruct the whole-body mesh. However, the hands in the SMPLer-X output are poor, while your work (HaMeR) gives very good results (Figure 1). So my idea is to replace the SMPLer-X hand mesh with the HaMeR output.
To do: I first converted the hand pose output of your work (rotation matrices) to axis-angle format and replaced the values of "left_hand_pose" and "right_hand_pose" in SMPLer-X with the new output. However, I got an unexpected/bad result (the left hand is very bad; see Figure 2).
Hope that you can help or give advice.
I believe you need to take the parameters `out['pred_mano_params']['hand_pose']` for the left hand only, convert them to the axis-angle representation, and then flip them using a function like this. The reason is that HaMeR processes only right hands, so when we have a left hand as input, we flip the bounding box and treat it as a right hand instead. This means that the `out['pred_mano_params']['hand_pose']` output corresponds to the "right hand" of the flipped image.
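Concretely, the convert-and-flip step could look like the following sketch (using SciPy for the rotation conversion; the function name is illustrative, not HaMeR's actual helper):

```python
import numpy as np
from scipy.spatial.transform import Rotation as R

def flip_left_hand_pose(rotmats: np.ndarray) -> np.ndarray:
    """Convert (15, 3, 3) rotation matrices to axis-angle and mirror them.

    Mirroring about the x-axis negates the y and z components of each
    axis-angle vector, turning the "right hand" pose predicted on the
    flipped image back into a left hand pose.
    """
    aa = R.from_matrix(rotmats).as_rotvec()  # (15, 3) axis-angle
    aa[:, 1:] *= -1                          # negate y and z components
    return aa

# Example: identity rotations stay identity after flipping.
left_aa = flip_left_hand_pose(np.stack([np.eye(3)] * 15))
print(left_aa.shape)  # (15, 3)
```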
Thank you for replying!
Here is the .pkl (output) file I extracted from your model's output ([link](https://github.com/geopavlakos/hamer/blob/dc19e5686198a7c3fc3938bff3951f238a85fd11/demo.py#L134)). It contains hand_pose, global_orientation, beta, etc.
I followed your instructions: first, I extracted the left hand only (shape 15x3x3); then I converted it to the axis-angle representation (shape 15x3); and finally I flipped it using the following code (it changes the rows of the left hand pose):

```python
hand_pose[1::3] *= -1
hand_pose[2::3] *= -1
```

I got the same bad result.
However, I don't think flipping alone can solve the problem, as you can see in the two pictures above:
Do you have any code snippet that can reconstruct the full hand mesh from the hand pose parameters, so I can spot where the problem is?
Thank you again; I hope to receive more advice from you!
Could you share the visualization of the hands after flipping the parameters of the left hand? The local hand pose of the left hand (i.e., the pose of the fingers) should be the same as the local hand pose in the HaMeR result (which is not the case now). That being said, the exact location of the fingers might not match, because it depends on the wrist pose and orientation, which is defined by the estimated body pose parameters.
The vertices through MANO are estimated here.
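If you want to sanity-check the parameters independently of SMPL-X, one option is to run them through the `smplx` package's MANO layer directly (a sketch, assuming a local MANO model file; the path is a placeholder and the model call is guarded so the shape-preparation part runs without it):

```python
import os
import torch
from scipy.spatial.transform import Rotation as R

MANO_MODEL_DIR = "models"  # placeholder: directory containing MANO_RIGHT.pkl

# Rotation-matrix hand pose (15, 3, 3), e.g. loaded from the saved .pkl;
# random here so the snippet is self-contained.
hand_pose_rotmat = R.random(15).as_matrix()

# With use_pca=False, smplx expects flattened axis-angle of shape (1, 45).
hand_pose_aa = R.from_matrix(hand_pose_rotmat).as_rotvec().reshape(1, 45)
hand_pose = torch.tensor(hand_pose_aa, dtype=torch.float32)

if os.path.exists(os.path.join(MANO_MODEL_DIR, "MANO_RIGHT.pkl")):
    import smplx
    mano = smplx.create(MANO_MODEL_DIR, model_type="mano",
                        is_rhand=True, use_pca=False)
    output = mano(hand_pose=hand_pose,
                  global_orient=torch.zeros(1, 3),
                  betas=torch.zeros(1, 10))
    vertices = output.vertices.detach().numpy()[0]  # (778, 3) MANO mesh
    print(vertices.shape)
else:
    print(hand_pose.shape)  # torch.Size([1, 45])
```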
After following your guidance, the result is the same as before (I don't know why!). Figures 3 and 4 show the result from different perspectives.
For better inspection, let me describe my visualization process as follows:
```python
# NOTE: SMPLLayer and SMPLSequence live in aitviewer (not smplx), so the
# imports below are corrected accordingly.
import numpy as np
from aitviewer.models.smpl import SMPLLayer
from aitviewer.renderables.smpl import SMPLSequence
from aitviewer.viewer import Viewer

npz_file_path = "PATH_TO_PREPARED_NPZ"
results = dict(np.load(npz_file_path, allow_pickle=True))

body_pose = results["body_pose"].flatten().reshape((1, -1))
left_hand = results["left_hand_pose"].reshape(1, -1)    # From HaMeR model
right_hand = results["right_hand_pose"].reshape(1, -1)  # From HaMeR model
beta = results["betas"].reshape(1, -1)
poses_jaw = results["jaw_pose"].reshape(1, -1)
expression = results["expression"].reshape(1, -1)
leye = results["leye_pose"].reshape(1, -1)
reye = results["reye_pose"].reshape(1, -1)

smpl_layer = SMPLLayer(model_type="smplx", gender="female",
                       poses_jaw=poses_jaw, poses_leye=leye, poses_reye=reye)
print(smpl_layer.bm.NUM_BODY_JOINTS, body_pose.shape)

smpl_seq = SMPLSequence(poses_body=body_pose, betas=beta,
                        poses_left_hand=left_hand, poses_right_hand=right_hand,
                        smpl_layer=smpl_layer)

v = Viewer()
v.scene.add(smpl_seq)
v.run()
```
That is my entire process so far. I appreciate your support! Any advice or comments?
Thank you so much!
Hmm, after the flipping, the hand pose of the left hand should not look the same as before. Are you sure you are using the left hand pose parameters after flipping?
Thanks for sharing your great work!
I have a question: I want to create a .pkl file to import into the SMPL-X Blender add-on as an SMPL-X pose.
In the .pkl, there are two values I want to get from your work: "left_hand_pose" and "right_hand_pose".
At present, when I run demo.py, I only get the .obj files of the left and right hands. How can I get the left_hand_pose and right_hand_pose parameters?
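For the Blender add-on use case, writing such a .pkl could look like the sketch below (the key names and array shapes are my assumption of what the add-on expects and should be checked against the add-on's documentation; zeros stand in for the actual HaMeR-derived poses):

```python
import pickle
import numpy as np

# Assumed layout: 45 = 15 joints x 3 axis-angle components per hand.
# Zeros are placeholders for the HaMeR-derived poses.
pose_dict = {
    "left_hand_pose": np.zeros((1, 45), dtype=np.float32),   # flipped HaMeR output
    "right_hand_pose": np.zeros((1, 45), dtype=np.float32),  # HaMeR output
}

with open("smplx_pose.pkl", "wb") as f:
    pickle.dump(pose_dict, f)
```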