Closed zhou595 closed 4 years ago
The 118 joints follow the OpenPose ordering: https://github.com/CMU-Perceptual-Computing-Lab/openpose/blob/master/doc/output.md#keypoint-ordering-in-cpython. This code contains the order in which the joints are read: https://github.com/vchoutas/smplify-x/blob/master/smplifyx/data_parser.py.
Thanks @Anirudh257 for pointing to the code. I will close this now.
The 118 joints follow the openpose ordering. https://github.com/CMU-Perceptual-Computing-Lab/openpose/blob/master/doc/output.md#keypoint-ordering-in-cpython. This code contains the order in which the joints are being read: https://github.com/vchoutas/smplify-x/blob/master/smplifyx/data_parser.py.
Hello, I cannot find https://github.com/CMU-Perceptual-Computing-Lab/openpose/blob/master/doc/output.md#keypoint-ordering-in-cpython. Were you able to extract the 3D joint coordinates and their corresponding names, @Anirudh257?
@mary-arch They seem to have removed the ordering from the docs. You can check https://github.com/vchoutas/expose/blob/master/expose/data/targets/keypoints.py and look at the ordering given for the OPENPOSE variable.
These are the images showing the joint ordering:
25 Body joints
21 Hand joints
70 Face joints
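Putting those counts together, the 118 output joints break down into contiguous index ranges. The sketch below is assembled from this thread; the assumption that the 17 contour points and 2 pupils are the 19 OpenPose face points dropped by SMPL-X is mine and should be verified against keypoints.py:

```python
# Index ranges of the 118 SMPLify-X output joints in OpenPose order.
# Assumption: 51 of OpenPose's 70 face points are kept
# (the 17 contour points and 2 pupils are presumed dropped).
JOINT_RANGES = {
    "body":       slice(0, 25),    # 25 BODY_25 joints
    "left_hand":  slice(25, 46),   # 21 hand joints
    "right_hand": slice(46, 67),   # 21 hand joints
    "face":       slice(67, 118),  # 51 face landmarks
}

total = sum(r.stop - r.start for r in JOINT_RANGES.values())
print(total)  # 118
```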
Thanks a lot. How do I call keypoints.py?
@mary-arch you have to look into the Expose repository for that.
@Anirudh257 Hello. Based on the body mesh (<trimesh.Trimesh(vertices.shape=(10475, 3), faces.shape=(20908, 3))>), how do I extract the coordinates of the lips, eyes, nose, ...?
@mary-arch Don't use those. Use the 118 joint coordinates that you get from the model, and use keypoints.py from ExPose to get the ordering of the lips/eyes/nose.
@Anirudh257 Hello, excuse me. Do you mean here? https://github.com/vchoutas/expose/tree/e4de58f529c79e7581f211056168548d3b044509 It needs PyTorch > 1.6.0 to run, and that version of PyTorch needs CUDA > 7.5, but I only have CUDA 7.5. Is there another way, or can you help me? I could run SMPLify-X with PyTorch 0.4.1 but cannot run ExPose with that PyTorch.
Yes, @mary-arch, that is the correct repository. You don't have to run the ExPose code. From https://github.com/vchoutas/expose/blob/e4de58f529c79e7581f211056168548d3b044509/expose/data/targets/keypoints.py#L257
you can get the exact indices of the keypoints belonging to the lips, eyes, etc. From https://github.com/vchoutas/smplify-x/blob/a7876f03aa086c1fa010941a91482d7cb240e7d9/smplifyx/data_parser.py#L89
you can see that the face joints are indices 67-118 in SMPL-X. Let me know if you need any more help.
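As a quick sketch of that indexing, the face block can be sliced straight out of the (118, 3) joints array (index boundaries assumed from data_parser.py as cited above; the zero array stands in for real model output):

```python
import numpy as np

joints = np.zeros((118, 3))     # stand-in for model_output.joints
body_joints = joints[0:25]      # 25 BODY_25 joints
face_joints = joints[67:118]    # 51 face landmarks
print(body_joints.shape, face_joints.shape)  # (25, 3) (51, 3)
```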
@Anirudh257 I'm sorry to bother you so much. 1) As I calculated, the face joints start from index 55, but you told me 67. Am I wrong? 2) The values obtained for the body-part keypoints are as follows:
[4.64578e+02 2.96835e+02 9.68372e-01]
[4.62146e+02 2.95889e+02 8.92533e-01]
[4.60659e+02 2.93727e+02 8.00971e-01]
[4.64038e+02 2.93456e+02 7.69484e-01]
[4.65795e+02 2.93456e+02 8.06186e-01]
[4.67146e+02 2.92916e+02 7.54968e-01]
[4.69579e+02 2.92646e+02 6.38653e-01]
[4.67281e+02 2.93997e+02 7.92380e-01]
[4.65930e+02 2.94538e+02 8.98834e-01]
[4.64308e+02 2.94673e+02 8.06689e-01]
That is, the first column is about 400, the second column is about 200, and the third is less than one. But in the obtained mesh all values are below one. Why?
@mary-arch, no problem, let me go over it again. The first step is:
model_output = body_model(return_verts=True, body_pose=body_pose)
joints = model_output.joints.detach().cpu().numpy().squeeze()
This will give a 118-joint tensor that follows the OpenPose ordering I posted above. From https://github.com/vchoutas/smplify-x/blob/a7876f03aa086c1fa010941a91482d7cb240e7d9/smplifyx/data_parser.py#L89 you can see that the body joints come first, followed by the left-hand joints, the right-hand joints, and then the face joints.
2) From the images of the joints that I posted above, you can see that the ordering is body (25), left hand (21), right hand (21), and then the remaining 51 are the face joints.
3) These 3D points are metric distances (in metres) and are not the same as the mesh values. If you want the 3D coordinates, this is the right way; the mesh is just a representation.
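To make the scale difference concrete: the OpenPose keypoint files hold 2D rows of [pixel_x, pixel_y, confidence], while the model's joints are metric 3D positions. A small sketch (the 2D row is copied from the dump above; the 3D values are made up for illustration):

```python
import numpy as np

# OpenPose 2D keypoint: pixel x, pixel y, detection confidence in [0, 1]
kp2d = np.array([464.578, 296.835, 0.968372])

# SMPL-X 3D joint: metric x, y, z in the model/camera frame (illustrative)
joint3d = np.array([0.012, -0.334, 2.051])

# Pixel coordinates are in the hundreds; metric coordinates are small
print(kp2d[:2].max() > 100, np.abs(joint3d).max() < 10)  # True True
```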
Hi. Can I get the exact 3D coordinates of points on the mesh (for example the x, y, and z of the jaw on the mesh)? Or can I get a new mesh with the 3D joint coordinates? I mean, can I have both in one representation? @Anirudh257
@masyrezaei You can have both of them separately, but I am not sure what you mean by both in one representation.
I want the x, y, and z of the jaw point, the eye points, and ... on the mesh.
@masyrezaei Sorry, I am new to the field and can't understand what you mean.
By mesh, do you mean the 118 joints in 3D or the ~10,000 vertices of the trimesh?
Thank you. I mean the ~10,000 vertices of the mesh.
@masyrezaei Okay, I don't know that offhand, but if you look at trimesh's documentation you can get the required keypoints.
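One pragmatic way to relate a named joint to the mesh vertices (a plain-numpy sketch, independent of trimesh's own proximity utilities) is to pick the vertex closest to that joint's 3D coordinate:

```python
import numpy as np

def nearest_vertex(vertices, point):
    """Return (index, distance) of the mesh vertex closest to a 3D point."""
    d = np.linalg.norm(vertices - point, axis=1)
    i = int(np.argmin(d))
    return i, float(d[i])

# Toy mesh with 4 vertices (a real SMPL-X mesh has 10475)
verts = np.array([[0., 0., 0.], [1., 0., 0.], [0., 1., 0.], [0., 0., 1.]])
idx, dist = nearest_vertex(verts, np.array([0.9, 0.1, 0.0]))
print(idx)  # 1
```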
@Anirudh257 Thanks. The output of SMPLify-X has three folders: images, meshes, and results. What is the pkl file in the results folder? For example, what are these?
'left_hand_pose': array([[ 0.22950487, 0.11978509, -0.00850148, 0.44743666, 0.19125924, 0.38689598, 0.06422849, -0.35365084, -0.15889902, -0.21843675, 0.57754153, -0.18176202]], dtype=float32),
'right_hand_pose': array([[ 0.00944458, 0.00724853, -0.0169796 , 0.01097591, 0.00143847, 0.00538751, 0.00771305, -0.00530685, 0.00413905, -0.00126342, 0.00113911, -0.00190026]], dtype=float32),
'jaw_pose': array([[0.10568943, 0.00164012, 0.00082748]], dtype=float32),
'leye_pose': array([[ 2.8327284 , 2.087815 , -0.50869274]], dtype=float32),
'reye_pose': array([[1.4676197 , 2.6551766 , 0.45708197]], dtype=float32)
These are not keypoints, so what are they?
@mahsa1363, these are not keypoints; they are the fitted model parameters. You can use them to extract the required keypoints with the following steps:
import torch

# pkl_data is the dict loaded from the result pickle file

# Decode body_pose from the pose embedding stored in the output
body_pose = vposer.decode(
    torch.Tensor(pkl_data['body_pose']).cuda(),
    output_type='aa').view(1, -1) if use_vposer else None

# Copy the stored parameters into the model
common_keys = set(body_model.state_dict().keys()) & set(pkl_data.keys())
for key in common_keys:
    body_model.state_dict()[key][:] = torch.Tensor(pkl_data[key])

model_output = body_model(return_verts=True, body_pose=body_pose)

# From the model output, get the 118 joints (body, hands, face)
joints = model_output.joints.detach().cpu().numpy()[0]
You can also refer to the code in https://github.com/vchoutas/smplify-x/blob/master/smplifyx/fit_single_frame.py
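For completeness, pkl_data in the snippet above is just a pickled dict, so it can be loaded with the standard pickle module (a sketch: the path below is hypothetical, and encoding='latin1' is an assumption to cover pickles written under Python 2):

```python
import pickle

def load_result(path):
    """Load a SMPLify-X result .pkl into a plain dict."""
    with open(path, "rb") as f:
        return pickle.load(f, encoding="latin1")

# pkl_data = load_result("results/000.pkl")  # hypothetical path
```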
Thanks again. Are the 3D coordinates of the centre of the sphere and the points of the human body in the same coordinate frame? I mean, do they use global coordinates? I want to use their relative distances from each other.
@masyrezaei I am not sure about that, but I think the 3D coordinates are global distances in metres: the 118 joints in 3D (x, y, z) give the distance in metres of the person from the camera.
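Since the joints share one metric frame, relative distances can be taken directly between rows of the (118, 3) array. A sketch with made-up coordinates, assuming indices 6 and 7 are the BODY_25 left elbow and left wrist:

```python
import numpy as np

joints = np.zeros((118, 3))
joints[6] = [0.30, 1.20, 2.00]  # illustrative left elbow (BODY_25 index 6)
joints[7] = [0.30, 0.95, 2.00]  # illustrative left wrist (BODY_25 index 7)

# Euclidean distance between the two joints, in metres
forearm = np.linalg.norm(joints[7] - joints[6])
print(forearm)  # ~0.25
```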
@Anirudh257 Hello. The output of SMPLify-X is 118 joints (including 25 body joints, 42 hand joints, and 71 facial joints). So what are the 55 joints?
@mansooreh1 I don't get your question. The 71 facial joints of OpenPose aren't taken entirely for the SMPL-X face joints; the contour region is neglected.
@Anirudh257 As stated in the SMPLify-X paper, SMPL-X uses a standard vertex-based linear blend skinning with learned corrective blend shapes, and has N = 10,475 vertices and K = 54 joints. What are the 54 joints?
@mansooreh1 Okay, sorry. I don't know about those K joints; I am only aware of the 118 final 3D joints. What you are asking about is at the mesh level.
@Anirudh257 Thanks very much
@Anirudh257 Excuse me, do you know how to extract rotation values such as Euler angles or quaternions?
@ZewanHuang Unfortunately, I haven't extracted them.
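For what it's worth, the pose entries in the pkl (jaw_pose, leye_pose, the hand poses) are axis-angle vectors, so a quaternion can be derived by hand (a numpy sketch of the standard conversion; scipy's Rotation.from_rotvec offers the same thing):

```python
import numpy as np

def axis_angle_to_quaternion(rotvec):
    """Convert an axis-angle vector to a (w, x, y, z) unit quaternion."""
    angle = np.linalg.norm(rotvec)
    if angle < 1e-8:
        return np.array([1.0, 0.0, 0.0, 0.0])  # identity rotation
    axis = rotvec / angle
    w = np.cos(angle / 2.0)
    xyz = np.sin(angle / 2.0) * axis
    return np.concatenate([[w], xyz])

q = axis_angle_to_quaternion(np.array([np.pi, 0.0, 0.0]))
print(q)  # ~[0, 1, 0, 0]: a 180-degree rotation about x
```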
Hello,
Very impressive work! I am wondering how I can extract the 3D joint coordinates from the code (and also the name of the corresponding joint)?
I am trying this:
model_output = body_model(return_verts=True, body_pose=body_pose)
joints = model_output.joints.detach().cpu().numpy().squeeze()
and it returns a (118, 3) array. How can I locate the 3D joint coordinates in it (e.g., left elbow, left wrist, ...)?
Thanks!!!
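To pick a named joint out of that (118, 3) array, the first 25 rows follow OpenPose's BODY_25 ordering. The name list below is my own transcription of that ordering and should be double-checked against the OpenPose output documentation:

```python
import numpy as np

# OpenPose BODY_25 joint names, in index order (transcribed by hand)
BODY_25 = [
    "Nose", "Neck", "RShoulder", "RElbow", "RWrist",
    "LShoulder", "LElbow", "LWrist", "MidHip", "RHip",
    "RKnee", "RAnkle", "LHip", "LKnee", "LAnkle",
    "REye", "LEye", "REar", "LEar", "LBigToe",
    "LSmallToe", "LHeel", "RBigToe", "RSmallToe", "RHeel",
]

joints = np.zeros((118, 3))  # stand-in for the (118, 3) model output
left_elbow = joints[BODY_25.index("LElbow")]  # row 6
left_wrist = joints[BODY_25.index("LWrist")]  # row 7
print(BODY_25.index("LElbow"), BODY_25.index("LWrist"))  # 6 7
```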