pixelite1201 / BEDLAM


Offset when editing mesh files #50

Closed davidpagnon closed 1 month ago

davidpagnon commented 1 month ago

Hi,

Right now, I can import the Bedlam meshes in Blender, and then move and orient them in the scene, in accordance with the specifications provided in the be_seq.csv files. However for various reasons, I would like to transform the meshes so that they are already in the right location and orientation.

I have been stuck on it for a little while: it is mostly okay and hardly noticeable in most cases, but in some others there is up to a ~ 5 cm offset between the transformed mesh and the place where it should be. I cannot see any mistake in my calculations, and I would be surprised if it were a rounding error.

What am I doing wrong?

Thank you for your attention! (And if this is not the right place, would you mind giving me the contact info of someone who could help?) Best regards,

(screenshot: the transformed mesh, offset from where it should be)


My pipeline:

  1. Bedlam (Unreal) coordinate system to SMPL:

    SMPL     Blender   Unreal
    x        x         x
    z        -y        y
    y (up)   z         z

    So to go from the Bedlam system to the SMPL one, we need to apply:

    • translation: x y z -> x z y
    • rotation: z -> -y
  2. Then I want to rotate and translate the mesh data:

    • new_rot = rot_to_apply @ rot_orig
    • new_trans = rot_to_apply @ trans_orig + trans_to_apply
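The axis remapping in step 1 can be written as a change-of-basis matrix (a quick sketch; the matrix below just encodes the x→x, y→z, z→y swap from the table, and the yaw value is a sample from the CSV):

```python
import numpy as np
from scipy.spatial.transform import Rotation as R

# Change of basis Unreal (x, y, z) -> SMPL (x, z, y): swap the last two axes
P = np.array([[1, 0, 0],
              [0, 0, 1],
              [0, 1, 0]])

theta = np.radians(103.426)          # a sample Bedlam yaw (about Unreal z)
Rz_unreal = R.from_euler('z', theta).as_matrix()

# Expressed in SMPL axes, the same rotation becomes a rotation about -y
Rz_in_smpl = P @ Rz_unreal @ P.T
assert np.allclose(Rz_in_smpl, R.from_euler('y', -theta).as_matrix())

# And an Unreal translation [X, Y, Z] becomes [X, Z, Y] in SMPL axes
t_unreal = np.array([6.85, -0.54, 0.0])
assert np.allclose(P @ t_unreal, t_unreal[[0, 2, 1]])
```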

Minimal working example (I load the meshes in Blender with the SMPL-X add-on):

from 20221010_3_1000_batch01hand/be_seq.csv:

Index,Type,Body,X,Y,Z,Yaw,Pitch,Roll,Comment
0,Comment,None,0,0,0,0,0,0,bodies_min=3;bodies_max=3;x_offset=650;y_offset=0.0;z_offset=0.0;x_min=-50;x_max=50;y_min=-250;y_max=250;yaw_min=0;yaw_max=360;cam_x_offset=10.0;cam_y_offset=10.0;cam_z_offset=5.0;cam_yaw_min=-3;cam_yaw_max=3;cam_pitch_min=-10;cam_pitch_max=3;cam_roll_min=-3;cam_roll_max=3;cam_config=cam_random_e
4,Body,rp_cindy_posed_005_1097,684.9980306666062,-54.09495558334399,0.0,103.42616917612702,0.0,0.0,start_frame=65;texture_body=skin_f_indian_10_ALB;texture_clothing=texture_06
# Convert Unreal units (cm) to meters
body_X, body_Y, body_Z = np.array([684.9980306666062,-54.09495558334399,0.0])/100
body_Yaw, body_Pitch, body_Roll = np.radians([103.42616917612702,0.0,0.0])
  1. FIRST APPROACH: loading the original mesh and moving it within Blender. I just load the mesh and relocate it like so:

    • location X, Y, Z = body_X, -body_Y, body_Z
    • rotation Z = -body_Yaw
  2. SECOND APPROACH: Transforming the original mesh, and loading this newly created mesh into Blender:

    import numpy as np
    from scipy.spatial.transform import Rotation as R
    
    smpl_mesh = dict(np.load('rp_cindy_posed_005/1097/motion_seq.npz'))
    
    trans_to_apply = np.array([body_X, body_Z, body_Y])
    rot_to_apply = R.from_euler('y', -body_Yaw).as_matrix()
    
    for t in range(len(smpl_mesh['poses'])):
        rot_orig = R.from_rotvec(np.array( smpl_mesh['poses'][t, :3])).as_matrix()
        trans_orig = np.array(smpl_mesh['trans'][t])
    
        smpl_mesh['poses'][t,:3] = R.from_matrix(rot_to_apply @ rot_orig).as_rotvec()
        smpl_mesh['trans'][t] = rot_to_apply @ trans_orig + trans_to_apply 
    
    np.savez('rp_cindy_posed_005/1097/motion_seq_transformed.npz', **smpl_mesh)

    And then I import this transformed mesh, and I get that small offset.

I would much appreciate any help or suggestion :)

davidpagnon commented 1 month ago

Here is the animated smpl-x file so that it is easier to test. rp_cindy_posed_005.zip

PS: I'm pretty sure that the first approach is the one giving correct results, since the mesh overlays the background image perfectly. So the error must be in the second approach, and I'm not sure whether it is an error in my geometrical understanding, a rounding error, or something else.

davidpagnon commented 1 month ago

I tried with homogeneous coordinates just out of desperation, but as expected, the results are exactly the same:

import numpy as np
from scipy.spatial.transform import Rotation as R

smpl_mesh = dict(np.load('rp_cindy_posed_005/1097/motion_seq.npz'))

rot_to_apply = R.from_euler('y', -body_Yaw).as_matrix()
trans_to_apply = np.array([body_X, body_Z, body_Y]).reshape(3,1)
H_to_apply = np.block( [[ rot_to_apply, trans_to_apply], [0,0,0,1]])

for t in range(len(smpl_mesh['poses'])):
    rot_orig = R.from_rotvec(np.array( smpl_mesh['poses'][t, :3])).as_matrix()
    trans_orig = np.array(smpl_mesh['trans'][t]).reshape(3,1)
    H_orig = np.block( [[ rot_orig, trans_orig], [0,0,0,1]])

    H_new = H_to_apply @ H_orig

    smpl_mesh['poses'][t,:3] = R.from_matrix( H_new[:3,:3]).as_rotvec()
    smpl_mesh['trans'][t] = H_new[:3,3]

np.savez('rp_cindy_posed_005/1097/motion_seq_transformed.npz', **smpl_mesh)
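For what it's worth, the equivalence between the two formulations is easy to confirm numerically with random transforms (made-up values, not BEDLAM data), so identical results are indeed expected:

```python
import numpy as np
from scipy.spatial.transform import Rotation as R

rng = np.random.default_rng(0)
rot_to_apply = R.random(random_state=0).as_matrix()
rot_orig = R.random(random_state=1).as_matrix()
trans_to_apply = rng.standard_normal(3)
trans_orig = rng.standard_normal(3)

# Separate rotation/translation composition, as in the first script
new_rot = rot_to_apply @ rot_orig
new_trans = rot_to_apply @ trans_orig + trans_to_apply

# Homogeneous 4x4 composition, as in this script
H_to_apply = np.block([[rot_to_apply, trans_to_apply.reshape(3, 1)], [0, 0, 0, 1]])
H_orig = np.block([[rot_orig, trans_orig.reshape(3, 1)], [0, 0, 0, 1]])
H_new = H_to_apply @ H_orig

assert np.allclose(H_new[:3, :3], new_rot)
assert np.allclose(H_new[:3, 3], new_trans)
```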
davidpagnon commented 1 month ago

Okay, I think I finally pinpointed the problem: the pelvis location depends on the shape. That's why I got very good results for people whose shape was close to that of the average SMPL model, and much worse ones when the person was heavier/thinner or shorter/taller.

Now, my next problem is finding the position of the pelvis as a function of the shape parameters.
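In SMPL-family models, the joint locations are a linear function of the shape coefficients, so this is what the regression looks like in principle (a sketch with illustrative random values, not the real regressor data):

```python
import numpy as np

# Illustrative dimensions and values only (not the actual BEDLAM/SMPL files):
# a regressor mapping betas to per-joint offsets, plus template joint locations
num_joints, num_betas = 55, 10
rng = np.random.default_rng(0)
betas_to_joints = rng.standard_normal((num_joints, 3, num_betas))
template_J = rng.standard_normal((num_joints, 3))

betas = rng.standard_normal(num_betas)
joints = betas_to_joints @ betas + template_J   # (num_joints, 3)
pelvis = joints[0]                              # joint 0 is the pelvis
```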

davidpagnon commented 1 month ago

Here is the final script, and it works!!

import numpy as np
import json
from scipy.spatial.transform import Rotation as R

smpl_mesh = dict(np.load('rp_cindy_posed_005/1097/motion_seq.npz'))
regressor_path = r'C:\Users\david\AppData\Roaming\Blender Foundation\Blender\4.0\scripts\addons\smplx_blender_addon\data\smplx_betas_to_joints_neutral_300.json'

# Trans and rot from Bedlam
trans_to_apply = np.array([body_X, body_Z, body_Y])
rot_to_apply = R.from_euler('y', -body_Yaw).as_matrix()

# Trans pelvis from SMPL shape
with open(regressor_path) as f:
    data = json.load(f)
betas_to_joints, template_j = np.asarray(data["betasJ_regr"]), np.asarray(data["template_J"])
betas = np.concatenate([smpl_mesh['betas'], np.zeros(300-len(smpl_mesh['betas']))])
joint_locations = betas_to_joints @ betas + template_j
trans_pelvis_beta = joint_locations[0]

for t in range(len(smpl_mesh['poses'])):
    # Trans and rot from SMPL trans and poses
    trans_orig = np.array(smpl_mesh['trans'][t])
    rot_orig = R.from_rotvec(np.array( smpl_mesh['poses'][t, :3])).as_matrix()

    # Apply transformation
    smpl_mesh['poses'][t,:3] = R.from_matrix(rot_to_apply @ rot_orig).as_rotvec()
    smpl_mesh['trans'][t] = rot_to_apply @ (trans_orig + trans_pelvis_beta) + (trans_to_apply - trans_pelvis_beta)

np.savez('rp_cindy_posed_005/1097/motion_seq_transformed.npz', **smpl_mesh)

Since it is never over, there is one last step left: as I am not actually interested in the SMPL-X files but in the SMPL+H ones, I need to find the equivalent regressor for SMPL+H.

pixelite1201 commented 1 month ago

Hello,

Sorry for the delayed reply, but you have made great progress and it seems you are on the right path :) Thanks for such a detailed post; it helped me understand the problem. I also struggled with this at some point, and here is my solution.

The issue is applying the rotation around the pelvis joint (j0), since it is not at the origin.

This is how rotation is applied in SMPL/-X:
v_out = R_o(v_template - j0) + j0 --> the pelvis is moved to the origin, then rotated, then moved back

Case 1: Applying the rotation to the vertices using R_x:
v1 = R_x[v_out]
v1 = R_x[R_o(v_template - j0) + j0]
v1 = R_x.R_o.v_template - R_x.R_o.j0 + R_x.j0

Case 2: Applying the rotation to the global orientation using R_x:
v2 = (R_x.R_o)(v_template - j0) + j0
v2 = R_x.R_o.v_template - R_x.R_o.j0 + j0

--> Now, to make v2 the same as v1, we need to add the following shift:
diff = R_x.j0 - j0
--> so v2 becomes
v2 = R_x.R_o.v_template - R_x.R_o.j0 + j0 + diff = v1

So basically, when we apply a rotation to the global orientation, we also need to add the translation shift (diff).
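This checks out numerically; here is a quick sketch with random rotations and made-up vertex/pelvis positions:

```python
import numpy as np
from scipy.spatial.transform import Rotation as R

R_x = R.random(random_state=0).as_matrix()
R_o = R.random(random_state=1).as_matrix()
j0 = np.array([0.0, -0.35, 0.01])                        # made-up pelvis location
v_template = np.random.default_rng(2).standard_normal((5, 3))  # made-up vertices

# SMPL-style rotation about the pelvis
v_out = (R_o @ (v_template - j0).T).T + j0

# Case 1: rotate the posed vertices directly
v1 = (R_x @ v_out.T).T

# Case 2: bake R_x into the global orientation...
v2 = (R_x @ R_o @ (v_template - j0).T).T + j0

# ...and add the translation shift diff = R_x.j0 - j0
diff = R_x @ j0 - j0
assert np.allclose(v1, v2 + diff)
```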

Now I see that you got the pelvis location using the betas-to-joints regressor. You could also use the smplx library to get the joints; please check out this code. It supports the SMPL+H model as well: you just need to change the model flag and model path when loading it here. I hope I understood your problem correctly. Good luck.

davidpagnon commented 1 month ago

Thanks! I believe we did it quite similarly, although I got j0 from a regressor instead of the smplx library. Good to know that it also handles SMPL+H models!