ytrock / THuman2.0-Dataset


Why did I get wrong global orient results? #12

Closed · jhkim0759 closed this 1 year ago

jhkim0759 commented 1 year ago

[Attached image: Figure 1]

I rendered the .obj with the pytorch3d library and fit the SMPL skinned model using the parameters provided, but I got the result shown above. It looks like the mesh is upside down. Can you help me get the correct result?
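
For reference, a minimal pytorch3d rendering sketch; the scan path and camera settings below are placeholders rather than the exact setup used in this issue:

import torch
from pytorch3d.io import load_objs_as_meshes
from pytorch3d.renderer import (FoVPerspectiveCameras, MeshRasterizer, MeshRenderer,
                                PointLights, RasterizationSettings, SoftPhongShader,
                                look_at_view_transform)

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# hypothetical path to one THuman2.0 scan; adjust to the local dataset layout
mesh = load_objs_as_meshes(["../dataset/THuman/THuman2.0/0525/0525.obj"], device=device)

# a simple camera looking at the origin; the dataset's own calibration is not used here
R, T = look_at_view_transform(dist=2.5, elev=0, azim=0)
cameras = FoVPerspectiveCameras(device=device, R=R, T=T)

renderer = MeshRenderer(
    rasterizer=MeshRasterizer(cameras=cameras,
                              raster_settings=RasterizationSettings(image_size=1024)),
    shader=SoftPhongShader(device=device, cameras=cameras,
                           lights=PointLights(device=device, location=[[0.0, 1.0, 2.0]])),
)
images = renderer(mesh)  # (1, 1024, 1024, 4) RGBA tensor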

ytrock commented 1 year ago

Dear jhkim0759,

Can you first try to generate the corresponding SMPL mesh and save it as an .obj file, and then check the alignment between the two .obj meshes in MeshLab?
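
For example, if the SMPL-X forward pass returns vertices and the model provides faces, something like this (using trimesh; the variable names are placeholders) exports the fitted mesh for a side-by-side check in MeshLab:

import trimesh

# verts: (N, 3) array from the SMPL-X forward pass, faces: (F, 3) from the model
trimesh.Trimesh(vertices=verts, faces=faces, process=False).export("smplx_fit.obj")

# open smplx_fit.obj together with the original scan (e.g. 0525.obj) in MeshLab
# and check whether the two meshes overlap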

jhkim0759 commented 1 year ago

Thank you for your answer

I saved the mesh as an .obj and checked it against the data you provided, but I got exactly the opposite result, as shown in the image. I think I would have to align the meshes one by one. Or is there another way to obtain the global orient from the camera parameters (extrinsic, intrinsic, rotation, etc.)?

ytrock commented 1 year ago

That's weird. When generating the SMPL-X meshes, did you use the provided poses, which already include the global RT? Or you can paste your code for generating the SMPL-X meshes here.
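
For reference, a minimal sketch of generating the SMPL-X mesh with the official smplx package from the provided parameter file. The model path, gender, and the assumption that the file stores a "betas" key and axis-angle poses are all unverified here, so check the .pkl contents first (and pass pose2rot=False if the poses turn out to be rotation matrices):

import numpy as np
import torch
import smplx

params = np.load("../dataset/THuman/THuman2.0_smplx/0525/smplx_param.pkl",
                 allow_pickle=True)

# hypothetical model path; requires the SMPL-X model files
model = smplx.create("path/to/smplx_models", model_type="smplx",
                     gender="neutral", use_pca=False)

output = model(
    betas=torch.tensor(params["betas"]).float(),                   # assumed key name
    global_orient=torch.tensor(params["global_orient"]).float(),   # includes the global rotation
    body_pose=torch.tensor(params["body_pose"]).float(),
    return_verts=True,
)
verts = output.vertices.detach().cpu().numpy()[0]

Any additional scale or translation stored in the parameter file may also need to be applied before comparing against the raw scan.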

jhkim0759 commented 1 year ago
import numpy as np
import torch

smplr = SMPLR()  # SMPL(-X) regressor from ROMP

params_dir = "../dataset/THuman/THuman2.0_smplx/0525/smplx_param.pkl"

# the .pkl is a plain pickle; np.load falls back to pickle when allow_pickle=True
db = np.load(params_dir, allow_pickle=True)
dict_ = {}
for key in db:
    dict_[key] = torch.tensor(db[key])

# the values in dict_ are already tensors, so no further conversion is needed
global_ori = dict_["global_orient"][0]
pose = torch.cat([global_ori, dict_["body_pose"][0]], 0)
verts = smplr(pose.reshape(1, -1, 3, 3).float(),
              dict_["betas"].reshape(1, -1).float(),  # shape parameters, presumably from the same .pkl
              pose2rot=False)["verts"]

# Renderer is from PARE; img is the background image being rendered onto
renderer = Renderer(resolution=(1024, 1024), orig_img=True, wireframe=False)
img = renderer.render(img, verts[0], [1, 1, 0, 0])

This is the code I used. The SMPLR model is from ROMP, and the Renderer is from PARE.

Thank you

jhkim0759 commented 1 year ago

I think the SMPL regressor is based on the VIBE convention. That's why the verts come out flipped. Thank you.
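
If the mismatch is only the renderer's camera convention (VIBE-style renderers typically rotate the mesh 180 degrees about the x-axis internally before drawing it), a compensating 180-degree rotation of the vertices is a quick workaround. This is a hedged sketch, not part of the dataset's pipeline:

import numpy as np

def flip_x_180(verts):
    # rotate vertices 180 degrees about the x-axis (y -> -y, z -> -z)
    rot = np.array([[1.0, 0.0, 0.0],
                    [0.0, -1.0, 0.0],
                    [0.0, 0.0, -1.0]], dtype=np.float32)
    return verts @ rot.T

# verts: (N, 3) vertices from the SMPL(-X) forward pass; flip them before
# passing them to the VIBE/PARE-style renderer (or remove the renderer's
# internal rotation instead)
verts_flipped = flip_x_180(verts)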

ytrock commented 1 year ago

You are welcome :) Thanks for your attention.