hygenie1228 / ClothWild_RELEASE

[ECCV 2022] This repo is official PyTorch implementation of 3D Clothed Human Reconstruction in the Wild.

parameters do not match #4

Closed keemoonjang closed 1 year ago

keemoonjang commented 1 year ago

Hello, thank you for sharing this intriguing work!

I have tried to follow the quick demo as described. While going through "Prepare SMPL parameter, as pose2pose_result.json. You can get the SMPL parameter by running the off-the-shelf method [code]." and trying to use the resulting .json as ClothWild input, I ran into a key error.

My pose2pose results contain keys such as `smplx_root_pose`, `smplx_body_pose`, `smplx_shape` and so forth, while the ClothWild sample input seems to require `pose`, `shape`, `trans` and `focal`, `princpt` under `smpl_param` and `cam_param`, respectively.
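For reference, here is a minimal sketch of the input layout the ClothWild sample seems to expect, based on the key names above. All values are dummy placeholders, and the focal-length and principal-point numbers are my own assumptions, not values from the repo:

```python
import json

# Sketch of the ClothWild sample-input layout described above.
# All numeric values are dummy placeholders, not real estimates.
sample_input = {
    "smpl_param": {
        "pose": [0.0] * 72,   # SMPL pose: 24 joints x 3 axis-angle values
        "shape": [0.0] * 10,  # SMPL shape (beta) coefficients
        "trans": [0.0] * 3,   # root translation
    },
    "cam_param": {
        "focal": [1500.0, 1500.0],  # assumed focal lengths (fx, fy)
        "princpt": [256.0, 256.0],  # assumed principal point (cx, cy)
    },
}

with open("pose2pose_result.json", "w") as f:
    json.dump(sample_input, f)

print(sorted(sample_input["smpl_param"]))  # ['pose', 'shape', 'trans']
```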

Another thing I find confusing is that there are two versions of the SMPL pretrained weights `snapshot_6.pth.tar`. I assume one is body-only and the other is full-body weights, but they are saved with the same filename. I would like to know how I should prepare the settings and match the parameters to successfully get results from ClothWild.

Thank you!

hygenie1228 commented 1 year ago

Thanks for your interest.

In the Hand4Whole GitHub repository, there are two branches: the whole-body task [code] and the body-only task [code].

To run the demo, you should follow the body-only demo code [here].

In detail, you can easily obtain the json file needed to run ClothWild by changing part of the demo code as below: https://github.com/mks0601/Hand4Whole_RELEASE/blob/Pose2Pose/demo/body/demo_body.py#L89-L92

import json  # needed for json.dump below

# save SMPL parameters ('focal' and 'princpt' are defined earlier in the demo script)
smpl_pose = out['smpl_pose'].detach().cpu().numpy()[0]
smpl_shape = out['smpl_shape'].detach().cpu().numpy()[0]
cam_trans = out['cam_trans'].detach().cpu().numpy()[0]
with open('smpl_param.json', 'w') as f:
    json.dump({ 'smpl_param':
                    {'pose': smpl_pose.reshape(-1).tolist(), 'shape': smpl_shape.reshape(-1).tolist(), 'trans': cam_trans.reshape(-1).tolist()},
                'cam_param':
                    {'focal': focal, 'princpt': princpt}
                }, f)
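As a sanity check before feeding the file to ClothWild, the saved parameters can be verified against the standard SMPL dimensions (72-d pose = 24 joints × 3 axis-angle values, 10-d shape, 3-d translation). This is my own hypothetical helper, not part of the repo; the dummy dict stands in for a real `smpl_param.json`:

```python
# Hypothetical sanity check (not part of the ClothWild repo): verify that
# saved SMPL parameters have the standard dimensions before running ClothWild.
def check_smpl_param(param):
    smpl = param["smpl_param"]
    cam = param["cam_param"]
    assert len(smpl["pose"]) == 72, "pose should be 24 joints x 3 axis-angle values"
    assert len(smpl["shape"]) == 10, "shape should hold 10 beta coefficients"
    assert len(smpl["trans"]) == 3, "trans should be a 3-d translation"
    assert len(cam["focal"]) == 2 and len(cam["princpt"]) == 2
    return True

# Example with dummy values standing in for a real smpl_param.json:
dummy = {
    "smpl_param": {"pose": [0.0] * 72, "shape": [0.0] * 10, "trans": [0.0] * 3},
    "cam_param": {"focal": [1500.0, 1500.0], "princpt": [256.0, 256.0]},
}
print(check_smpl_param(dummy))  # True
```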

For the second question, you should download the pre-trained weights for the body-only model from the Quick Demo section here: https://github.com/mks0601/Hand4Whole_RELEASE/tree/Pose2Pose#quick-demo

Thank you.

keemoonjang commented 1 year ago

Thank you for answering! Problems were solved following your instructions.

Adding from here, I came up with some follow-up questions:

  1. I was wondering if there is any way to reconstruct the head (face, hair, etc.) with the current body-only reconstruction. (Theoretically, SMPL parameters from the whole-body task would work, but those parameters do not seem to match the ClothWild code at the current stage.)
  2. In many examples I tested, the results wear t-shirts and capris (calf-length pants) for the top and bottom, respectively. How can I improve the accuracy of the predicted clothing type for in-the-wild images? Some examples below:

(example images: clothwild_example_1, clothwild_example_2)

  3. SMPL parameters from pose2pose seem to generate a neutral model (judging by its body shape). How can I make it produce a gender-specific model?

Thank you once again for sharing this amazing project :)

hygenie1228 commented 1 year ago

My answer to your question is as follows:

  1. The current code only covers the body-only SMPL model, because the SMPLicit module of our framework is designed on top of the SMPL model.

  2. As you mentioned, our framework cannot reconstruct several cloth types, because our model is upper-bounded by the clothing generative model; this is one of the limitations of our work.

  3. ClothWild estimates the gender and then reconstructs the clothed human with the corresponding gender-specific model.

I hope these answers will be helpful to you.

keemoonjang commented 1 year ago

@hygenie1228 Thank you for your kind answers! I was able to get state-of-the-art 3D clothed human reconstructions from single in-the-wild images. Looking forward to the next steps (if any), with the full body and more cloth types accurately reconstructed :)