# 3D Clothed Human Reconstruction in the Wild

Gyeongsik Moon*, Hyeongjin Nam*, Takaaki Shiratori, Kyoung Mu Lee (* equal contribution)

European Conference on Computer Vision (ECCV), 2022
## Install

Install the dependencies by running
```
sh requirements.sh
```
You should slightly change the torchgeometry kernel code following here.

## Quick demo

* Prepare the `demo` folder.
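In many SMPL-based repositories, the torchgeometry change needed is in `torchgeometry/core/conversions.py` (inside `rotation_matrix_to_quaternion`), where boolean masks are negated with `1 - mask`; newer PyTorch versions reject subtraction on bool tensors, and replacing `1 - mask` with `~mask` fixes it. A sketch of that patch, assuming this is the issue the link above refers to:

```diff
 # torchgeometry/core/conversions.py, inside rotation_matrix_to_quaternion
 mask_c0 = mask_d2 * mask_d0_d1
-mask_c1 = mask_d2 * (1 - mask_d0_d1)
-mask_c2 = (1 - mask_d2) * mask_d0_nd1
-mask_c3 = (1 - mask_d2) * (1 - mask_d0_nd1)
+mask_c1 = mask_d2 * (~mask_d0_d1)
+mask_c2 = (~mask_d2) * mask_d0_nd1
+mask_c3 = (~mask_d2) * (~mask_d0_nd1)
```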
* Download the base data and place it at the `base_data` folder following the below `Directory` part.
* Prepare `input.png` and edit its bbox in `demo/demo.py`.
* Prepare `pose2pose_result.json`. You can get the SMPL parameters by running the off-the-shelf method [code].
* Run `python demo.py --gpu 0`.
* For more details, refer to here.
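In related codebases from the same authors, the bbox in `demo/demo.py` follows the `[xmin, ymin, width, height]` convention in pixels and is expanded to the network's input aspect ratio before cropping. A minimal sketch of that adjustment (the function name, aspect ratio, and scale factor here are assumptions, not taken from this repo):

```python
def adjust_bbox(bbox, aspect_ratio=3 / 4, scale=1.25):
    """Expand an [xmin, ymin, width, height] bbox to a target aspect
    ratio (width / height) around its center, then scale it up slightly
    so the crop keeps some context around the person."""
    x, y, w, h = bbox
    cx, cy = x + w / 2.0, y + h / 2.0  # bbox center stays fixed
    # Grow the shorter side so that w / h == aspect_ratio.
    if w / h > aspect_ratio:
        h = w / aspect_ratio
    else:
        w = h * aspect_ratio
    w, h = w * scale, h * scale
    return [cx - w / 2.0, cy - h / 2.0, w, h]
```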
## Train

In `main/config.py`, you can change the datasets to use. Then run
```
cd ${ROOT}/main
python train.py --gpu 0
```
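In this line of codebases, dataset selection in `config.py` is typically a list of dataset class names. An illustrative fragment only; the attribute names below are assumptions, not copied from this repo:

```python
# main/config.py -- illustrative fragment; attribute names are assumed
trainset = ['MSCOCO', 'DeepFashion2']  # datasets used for training
testset = '3DPW'                       # dataset used for evaluation
```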
## Test

Place the trained model at `output/model_dump` and follow the below instructions.
To evaluate CD (Chamfer Distance) on 3DPW, run
```
cd ${ROOT}/main
python test.py --gpu 0 --test_epoch 7 --type cd
```
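Chamfer distance between two point sets is the mean nearest-neighbor distance in both directions. A self-contained sketch of the generic symmetric definition (conventions vary — some papers halve the sum or use squared distances, so this is not necessarily the exact variant the test script reports):

```python
import numpy as np

def chamfer_distance(a, b):
    """Symmetric Chamfer distance between point sets a (N, 3) and b (M, 3):
    mean distance from each point to its nearest neighbor in the other set,
    summed over both directions."""
    # Pairwise Euclidean distances, shape (N, M).
    d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=-1)
    return d.min(axis=1).mean() + d.min(axis=0).mean()
```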
To evaluate BCC (Body-Cloth Correspondence) on MSCOCO, run
```
cd ${ROOT}/main
python test.py --gpu 0 --test_epoch 7 --type bcc
```
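BCC checks whether the reconstruction agrees with ground-truth body/cloth regions in the image. A generic sketch of such a label-agreement score (the projection step and label conventions are assumptions here; the paper defines the exact protocol):

```python
import numpy as np

def bcc_score(pred_labels, gt_labels, valid):
    """Fraction of valid projected points whose predicted body/cloth
    label matches the ground-truth segmentation label."""
    valid = valid.astype(bool)
    if not valid.any():
        return 0.0
    return float((pred_labels[valid] == gt_labels[valid]).mean())
```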
## Pretrained model

You can download the checkpoint trained on MSCOCO+DeepFashion2 from here.
Refer to the paper's main manuscript and supplementary material for diverse qualitative results!
## Reference

```
@InProceedings{Moon_2022_ECCV_ClothWild,
  author    = {Moon, Gyeongsik and Nam, Hyeongjin and Shiratori, Takaaki and Lee, Kyoung Mu},
  title     = {3D Clothed Human Reconstruction in the Wild},
  booktitle = {European Conference on Computer Vision (ECCV)},
  year      = {2022}
}
```