# BiHand - 3D Hand Mesh Reconstruction

This repo contains the model, demo, and training code for our paper: "BiHand: Recovering Hand Mesh with Multi-stage Bisected Hourglass Networks" (PDF) (BMVC 2020).

## Get the code

git clone --recursive https://github.com/lixiny/bihand.git
cd bihand
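
If you already cloned without `--recursive`, you can fetch the submodules afterwards with a standard git command (we assume the `--recursive` flag is there because the repo registers submodules such as `manopth`):

```bash
# fetch submodules (e.g. manopth) if the clone was made without --recursive
git submodule update --init --recursive
```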

## Install Requirements

Install the dependencies listed in environment.yml through conda:
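
A minimal sketch of the conda step (the environment name is defined inside `environment.yml`; `bihand` below is an assumption):

```bash
# create the environment from the provided spec and activate it
# NOTE: "bihand" is an assumed name; use whatever name environment.yml declares
conda env create -f environment.yml
conda activate bihand
```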

The above step usually works. However, we found that installing opendr can be tricky; we resolved the errors with:

sudo apt-get install libglu1-mesa-dev freeglut3-dev mesa-common-dev
sudo apt-get install libosmesa6-dev
# then reinstall opendr
pip install opendr

## Download and Prepare Datasets

After downloading the RHD and STB datasets, your data folder structure should look like this:

data/
    RHD/
        RHD_published_v2/
            evaluation/
            training/
            view_sample.py
            ...

    STB/
        images/
            B1Counting/
                SK_color_0.png
                SK_depth_0.png
                SK_depth_seg_0.png  <-- merged from STB_supp
                ...
            ...
        labels/
            B1Counting_BB.mat
            ...
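
As a quick sanity check, you can verify that the layout above is in place (a minimal sketch; it only tests a few representative paths):

```bash
# check a few representative paths from the expected layout above
for p in \
    data/RHD/RHD_published_v2/training \
    data/RHD/RHD_published_v2/evaluation \
    data/STB/images/B1Counting \
    data/STB/labels/B1Counting_BB.mat
do
    [ -e "$p" ] && echo "ok       $p" || echo "MISSING  $p"
done
```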

## Download and Prepare model files

### MANO model

Now your manopth folder structure should look like this:

manopth/
  mano/
    models/
      MANO_LEFT.pkl
      MANO_RIGHT.pkl
      ...
  manopth/
    __init__.py
    ...
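
If you have downloaded and unzipped the official MANO release, placing the model files amounts to copying the two pickles into `manopth/mano/models/`. A sketch (the extraction path below is hypothetical; adjust it to wherever you unpacked the MANO archive):

```bash
# hypothetical extraction path; replace with your own
MANO_DIR=~/Downloads/mano_v1_2
cp "$MANO_DIR/models/MANO_LEFT.pkl"  manopth/mano/models/
cp "$MANO_DIR/models/MANO_RIGHT.pkl" manopth/mano/models/
```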

### BiHand models

Now your project folder should look like this:

BiHand-test/
    bihand/
    released_checkpoints/
        ├── ckp_seednet_all.pth.tar
        ├── ckp_siknet_synth.pth.tar
        ├── rhd/
        │   ├── ckp_liftnet_rhd.pth.tar
        │   └── ckp_siknet_rhd.pth.tar
        └── stb/
            ├── ckp_liftnet_stb.pth.tar
            └── ckp_siknet_stb.pth.tar
    data/
    ...
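
To confirm the released checkpoints are all in place, a simple listing is enough (a minimal sketch; it just mirrors the tree above):

```bash
# list every checkpoint; six files are expected, as shown in the tree above
find released_checkpoints -name "*.pth.tar" | sort
```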

## Launch Demo & Eval

## Training

By adopting the multi-stage training scheme, we first train SeedNet for 100 epochs:

python training/train_seednet.py --net_modules seed --datasets stb rhd --ups_loss

and then exploit its outputs to train LiftNet for another 100 epochs:

python training/train_liftnet.py \
    --net_modules seed lift \
    --datasets stb rhd \
    --resume_seednet_pth ${path_to_your_SeedNet_checkpoints (xxx.pth.tar)} \
    --ups_loss \
    --train_batch 16

Finally, train SIKNet with the pre-trained SeedNet and LiftNet weights frozen, e.g.:

python training/train_siknet.py \
    --fine_tune rhd \
    --frozen_seednet_pth released_checkpoints/ckp_seednet_all.pth.tar \
    --frozen_liftnet_pth released_checkpoints/rhd/ckp_liftnet_rhd.pth.tar \
    --resume_siknet_pth released_checkpoints/ckp_siknet_synth.pth.tar


## Limitation

Currently, the released version of BiHand requires camera intrinsics, root depth, and bone length as inputs, and thus cannot be applied to in-the-wild images.

## Citation
If you find this work helpful, please consider citing us:

@inproceedings{yang2020bihand,
  title     = {BiHand: Recovering Hand Mesh with Multi-stage Bisected Hourglass Networks},
  author    = {Yang, Lixin and Li, Jiasen and Xu, Wenqiang and Diao, Yiqun and Lu, Cewu},
  booktitle = {BMVC},
  year      = {2020}
}



## Acknowledgement

- Code of the MANO PyTorch layer in `manopth` was adapted from [manopth](https://github.com/hassony2/manopth).

- Code for evaluating hand PCK and AUC in `bihand/eval/zimeval.py` was adapted from [hand3d](https://github.com/lmb-freiburg/hand3d).

- Code for data augmentation in `bihand/datasets/handataset.py` was adapted from [obman](https://hassony2.github.io/obman).

- Code for the STB dataset in `bihand/datasets/stb.py` was adapted from [hand-graph-cnn](https://github.com/3d-hand-shape/hand-graph-cnn).

- Code of the original Hourglass Network in `bihand/models/hourglass.py` was adapted from [pytorch-pose](https://github.com/bearpaw/pytorch-pose).

- Thanks to [Yuxiao Zhou](https://github.com/CalciferZh) for helpful discussions and suggestions on solving the IK problem.