open-mmlab / mmhuman3d

OpenMMLab 3D Human Parametric Model Toolbox and Benchmark
https://mmhuman3d.readthedocs.io/
Apache License 2.0



[![Documentation](https://readthedocs.org/projects/mmhuman3d/badge/?version=latest)](https://mmhuman3d.readthedocs.io/en/latest/?badge=latest) [![actions](https://github.com/open-mmlab/mmhuman3d/workflows/build/badge.svg)](https://github.com/open-mmlab/mmhuman3d/actions) [![codecov](https://codecov.io/gh/open-mmlab/mmhuman3d/branch/main/graph/badge.svg)](https://codecov.io/gh/open-mmlab/mmhuman3d) [![PyPI](https://img.shields.io/pypi/v/mmhuman3d)](https://pypi.org/project/mmhuman3d/) [![LICENSE](https://img.shields.io/github/license/open-mmlab/mmhuman3d.svg)](https://github.com/open-mmlab/mmhuman3d/blob/main/LICENSE) [![Percentage of issues still open](https://isitmaintained.com/badge/open/open-mmlab/mmhuman3d.svg)](https://github.com/open-mmlab/mmhuman3d/issues)

Introduction


MMHuman3D is an open-source PyTorch-based codebase for the use of 3D human parametric models in computer vision and computer graphics. It is a part of the OpenMMLab project.
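To illustrate what a "3D human parametric model" means here (this is a conceptual sketch, not MMHuman3D's actual API), SMPL-style models deform a fixed template mesh with a low-dimensional linear shape basis: V(β) = T̄ + Σ_k β_k S_k. A minimal NumPy sketch with toy dimensions (real SMPL uses 6890 vertices and typically 10 shape coefficients; `v_template` and `shapedirs` below are random stand-ins for the learned model data):

```python
import numpy as np

# Toy sizes; real SMPL: 6890 vertices, 10 shape coefficients.
NUM_VERTS, NUM_BETAS = 100, 10

rng = np.random.default_rng(0)
v_template = rng.normal(size=(NUM_VERTS, 3))             # mean (template) mesh
shapedirs = rng.normal(size=(NUM_VERTS, 3, NUM_BETAS))   # shape blendshape basis

def shaped_vertices(betas):
    """Linear shape blendshapes: V(beta) = T_bar + sum_k beta_k * S_k."""
    return v_template + np.einsum('vcb,b->vc', shapedirs, betas)

# Zero shape coefficients recover the mean body shape.
mean_shape = shaped_vertices(np.zeros(NUM_BETAS))
```

The full SMPL model additionally applies pose-dependent blendshapes and linear blend skinning on top of this shaped template; the shape term above is only the first stage.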

The main branch works with PyTorch 1.7+.

If you are interested in multi-view motion capture, please refer to XRMoCap for more details.

https://user-images.githubusercontent.com/62529255/144362861-e794b404-c48f-4ebe-b4de-b91c3fbbaa3b.mp4

Major Features

News

Benchmark and Model Zoo

More details can be found in model_zoo.md.

Supported body models:

- [x] [SMPL](https://smpl.is.tue.mpg.de/) (SIGGRAPH Asia'2015)
- [x] [SMPL-X](https://smpl-x.is.tue.mpg.de/) (CVPR'2019)
- [x] [MANO](https://mano.is.tue.mpg.de/) (SIGGRAPH Asia'2017)
- [x] [FLAME](https://flame.is.tue.mpg.de/) (SIGGRAPH Asia'2017)
- [x] [STAR](https://star.is.tue.mpg.de/) (ECCV'2020)

Supported methods:

- [x] [SMPLify](https://smplify.is.tue.mpg.de/) (ECCV'2016)
- [x] [SMPLify-X](https://smpl-x.is.tue.mpg.de/) (CVPR'2019)
- [x] [HMR](https://akanazawa.github.io/hmr/) (CVPR'2018)
- [x] [SPIN](https://www.seas.upenn.edu/~nkolot/projects/spin/) (ICCV'2019)
- [x] [VIBE](https://github.com/mkocabas/VIBE) (CVPR'2020)
- [x] [HybrIK](https://jeffli.site/HybrIK/) (CVPR'2021)
- [x] [PARE](https://pare.is.tue.mpg.de/) (ICCV'2021)
- [x] [DeciWatch](https://ailingzeng.site/deciwatch) (ECCV'2022)
- [x] [SmoothNet](https://ailingzeng.site/smoothnet) (ECCV'2022)
- [x] [ExPose](https://expose.is.tue.mpg.de) (ECCV'2020)
- [x] [BalancedMSE](https://sites.google.com/view/balanced-mse/home) (CVPR'2022)
- [x] [PyMAF-X](https://www.liuyebin.com/pymaf-x/) (arXiv'2022)
- [x] [CLIFF](configs/cliff) (ECCV'2022)
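Several of the supported methods (SMPLify, SMPLify-X) are optimization-based: they fit model parameters by iteratively minimizing a reprojection-style energy. The sketch below is a toy linear analogue of that idea with a hypothetical `basis` regressor and synthetic targets, not the real SMPLify objective (which also includes pose and shape priors and a nonlinear camera projection):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical linear "regressor": 4 model parameters -> 8 3D joints.
basis = rng.normal(size=(8, 3, 4))
true_params = rng.normal(size=4)
target_joints = np.einsum('jcp,p->jc', basis, true_params)  # synthetic observations

# Gradient descent on the data term 0.5 * ||joints(params) - target||^2.
params = np.zeros(4)
lr = 0.01
for _ in range(1000):
    pred = np.einsum('jcp,p->jc', basis, params)
    resid = pred - target_joints
    grad = np.einsum('jcp,jc->p', basis, resid)  # gradient of the squared error
    params -= lr * grad
```

In practice these methods optimize with more capable solvers (e.g. L-BFGS or Adam via PyTorch autograd) and anneal multiple energy terms, but the fitting loop has this same shape.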

Supported datasets:

- [x] [3DPW](https://virtualhumans.mpi-inf.mpg.de/3DPW/) (ECCV'2018)
- [x] [AGORA](https://agora.is.tue.mpg.de/) (CVPR'2021)
- [x] [AMASS](https://amass.is.tue.mpg.de/) (ICCV'2019)
- [x] [COCO](https://cocodataset.org/#home) (ECCV'2014)
- [x] [COCO-WholeBody](https://github.com/jin-s13/COCO-WholeBody) (ECCV'2020)
- [x] [CrowdPose](https://github.com/Jeff-sjtu/CrowdPose) (CVPR'2019)
- [x] [EFT](https://github.com/facebookresearch/eft) (3DV'2021)
- [x] [GTA-Human](https://caizhongang.github.io/projects/GTA-Human/) (arXiv'2021)
- [x] [Human3.6M](http://vision.imar.ro/human3.6m/description.php) (TPAMI'2014)
- [x] [InstaVariety](https://github.com/akanazawa/human_dynamics/blob/master/doc/insta_variety.md) (CVPR'2019)
- [x] [LSP](https://sam.johnson.io/research/lsp.html) (BMVC'2010)
- [x] [LSP-Extended](https://sam.johnson.io/research/lspet.html) (CVPR'2011)
- [x] [MPI-INF-3DHP](http://gvv.mpi-inf.mpg.de/3dhp-dataset/) (3DV'2017)
- [x] [MPII](http://human-pose.mpi-inf.mpg.de/) (CVPR'2014)
- [x] [Penn Action](http://dreamdragon.github.io/PennAction/) (ICCV'2012)
- [x] [PoseTrack18](https://posetrack.net/users/download.php) (CVPR'2018)
- [x] [SURREAL](https://www.di.ens.fr/willow/research/surreal/data/) (CVPR'2017)
- [x] [UP3D](https://files.is.tuebingen.mpg.de/classner/up/) (CVPR'2017)
- [x] [FreiHand](https://lmb.informatik.uni-freiburg.de/projects/freihand/) (ICCV'2019)
- [x] [EHF](https://smpl-x.is.tue.mpg.de/) (CVPR'2019)
- [x] [Stirling/ESRC-Face3D](http://pics.psych.stir.ac.uk/ESRC/index.htm) (FG'2018)

We will keep pace with the latest progress in the community and add support for more popular methods and frameworks.

If you have any feature requests, please feel free to leave a comment in the wishlist.

Get Started

Please see getting_started.md for the basic usage of MMHuman3D.

License

This project is released under the Apache 2.0 license. Some supported methods may carry additional licenses.

Citation

If you find this project useful in your research, please consider citing:

@misc{mmhuman3d,
    title={OpenMMLab 3D Human Parametric Model Toolbox and Benchmark},
    author={MMHuman3D Contributors},
    howpublished={\url{https://github.com/open-mmlab/mmhuman3d}},
    year={2021}
}

Contributing

We appreciate all contributions that improve MMHuman3D. Please refer to CONTRIBUTING.md for the contributing guidelines.

Acknowledgement

MMHuman3D is an open-source project contributed to by researchers and engineers from both academia and industry. We appreciate all the contributors who implement their methods or add new features, as well as the users who give valuable feedback. We hope the toolbox and benchmark serve the growing research community by providing a flexible toolkit for reimplementing existing methods and developing new models.

Projects in OpenMMLab