MMRazor

OpenMMLab Model Compression Toolbox and Benchmark.
https://mmrazor.readthedocs.io/en/latest/
 
[![PyPI](https://img.shields.io/pypi/v/mmrazor)](https://pypi.org/project/mmrazor) [![docs](https://img.shields.io/badge/docs-latest-blue)](https://mmrazor.readthedocs.io/en/latest/) [![badge](https://github.com/open-mmlab/mmrazor/workflows/build/badge.svg)](https://github.com/open-mmlab/mmrazor/actions) [![codecov](https://codecov.io/gh/open-mmlab/mmrazor/branch/master/graph/badge.svg)](https://codecov.io/gh/open-mmlab/mmrazor) [![license](https://img.shields.io/github/license/open-mmlab/mmrazor.svg)](https://github.com/open-mmlab/mmrazor/blob/master/LICENSE) [![open issues](https://isitmaintained.com/badge/open/open-mmlab/mmrazor.svg)](https://github.com/open-mmlab/mmrazor/issues) [![issue resolution](https://isitmaintained.com/badge/resolution/open-mmlab/mmrazor.svg)](https://github.com/open-mmlab/mmrazor/issues) [📘Documentation](https://mmrazor.readthedocs.io/en/latest/) | [🛠️Installation](https://mmrazor.readthedocs.io/en/latest/get_started/installation.html) | [👀Model Zoo](https://mmrazor.readthedocs.io/en/latest/get_started/model_zoo.html) | [🤔Reporting Issues](https://github.com/open-mmlab/mmrazor/issues/new/choose)
English | [简体中文](README_zh-CN.md)

:star: MMRazor for Large Models is Available Now! Please refer to MMRazorLarge

Introduction

MMRazor is a model compression toolkit for model slimming and AutoML, which covers four mainstream technologies:

- Neural Architecture Search (NAS)
- Pruning
- Knowledge Distillation (KD)
- Quantization

It is a part of the OpenMMLab project.

Major features:

For more details about MMRazor's design and implementation, please refer to the tutorials.

Latest Updates

The default branch is now main, and the code on this branch has been upgraded to v1.0.0. The old master branch code now lives on the 0.x branch.

MMRazor v1.0.0 was released on 2023-04-24. Major updates since 1.0.0rc2 include:

  1. MMRazor quantization is released.
  2. Add a new pruning algorithm named GroupFisher.
  3. Support distilling RTMDet with MMRazor.

To learn more about the updates in MMRazor 1.0, please refer to the Changelog!

Benchmark and model zoo

Results and models are available in the model zoo.

Supported algorithms:

Neural Architecture Search

- [x] [DARTS(ICLR'2019)](configs/nas/mmcls/darts)
- [x] [DetNAS(NeurIPS'2019)](configs/nas/mmdet/detnas)
- [x] [SPOS(ECCV'2020)](configs/nas/mmcls/spos)

Pruning

- [x] [AutoSlim(NeurIPS'2019)](/configs/pruning/mmcls/autoslim)
- [x] [L1-norm](/configs/pruning/mmcls/l1-norm)
- [x] [Group Fisher](/configs/pruning/base/group_fisher)
- [x] [DMCP](/configs/pruning/mmcls/dmcp)

Knowledge Distillation

- [x] [CWD(ICCV'2021)](/configs/distill/mmdet/cwd)
- [x] [WSLD(ICLR'2021)](/configs/distill/mmcls/wsld)
- [x] [ABLoss](/configs/distill/mmcls/abloss)
- [x] [BYOT](/configs/distill/mmcls/byot)
- [x] [DAFL](/configs/distill/mmcls/dafl)
- [x] [DFAD](/configs/distill/mmcls/dfad)
- [x] [DKD](/configs/distill/mmcls/dkd)
- [x] [Factor Transfer](/configs/distill/mmcls/factor_transfer)
- [x] [FitNets](/configs/distill/mmcls/fitnets)
- [x] [KD](/configs/distill/mmcls/kd)
- [x] [OFD](/configs/distill/mmcls/ofd)
- [x] [RKD](/configs/distill/mmcls/rkd)
- [x] [ZSKT](/configs/distill/mmcls/zskt)
- [x] [FBKD](/configs/distill/mmdet/fbkd)

Quantization

- [x] [PTQ](/configs/quantization/ptq/base)
- [x] [QAT](/configs/quantization/qat/base)
- [x] [LSQ](/configs/quantization/qat/lsq)
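
As a rough sketch of how one of the algorithms above is selected, the fragment below shows an OpenMMLab-style config where the algorithm wrapper, student, and teacher are described as nested dicts. This is an illustrative assumption, not a verbatim MMRazor config: the wrapper name `SingleTeacherDistill` and the field names `architecture`, `teacher`, and `cfg_path` are used for illustration only; the files under `/configs/distill` are the authoritative reference.

```python
# Illustrative config fragment (assumption, not copied from the repo):
# OpenMMLab configs describe the chosen algorithm as nested dicts that
# MMEngine builds from its registries at runtime.
model = dict(
    type='SingleTeacherDistill',        # hypothetical distillation wrapper name
    architecture=dict(                  # student network that will be trained
        cfg_path='mmcls::resnet/resnet18_8xb32_in1k.py', pretrained=False),
    teacher=dict(                       # frozen teacher providing soft targets
        cfg_path='mmcls::resnet/resnet34_8xb32_in1k.py', pretrained=True),
)
```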

Installation

MMRazor depends on PyTorch, MMCV and MMEngine.

Please refer to installation.md for more detailed instructions.
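
As a minimal post-installation sanity check (assuming the packages installed successfully), you can import the core dependencies and print their versions:

```python
# Quick post-installation check: confirm MMRazor and its core dependencies
# import correctly and report the installed versions.
import torch
import mmengine
import mmcv
import mmrazor

print('torch:   ', torch.__version__)
print('mmengine:', mmengine.__version__)
print('mmcv:    ', mmcv.__version__)
print('mmrazor: ', mmrazor.__version__)
```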

Getting Started

Please refer to the user guides for the basic usage of MMRazor. Advanced guides are also available in the documentation; a minimal config-loading sketch is shown below.
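
To get a quick feel for the workflow, a shipped config can be loaded and inspected with MMEngine before launching training. The config path below is illustrative; substitute any file under configs/.

```python
# Sketch: load one of the configs in this repository with MMEngine and
# inspect the algorithm it selects. The path is an example, not a requirement.
from mmengine.config import Config

cfg = Config.fromfile(
    'configs/distill/mmcls/kd/kd_logits_resnet34_resnet18_8xb32_in1k.py')
print(cfg.model.type)   # the algorithm wrapper selected by this config
```

Training is then typically launched with the chosen config via the training script under tools/, following the usual OpenMMLab conventions described in the user guides.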

Contributing

We appreciate all contributions to improve MMRazor. Please refer to CONTRIBUTING.md for the contributing guideline.

Acknowledgement

MMRazor is an open-source project contributed to by researchers and engineers from various colleges and companies. We appreciate all the contributors who implement their methods or add new features, as well as users who give valuable feedback. We hope the toolbox and benchmark can serve the growing research community by providing a flexible toolkit for reimplementing existing methods and developing new model compression methods.

Citation

If you find this project useful in your research, please consider citing:

@misc{2021mmrazor,
    title={OpenMMLab Model Compression Toolbox and Benchmark},
    author={MMRazor Contributors},
    howpublished = {\url{https://github.com/open-mmlab/mmrazor}},
    year={2021}
}

License

This project is released under the Apache 2.0 license.

Projects in OpenMMLab