It's All In the Teacher: Zero-Shot Quantization Brought Closer to the Teacher [CVPR 2022 Oral]

This folder contains the official implementation of the paper It's All In the Teacher: Zero-Shot Quantization Brought Closer to the Teacher, built on top of the GDFQ, Qimera, and AutoReCon frameworks.

AIT Performance comparison

Requirements

Setup

We recommend using a Python virtual environment to run this code.

You can install the requirements with the command below.

pip install -r requirements.txt
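
A minimal setup sketch using the standard venv module (the environment name env is just an example) might look like this:

python3 -m venv env
source env/bin/activate
pip install -r requirements.txt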

Folder Structure

ait_code
├── figs
├── AutoReCon_AIT
│   ├── main.py
│   ├── optimizer.py                                # GI implementation
│   ├── option.py 
│   ├── trainer.py
│   ├── {DATASET}_{NETWORK}.hocon                   # Setting files
│   ├── run_{DATASET}_{NETWORK}_{BITWIDTH}bit.sh    # Train scripts
│   └── ...                                         # Utils
├── GDFQ_AIT
│   └── ...                                         # Similar to above
├── Qimera_AIT
│   └── ...                                         # Similar to above
├── LICENSE.md
├── README.md
└── requirements.txt

Training

For ImageNet training, change the path of the validation set in the .hocon file. To train the models described in the paper, run one of these commands:

./run_cifar10_4bit.sh
./run_cifar100_4bit.sh
./run_imgnet_resnet18_4bit.sh
./run_imgnet_resnet50_4bit.sh
./run_imgnet_mobilenet_v2_4bit.sh

The script names are the same for all three experiment frameworks (GDFQ_AIT, Qimera_AIT, and AutoReCon_AIT).
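
For example, to run the 4-bit CIFAR-10 experiment with the GDFQ-based implementation, a sketch would be to run the script from its framework directory (the folder structure above suggests the scripts live inside each framework directory; adjust the directory name for the framework you want):

cd GDFQ_AIT
./run_cifar10_4bit.sh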

Major Arguments

License

This project is licensed under the terms of the GNU General Public License v3.0.