SSF for Efficient Model Tuning

This repo is the official implementation of our NeurIPS 2022 paper "Scaling & Shifting Your Features: A New Baseline for Efficient Model Tuning" (arXiv: https://arxiv.org/pdf/2210.08823.pdf).

Usage

Install

git clone https://github.com/dongzelian/SSF.git
cd SSF
conda create -n ssf python=3.7 -y
conda activate ssf
conda install pytorch==1.7.1 torchvision==0.8.2 cudatoolkit=10.1 -c pytorch
pip install timm==0.6.5
pip install -r requirements.txt
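
To check that the environment is set up correctly, you can run a quick sanity check (a minimal sketch; it only verifies that the pinned PyTorch, torchvision, and timm versions import and that CUDA is visible):

import torch
import torchvision
import timm

# Expect torch 1.7.1, torchvision 0.8.2, and timm 0.6.5 as pinned above.
print("torch:", torch.__version__)
print("torchvision:", torchvision.__version__)
print("timm:", timm.__version__)
print("CUDA available:", torch.cuda.is_available())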

Data preparation

You can follow VPT to download and prepare the datasets.

Since the original VTAB dataset is processed with TensorFlow scripts and the processing of some datasets is tricky, we also upload the extracted vtab-1k dataset to OneDrive for your convenience. You can download it from here and then use it directly with our vtab.py. (Note that the license is that of the original VTAB dataset.)
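
As a rough illustration of how the extracted vtab-1k data can be consumed, here is a minimal PyTorch dataset sketch. The split-file name train800val200.txt and the "relative_image_path label" line format are assumptions based on the NOAH-style layout; the repo's vtab.py is the authoritative loader.

import os
from PIL import Image
from torch.utils.data import Dataset

class VtabSplitSketch(Dataset):
    # Toy reader for one vtab-1k dataset split (hypothetical layout).
    def __init__(self, root, split_file, transform=None):
        self.root = root
        self.transform = transform
        # Each line is assumed to be: <relative image path> <integer label>
        with open(os.path.join(root, split_file)) as f:
            self.samples = [line.strip().rsplit(" ", 1) for line in f if line.strip()]

    def __len__(self):
        return len(self.samples)

    def __getitem__(self, idx):
        path, label = self.samples[idx]
        img = Image.open(os.path.join(self.root, path)).convert("RGB")
        if self.transform is not None:
            img = self.transform(img)
        return img, int(label)

# e.g. VtabSplitSketch("data/vtab-1k/cifar", "train800val200.txt")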

Prepare ImageNet-A, ImageNet-R, and ImageNet-C for the robustness & OOD evaluation below.

Pre-trained model preparation
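
Since the code is built on timm, the pre-trained backbones can be obtained through timm.create_model. Below is a minimal sketch (the model name is only an example of an ImageNet-21K pre-trained ViT-B/16 in timm 0.6.x; the exact backbone and checkpoint used by each training script are configured inside the script):

import timm

# Download an ImageNet-21K pre-trained ViT-B/16 backbone via timm.
model = timm.create_model("vit_base_patch16_224_in21k", pretrained=True)
print(sum(p.numel() for p in model.parameters()) / 1e6, "M parameters")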

Fine-tuning a pre-trained model via SSF

To fine-tune a pre-trained ViT model via SSF on CIFAR-100 or ImageNet-1K, run:

bash train_scripts/vit/cifar_100/train_ssf.sh

or

bash train_scripts/vit/imagenet_1k/train_ssf.sh

You can also find similar scripts for the Swin, ConvNeXt, and AS-MLP models, with which you can easily reproduce our results. Enjoy!
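
For intuition, the core SSF operation is a learnable per-channel scale and shift applied to intermediate features, and only these parameters (plus the task head) are updated during fine-tuning while the backbone stays frozen. The snippet below is a conceptual sketch of this idea rather than the repo's actual module (the class name and the exact initialization are illustrative; see the model files in this repo for the real implementation):

import torch
import torch.nn as nn

class SSFScaleShift(nn.Module):
    # Illustrative SSF layer: y = gamma * x + beta along the channel dimension.
    def __init__(self, dim):
        super().__init__()
        # Initialized near identity so the pre-trained behavior is preserved at the start
        # (the repo uses its own initialization scheme).
        self.gamma = nn.Parameter(torch.ones(dim))
        self.beta = nn.Parameter(torch.zeros(dim))

    def forward(self, x):
        # x: (..., dim); broadcast the per-channel scale and shift.
        return x * self.gamma + self.beta

# During fine-tuning, the backbone weights are frozen and only the SSF
# parameters and the classification head are trained, e.g.:
#   for p in backbone.parameters():
#       p.requires_grad = False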

Robustness & OOD

To evaluate the robustness & OOD performance of a model fine-tuned via SSF, run:

bash train_scripts/vit/imagenet_a(r, c)/eval_ssf.sh

where imagenet_a(r, c) stands for the imagenet_a, imagenet_r, or imagenet_c script directory, depending on which benchmark you evaluate.

Citation

If this project is helpful for you, please cite our paper:

@InProceedings{Lian_2022_SSF,
  title={Scaling \& Shifting Your Features: A New Baseline for Efficient Model Tuning},
  author={Lian, Dongze and Zhou, Daquan and Feng, Jiashi and Wang, Xinchao},
  booktitle={Advances in Neural Information Processing Systems (NeurIPS)},
  year={2022}
}

Acknowledgement

The code is built upon timm. The processing of the vtab-1k dataset follows VPT, the VTAB GitHub repo, and NOAH.