
BossNAS

This repository contains the PyTorch code and pretrained models for our paper: BossNAS: Exploring Hybrid CNN-transformers with Block-wisely Self-supervised Neural Architecture Search (ICCV 2021).

Illustration of the Siamese supernets training with ensemble bootstrapping.
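As a rough intuition for this training scheme, the sketch below shows one step of ensemble bootstrapping for a single searchable block: a randomly sampled path of the online supernet learns to predict the ensembled output of the candidate paths of a momentum-updated target supernet, in the spirit of BYOL. This is a minimal sketch under our own simplifications (function names, the projector/predictor heads, and the loss form are our assumptions), not the repository's actual training code.

```python
import random
import torch
import torch.nn.functional as F

def ensemble_bootstrap_step(online_paths, online_proj, predictor,
                            target_paths, target_proj, x, optimizer, m=0.99):
    # Target signal: the ensembled (averaged) projected output of every
    # candidate path, computed by the target network without gradients.
    with torch.no_grad():
        target = F.normalize(
            torch.stack([target_proj(p(x)) for p in target_paths]).mean(dim=0),
            dim=-1,
        )

    # A randomly sampled online path learns to predict that ensemble.
    path = random.choice(online_paths)
    pred = F.normalize(predictor(online_proj(path(x))), dim=-1)
    loss = (2.0 - 2.0 * (pred * target).sum(dim=-1)).mean()  # BYOL-style loss

    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

    # The target network trails the online network via a momentum update.
    with torch.no_grad():
        for o, t in [(online_proj, target_proj), *zip(online_paths, target_paths)]:
            for w_o, w_t in zip(o.parameters(), t.parameters()):
                w_t.mul_(m).add_(w_o, alpha=1.0 - m)
    return loss.item()
```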

Illustration of the fabric-like Hybrid CNN-transformer Search Space with flexible down-sampling positions.
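To make "fabric-like with flexible down-sampling positions" concrete, here is a toy enumeration of one searchable stage: every layer picks a CNN-style or transformer-style block, and the position where the stage halves its resolution is itself a search decision. The block names, layer count, and downsampling scheme are illustrative assumptions of ours, not the repository's exact space definition.

```python
import itertools

BLOCK_CHOICES = ["ResConv", "ResAtt"]  # CNN-style vs. transformer-style block
NUM_LAYERS = 4                          # searchable layers in one toy stage

def enumerate_paths():
    """Yield (operators, downsample_at) pairs: every operator assignment
    combined with every position at which the stage halves its resolution."""
    for ops in itertools.product(BLOCK_CHOICES, repeat=NUM_LAYERS):
        for downsample_at in range(NUM_LAYERS + 1):  # may also skip downsampling
            yield ops, downsample_at

print(sum(1 for _ in enumerate_paths()), "candidate paths in this toy stage")  # 2**4 * 5 = 80
```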

Our Results and Trained Models

Usage

1. Requirements
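The pinned dependency list ships with the repository itself. As this is a PyTorch project, a typical environment (package names here are an assumption, not taken from the source) can be set up with:

pip install torch torchvision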

2. Retrain or evaluate our BossNet-T models

Architecture of our BossNet-T0
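The repository's scripts drive retraining and evaluation end to end. As a hedged illustration of the evaluation half, the loop below is a generic top-1 accuracy check; the function name and its arguments are placeholders of ours, not the repository's API, and the model would be a BossNet-T checkpoint loaded beforehand with torch.load.

```python
import torch

def evaluate_top1(model, loader, device="cuda"):
    """Generic top-1 accuracy loop for an ImageNet-style validation set."""
    model.to(device).eval()
    correct = total = 0
    with torch.no_grad():
        for images, labels in loader:
            logits = model(images.to(device))
            correct += (logits.argmax(dim=1) == labels.to(device)).sum().item()
            total += labels.numel()
    return correct / total
```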

3. Evaluate architecture rating accuracy of BossNAS
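Rating accuracy here refers to how well the supernet's per-architecture scores rank candidate architectures relative to their true accuracies, commonly measured with ranking correlations such as Spearman's rho and Kendall's tau. A minimal sketch of such a check (the function name and the toy numbers are ours; the benchmarks and exact protocol are in the repository):

```python
from scipy import stats

def rating_accuracy(predicted_scores, ground_truth_accs):
    """Spearman rho and Kendall tau between predicted and true rankings."""
    rho, _ = stats.spearmanr(predicted_scores, ground_truth_accs)
    tau, _ = stats.kendalltau(predicted_scores, ground_truth_accs)
    return rho, tau

# Toy usage with made-up numbers:
print(rating_accuracy([0.2, 0.5, 0.9, 0.4], [70.1, 72.3, 75.8, 71.0]))
```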

4. Search architectures with BossNAS

First, go to the searching code directory:

cd searching
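From here, the repository's own scripts under searching/ drive the search. Conceptually, once every block's candidate paths have been rated by their evaluation losses, the final architecture is assembled by taking the best-rated path in each block. A hedged toy sketch of that selection step (path names and losses below are made up):

```python
def select_architecture(eval_loss_per_block):
    """eval_loss_per_block: one {path_description: evaluation_loss} dict per
    supernet block. A lower loss means a better-rated path."""
    return [min(block.items(), key=lambda kv: kv[1])[0] for block in eval_loss_per_block]

# Toy usage: two blocks, two candidate paths each.
arch = select_architecture([
    {"ResConv-ResConv": 0.41, "ResAtt-ResConv": 0.38},
    {"ResConv-ResAtt": 0.29, "ResAtt-ResAtt": 0.33},
])
print(arch)  # ['ResAtt-ResConv', 'ResConv-ResAtt']
```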

Citation

If you use our code for your paper, please cite:

@inproceedings{li2021bossnas,
  author = {Li, Changlin and
            Tang, Tao and
            Wang, Guangrun and
            Peng, Jiefeng and
            Wang, Bing and
            Liang, Xiaodan and
            Chang, Xiaojun},
  title = {{B}oss{NAS}: Exploring Hybrid {CNN}-transformers with Block-wisely Self-supervised Neural Architecture Search},
  booktitle = {ICCV},
  year = 2021,
}