# LipGrow

An adaptive training algorithm for residual networks, based on an estimate of the model's Lipschitz constant.
## Installation

```bash
git clone --recursive https://github.com/shwinshaker/LipGrow.git
```
## Setup

* `./data`: directory which includes the datasets
* `./checkpoints`: directory to save the training output

## Training

* CIFAR-10/100: `./launch.sh`
* Tiny-ImageNet: `./imagenet-launch.sh`
## Recipes

* Regular training: set `grow=false`
* Training with a fixed growing schedule: set `grow='fixed'`, and provide the grow epochs `dupEpoch`
* Adaptive training: set `grow='adapt'`, and use the adaptive cosine learning-rate scheduler `scheduler='adacosine'`
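The adaptive recipe rests on two ingredients: an estimate of the model's Lipschitz constant (which governs when to grow) and a cosine learning-rate schedule whose period can change as the network grows. As a rough illustration only, here is a minimal NumPy sketch of both; `lipschitz_estimate` and `adacosine_lr` are hypothetical helpers, not this repository's API, and the schedule's exact form here is a simplification.

```python
import numpy as np

def lipschitz_estimate(W, iters=50):
    """Estimate the Lipschitz constant (spectral norm) of a linear map W
    via power iteration. Hypothetical helper -- the repo's actual
    estimator over residual blocks may differ."""
    v = np.random.RandomState(0).randn(W.shape[1])
    for _ in range(iters):
        u = W @ v
        u /= np.linalg.norm(u)      # left singular direction
        v = W.T @ u
        v /= np.linalg.norm(v)      # right singular direction
    return float(u @ (W @ v))       # approximates the top singular value

def adacosine_lr(epoch, period, lr_max=0.1, lr_min=1e-4):
    """Cosine-annealed learning rate over an adjustable period; the idea
    (an assumption here) is that the period is reset whenever the model
    grows, matching the 'adacosine' recipe above in spirit."""
    t = (epoch % period) / period
    return lr_min + 0.5 * (lr_max - lr_min) * (1 + np.cos(np.pi * t))
```

For example, `lipschitz_estimate(np.diag([3.0, 1.0]))` returns approximately `3.0` (the largest singular value), and `adacosine_lr(0, period=10)` starts at `lr_max` before decaying along the cosine curve.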
## Citation

If you find our algorithm helpful, please consider citing our paper:

> Towards Adaptive Residual Network Training: A Neural-ODE Perspective

```
@inproceedings{Dong2020TowardsAR,
  title={Towards Adaptive Residual Network Training: A Neural-ODE Perspective},
  author={Chengyu Dong and Liyuan Liu and Zichao Li and Jingbo Shang},
  booktitle={International Conference on Machine Learning (ICML)},
  year={2020}
}
```