HobbitLong / RepDistiller

[ICLR 2020] Contrastive Representation Distillation (CRD), and benchmark of recent knowledge distillation methods
BSD 2-Clause "Simplified" License
2.17k stars, 395 forks

Compatibility for `torch==1.12.1` #55

Closed sieu-n closed 11 months ago

sieu-n commented 1 year ago

I checked that this works for recent versions (torch==1.12.1, torchvision==0.13.1) and old versions (torch==1.3.1, torchvision==0.4.2).

As mentioned in #14, I made it backward-compatible with previous versions.
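The diff itself is not quoted in this thread, but backward-compatible fixes of this kind typically gate changed APIs on the installed torch version rather than comparing version strings directly. A minimal sketch of that pattern (the `parse_version` helper, the hard-coded version string, and the gating threshold are illustrative assumptions, not code from this pull request):

```python
def parse_version(version):
    # "1.12.1+cu113" -> (1, 12, 1): drop any local build suffix and compare
    # numerically, since lexical ordering puts "1.12.1" before "1.3.1".
    return tuple(int(part) for part in version.split("+")[0].split(".")[:3])

# In real code the string would come from torch.__version__; it is
# hard-coded here so the sketch runs without torch installed.
TORCH_VERSION = parse_version("1.12.1")

# Gate an API that changed between releases (threshold is illustrative).
USE_NEW_API = TORCH_VERSION >= (1, 4)
```

Comparing tuples instead of raw strings is the key point: `"1.12.1" < "1.3.1"` is true lexically, which would silently pick the wrong code path.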

sieu-n commented 1 year ago

Code for a quick test in Colab:

!git clone https://github.com/krenerd/RepDistiller
%cd RepDistiller

!sh scripts/fetch_pretrained_teachers.sh
!pip install tensorboard_logger

!python train_student.py \
--path_t ./save/models/resnet32x4_vanilla/ckpt_epoch_240.pth \
--distill kd \
--model_s resnet8x4 \
-r 0.1 \
-a 0.9 \
-b 0 \
--trial 1

# backward compatibility (tested with torch==1.3.1 / torchvision==0.4.2)
!pip install torchvision==0.4.2
!python train_student.py \
--path_t ./save/models/resnet32x4_vanilla/ckpt_epoch_240.pth \
--distill kd \
--model_s resnet8x4 \
-r 0.1 \
-a 0.9 \
-b 0 \
--trial 1

HobbitLong commented 11 months ago

Thank you @sieu-n for the pull request. Merged!