haitongli / knowledge-distillation-pytorch

A PyTorch implementation for exploring deep and shallow knowledge distillation (KD) experiments with flexibility
MIT License

Box Folder #21

Closed LesiChou closed 2 years ago

LesiChou commented 5 years ago

I can't download the box folder. Could someone send these files to my mailbox? Thank you so much!

parquets commented 4 years ago

> I can't download the box folder. Could someone send these files to my mailbox? Thank you so much!

Did you get the pretrained model?

haitongli commented 2 years ago

A new link to the "experiments" folder containing the models has been added to the readme.