haitongli / knowledge-distillation-pytorch

A PyTorch implementation for exploring deep and shallow knowledge distillation (KD) experiments with flexibility
MIT License
1.86k stars 344 forks

Can't get the pretrained model #1

Closed flygyyy closed 6 years ago

flygyyy commented 6 years ago

I can't open the URL to download the pretrained teacher model checkpoints. Can you offer another way?

haitongli commented 6 years ago

an alternative folder link: https://office365stanford-my.sharepoint.com/:u:/g/personal/haitongl_stanford_edu/ESqCguWIU1JHoMJJyu8eDIABcp9UX5VzPp1VQvhzNjllGw?e=HMPjwA

Let me know if this works (if you can access Box.com and/or OneDrive, both the original README link and this one should work fine).

flygyyy commented 6 years ago

The link works fine, thanks!