haitongli / knowledge-distillation-pytorch

A PyTorch implementation for exploring deep and shallow knowledge distillation (KD) experiments with flexibility
MIT License

missing training log for base cnn #13

Open hughperkins opened 5 years ago

hughperkins commented 5 years ago

https://github.com/peterliht/knowledge-distillation-pytorch/blob/master/experiments/base_cnn/train.log

The committed train.log contains only the startup lines below (the run appears to have been restarted three times) and stops right after "Epoch 1/30", so the per-epoch training output is missing:

2018-03-09 20:46:06,587:INFO: Loading the datasets...
2018-03-09 20:46:10,074:INFO: - done.
2018-03-09 20:46:10,078:INFO: Starting training for 30 epoch(s)
2018-03-09 20:51:27,485:INFO: Loading the datasets...
2018-03-09 20:51:30,918:INFO: - done.
2018-03-09 20:51:30,922:INFO: Starting training for 30 epoch(s)
2018-03-09 20:54:20,870:INFO: Loading the datasets...
2018-03-09 20:54:24,364:INFO: - done.
2018-03-09 20:54:24,368:INFO: Starting training for 30 epoch(s)
2018-03-09 20:54:24,368:INFO: Epoch 1/30
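For reference, per-epoch lines in the same timestamped format could be produced by a logging setup like the following. This is a minimal sketch, not the repo's actual code: the `set_logger` name and the placeholder metric values are illustrative assumptions.

```python
import io
import logging

def set_logger(stream):
    # Hypothetical helper: configure a logger whose output format matches
    # the "2018-03-09 20:46:06,587:INFO: ..." lines seen in train.log.
    logger = logging.getLogger("kd_example")
    logger.setLevel(logging.INFO)
    handler = logging.StreamHandler(stream)
    handler.setFormatter(logging.Formatter("%(asctime)s:%(levelname)s: %(message)s"))
    logger.addHandler(handler)
    return logger

buf = io.StringIO()
logger = set_logger(buf)

num_epochs = 30
logger.info("Starting training for %d epoch(s)", num_epochs)
for epoch in range(2):  # two epochs, for illustration only
    logger.info("Epoch %d/%d", epoch + 1, num_epochs)
    # Placeholder metrics -- a complete log would record the real values here.
    logger.info("- Train metrics: accuracy: 0.500 ; loss: 1.000")

print(buf.getvalue())
```

With a logger like this, each epoch's metrics would appear in train.log after the "Epoch N/30" line, which is the detail the committed log is missing.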