SforAiDl / KD_Lib

A Pytorch Knowledge Distillation library for benchmarking and extending works in the domains of Knowledge Distillation, Pruning, and Quantization.
https://kd-lib.readthedocs.io/
MIT License

Paper: Data-Distortion Guided Self-Distillation for Deep Neural Networks #127

Open yiqings opened 2 years ago

yiqings commented 2 years ago

Description

1. A self-distillation scheme built on distilling different augmented/distorted views of the same image through the same student (see the sketch after this list).
2. An MMD loss distilling the features between the different augmented/distorted views.
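
To make the scheme concrete, here is a minimal sketch of one training step, assuming a standard PyTorch classifier that returns logits; the function name `ddgsd_step`, the temperature, and the loss weighting are illustrative choices rather than the paper's exact settings, and the feature-level MMD term is omitted (see Modifications below).

```python
import torch
import torch.nn.functional as F

def ddgsd_step(student, x_view1, x_view2, targets, temperature=3.0, kd_weight=1.0):
    """One training step of the data-distortion guided self-distillation scheme:
    the same student sees two differently distorted views of a batch, and the
    softened predictions of each view are distilled into the other."""
    logits1 = student(x_view1)
    logits2 = student(x_view2)

    # Standard supervised loss on both views.
    ce = F.cross_entropy(logits1, targets) + F.cross_entropy(logits2, targets)

    # Symmetric KL divergence between the softened predictions of the two views.
    log_p1 = F.log_softmax(logits1 / temperature, dim=1)
    log_p2 = F.log_softmax(logits2 / temperature, dim=1)
    kl = F.kl_div(log_p1, log_p2.exp(), reduction="batchmean") + \
         F.kl_div(log_p2, log_p1.exp(), reduction="batchmean")
    kl = kl * (temperature ** 2)

    # The paper's additional MMD term between intermediate features is omitted here.
    return ce + kd_weight * kl

# Usage: x_view1 and x_view2 are two independent random distortions of the same batch.
# loss = ddgsd_step(model, x_view1, x_view2, targets); loss.backward()
```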

Modifications

Removing the MMD loss and retaining only the KL loss is probably fine,
since the KL loss alone already demonstrates competitive performance.

In my local experiments on CIFAR-10/100, the method proves to be a very powerful self-distillation scheme even in the absence of the MMD loss.

It also demonstrates strong compatibility with other distillation schemes and can serve as a component within them.
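
For completeness, if the feature-level MMD term is kept, a generic squared MMD with a Gaussian kernel between the two views' features could look like the sketch below; `mmd_loss`, the `sigma` bandwidth, and the `(batch, dim)` feature shape are assumptions for illustration and are not taken from the paper's official code.

```python
import torch

def mmd_loss(feat1, feat2, sigma=1.0):
    """Squared MMD between two batches of features using a Gaussian (RBF) kernel.
    feat1, feat2: (batch, dim) feature tensors from the two distorted views."""
    def rbf(a, b):
        dists = torch.cdist(a, b).pow(2)            # pairwise squared distances
        return torch.exp(-dists / (2 * sigma ** 2))

    k_11 = rbf(feat1, feat1).mean()
    k_22 = rbf(feat2, feat2).mean()
    k_12 = rbf(feat1, feat2).mean()
    return k_11 + k_22 - 2 * k_12
```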

yiqings commented 2 years ago

https://github.com/youngerous/ddgsd-pytorch provides an unofficial implementation.

NeelayS commented 2 years ago

Hi @yiqings, thanks for raising this issue. Unfortunately, development for KD-Lib has stalled for now, but we will be sure to keep this issue in mind when / if we resume. Also, do let me know if you would be interested in contributing an implementation for this paper.