A PyTorch Knowledge Distillation library for benchmarking and extending works in the domains of Knowledge Distillation, Pruning, and Quantization.
Making a pipeline for Pruning, Quantization and Knowledge Distillation #107
Open · Het-Shah opened 3 years ago
Currently, the user has to import and run all three techniques independently. A pipeline abstraction would streamline the entire workflow for the end user.
The user should be able to compose pipelines such as [KD, Pruning] or [KD, Quantization].
We can discuss how we want to go about doing this.
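One possible direction is a minimal sketch along the following lines. The `Pipeline` class, its `run` method, and the stage interface are hypothetical, proposed here for discussion rather than taken from the library's existing API; the sketch assumes each stage (KD, pruning, or quantization) can be wrapped as a callable that takes a `torch.nn.Module` and returns the transformed module.

```python
from typing import Callable, List

import torch.nn as nn


class Pipeline:
    """Hypothetical: chains compression stages in order, so that
    [KD, Pruning] runs distillation first and prunes the result."""

    def __init__(self, stages: List[Callable[[nn.Module], nn.Module]]):
        self.stages = stages

    def run(self, model: nn.Module) -> nn.Module:
        # Each stage consumes a model and returns the transformed model,
        # so the output of KD becomes the input to pruning, and so on.
        for stage in self.stages:
            model = stage(model)
        return model
```

Usage would then look something like `Pipeline([distill_stage, prune_stage]).run(student_model)`, where `distill_stage` and `prune_stage` are hypothetical adapters wrapping the library's existing KD and pruning routines behind this common interface. The main design question is whether such a uniform model-in, model-out contract is enough, since KD stages also need a teacher model, data loaders, and an optimizer.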