NVlabs / DeepInversion

Official PyTorch implementation of Dreaming to Distill: Data-free Knowledge Transfer via DeepInversion (CVPR 2020)

Code for Other Experiments #3

Open gnobitab opened 4 years ago

gnobitab commented 4 years ago

Hi,

Your work is amazing! I'm wondering whether you are going to release the code for: 1. data-free pruning, 2. data-free knowledge transfer, and 3. data-free continual learning.

Thanks!

pamolchanov commented 4 years ago

We have restrictions on providing trained models, and this complicates a full code release. Most likely we will release the code for experiment 2 (data-free knowledge transfer) soon and will try to do the same for the other experiments. We will keep this thread updated.

hungz23 commented 3 years ago

Hi, is the code for knowledge distillation and data-free knowledge transfer ready? Thanks.

dungdinhanh commented 3 years ago

> We have restrictions on providing trained models, and this complicates a full code release. Most likely we will release the code for experiment 2 (data-free knowledge transfer) soon and will try to do the same for the other experiments. We will keep this thread updated.

Hi. Is the code for data-free knowledge transfer ready? Thanks!

Sharpiless commented 2 years ago

> We have restrictions on providing trained models, and this complicates a full code release. Most likely we will release the code for experiment 2 (data-free knowledge transfer) soon and will try to do the same for the other experiments. We will keep this thread updated.

Still waiting for the code.

tallesbrito commented 1 year ago

Hi all,

Is there a full implementation of ADI (Adaptive DeepInversion) in this repository? Wouldn't I need to use knowledge distillation from teacher to student in order to enable ADI, as described in the paper?

I think the implementation of ADI provided with the ImageNet experiments in this repository is incomplete (see the --adi_scale option).

Can someone clarify this?
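
For reference, here is a rough sketch of my understanding of the competition term from the paper, 1 - JS(teacher, student), written in PyTorch. The function name, the temperature value, and the `adi_scale` weighting are my own placeholders based on the paper and the --adi_scale flag, not code taken from this repository:

```python
import torch
import torch.nn.functional as F

def adi_competition_loss(teacher_logits, student_logits, T=3.0):
    # Sketch of the ADI competition term: 1 - JS(p_teacher, p_student).
    # Minimizing it steers synthesized images toward inputs where the
    # teacher and the student disagree.
    p = F.softmax(teacher_logits / T, dim=1)
    q = F.softmax(student_logits / T, dim=1)
    m = 0.5 * (p + q)
    # Jensen-Shannon divergence as the average of two KL terms against the mixture m.
    js = 0.5 * F.kl_div(m.log(), p, reduction="batchmean") \
       + 0.5 * F.kl_div(m.log(), q, reduction="batchmean")
    return 1.0 - torch.clamp(js, 0.0, 1.0)

# Hypothetical usage during image synthesis, weighted by something like --adi_scale:
#   loss = deepinversion_loss + adi_scale * adi_competition_loss(teacher(x), student(x))
# Only the synthesized inputs x are optimized at this stage, not the network weights.
```

If this is roughly what the missing piece looks like, then enabling --adi_scale without a student network and this extra term would indeed not give the full ADI setup from the paper.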