ssnl / dataset-distillation

Open-source code for paper "Dataset Distillation"
https://ssnl.github.io/dataset_distillation
MIT License
777 stars 115 forks

Can the training process on distilled data be conducted in main.py? #44

Closed NiaLiu closed 2 years ago

NiaLiu commented 2 years ago

Thanks for your great work.

I have a question regarding the code in the dataset-distillation repo. If I understood correctly, after distilling the images, we can train AlexCifarnet on the distilled CIFAR-10 data and then test the trained model on the original CIFAR-10 test data.

I have gone over the code, but I didn't find the snippet that trains on the distilled images. If the training process is actually present in the code, could you please note down the command for training on the distilled CIFAR-10 data after the distillation step?

Looking forward to your reply and hope you have an amazing day! Thank you and best regards, Dai

NiaLiu commented 2 years ago

If someone has the same issue as mine, make sure the --distill_lr flag is set correctly.

ssnl commented 2 years ago

Training of distilled images is at https://github.com/SsnL/dataset-distillation/blob/master/train_distilled_image.py

Applying distilled images, i.e., using them to train new networks, is at https://github.com/SsnL/dataset-distillation/blob/3118db69435c09a6ba46c90b696a453e00190514/basics.py#L57-L83
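To make the linked code concrete: applying distilled images just means taking a few gradient steps on the tiny distilled set (with the learned per-step learning rates) and then evaluating on real test data. Below is a minimal, hypothetical sketch of that inner loop; `train_on_distilled`, the toy linear model, and the constant learning-rate list are my own stand-ins, not names from the repo.

```python
import torch
import torch.nn as nn

def train_on_distilled(model, distilled_x, distilled_y, lrs):
    """Take one plain SGD step per learning rate on the distilled batch.

    Simplified stand-in for the inner training loop the repo uses to
    apply distilled images to a fresh network.
    """
    criterion = nn.CrossEntropyLoss()
    for lr in lrs:
        loss = criterion(model(distilled_x), distilled_y)
        model.zero_grad()
        loss.backward()
        with torch.no_grad():
            for p in model.parameters():
                p -= lr * p.grad  # manual SGD step with the given lr
    return model

torch.manual_seed(0)
model = nn.Linear(8, 2)           # toy stand-in for AlexCifarnet
x = torch.randn(10, 8)            # stand-in for 10 distilled images
y = torch.randint(0, 2, (10,))    # stand-in for distilled labels

before = nn.CrossEntropyLoss()(model(x), y).item()
train_on_distilled(model, x, y, lrs=[0.1] * 5)
after = nn.CrossEntropyLoss()(model(x), y).item()
print(after < before)
```

In the actual repo the learning rates are themselves learned during distillation (which is why --distill_lr matters), and the trained network is then evaluated on the original test set rather than on the distilled batch.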

NiaLiu commented 2 years ago

> Training of distilled images is at https://github.com/SsnL/dataset-distillation/blob/master/train_distilled_image.py
>
> Applying distilled images, i.e., using them to train new networks, is at https://github.com/SsnL/dataset-distillation/blob/3118db69435c09a6ba46c90b696a453e00190514/basics.py#L57-L83

Thanks for the detailed instruction (Thumbs up), really appreciate it!