Closed NiaLiu closed 2 years ago
If someone has the same issue as mine, make sure that --distill_lr is set correctly.
Training of distilled images is at https://github.com/SsnL/dataset-distillation/blob/master/train_distilled_image.py
Applying distilled images, i.e., using them to train new networks, is at https://github.com/SsnL/dataset-distillation/blob/3118db69435c09a6ba46c90b696a453e00190514/basics.py#L57-L83
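To make the "applying" step concrete, here is a minimal sketch of what training a fresh network on distilled images looks like: a handful of synthetic images (one per CIFAR-10 class) and a learned step size are used for a few gradient steps. The tensors, `TinyNet`, and `distill_lr` below are illustrative stand-ins, not the repo's actual code or model.

```python
# Hedged sketch of training a new network on distilled images.
# All names here (TinyNet, distilled_x, distill_lr) are illustrative
# stand-ins, not the repo's actual classes or variables.
import torch
import torch.nn as nn
import torch.nn.functional as F

torch.manual_seed(0)

# Stand-ins for distilled CIFAR-10 data: 10 images (one per class),
# each 3x32x32, plus a learned step size (the --distill_lr analogue).
distilled_x = torch.randn(10, 3, 32, 32)
distilled_y = torch.arange(10)
distill_lr = 0.02

class TinyNet(nn.Module):
    """A deliberately small conv net; the repo uses its own architectures."""
    def __init__(self, num_classes=10):
        super().__init__()
        self.conv = nn.Conv2d(3, 16, 3, padding=1)
        self.fc = nn.Linear(16 * 16 * 16, num_classes)

    def forward(self, x):
        x = F.max_pool2d(F.relu(self.conv(x)), 2)  # -> (N, 16, 16, 16)
        return self.fc(x.flatten(1))

net = TinyNet()
opt = torch.optim.SGD(net.parameters(), lr=distill_lr)

# A few gradient steps on the tiny distilled batch; after this, the
# network would be evaluated on the real CIFAR-10 test set.
losses = []
for step in range(20):
    opt.zero_grad()
    loss = F.cross_entropy(net(distilled_x), distilled_y)
    loss.backward()
    opt.step()
    losses.append(loss.item())
```

The point of the sketch is that "applying" distilled images is just ordinary supervised training, only on a tiny synthetic batch with a learned learning rate; the evaluation afterwards uses the original test data.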
Thanks for the detailed instructions (thumbs up), really appreciate it!
Thanks for your great work.
I have a question regarding the code in the dataset-distillation repo. If I understood correctly, after distilling the images, we can train AlexCifarNet on the distilled CIFAR-10 data and then test the trained model on the original CIFAR-10 test data.
I have gone over the code, but I couldn't find the snippet that trains on the distilled images. If that training step is present in the code, could you please note down the command for training on the distilled CIFAR-10 data after distillation?
Looking forward to your reply, and I hope you have an amazing day! Thank you and best regards, Dai