eml-eda / mixprec-pruning

Regarding code issues #2

Open tang12138-yyhh opened 19 hours ago

tang12138-yyhh commented 19 hours ago

What is the relationship between these scripts: icl_PIT+mixprec_training_cost_size.py, icl_PIT_training.py, icl_mixprec_training_cost_size.py, and icl_mixprec_training_cost_general.py? What does each of these four files implement? Also, in icl_PIT+mixprec_training_cost_size.py I see that the final_best_finetuning.ckp and final_best_search.ckp files are used, but when I run it these two files are missing. Where do these files come from?

bmots commented 10 hours ago

Hello @tang12138-yyhh,

each of those scripts trains a model with a different technique, which I detail below (in short, based on their names):

- icl_PIT_training.py: pruning-only training with PIT, the DNAS-based structured pruning method.
- icl_mixprec_training_cost_size.py: mixed-precision quantization search driven by a model-size cost regularizer.
- icl_mixprec_training_cost_general.py: the same mixed-precision search, but with the more general cost model.
- icl_PIT+mixprec_training_cost_size.py: the combined flow, which applies the mixed-precision search on top of a PIT-pruned model.

All four share the same cost-regularized training recipe, sketched below.
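Here is a minimal sketch of that shared pattern: a task loss plus a strength-weighted, differentiable cost term. The function and attribute names (e.g., `get_cost`) are illustrative placeholders, not the repository's actual code:

```python
import torch

def train_step(model, inputs, targets, criterion, optimizer, strength):
    """One cost-regularized optimization step (illustrative only)."""
    optimizer.zero_grad()
    output = model(inputs)
    task_loss = criterion(output, targets)
    # Differentiable cost of the current architecture / precision
    # assignment (e.g., model size); `get_cost` is a hypothetical
    # placeholder for whichever cost metric the script optimizes.
    loss = task_loss + strength * model.get_cost()
    loss.backward()
    optimizer.step()
    return loss.item()
```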

Regarding final_best_search.ckp and final_best_finetuning.ckp: these checkpoints are not shipped with the repository. They are the outputs of an earlier training run: final_best_search.ckp is saved at the end of a search phase and final_best_finetuning.ckp at the end of the subsequent fine-tuning. So, before launching icl_PIT+mixprec_training_cost_size.py, you first need to run the script that produces them (icl_PIT_training.py, since the PIT+mixprec flow builds on a pruned model).

The other files in the repository follow a similar structure but refer to different benchmarks: the aforementioned scripts consider CIFAR-10, the kws scripts the Google Speech Commands v2 dataset, the tin ones Tiny ImageNet, and the imn one ImageNet.

Finally, the proposed approach is also available in the new version of the PLiNIO library (https://github.com/eml-eda/plinio), with more regularizers and a more user-friendly structure. If you are interested, have a look there!
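For reference, the PLiNIO flow is roughly: wrap the model, train with the regularized loss, then export the optimized network. A simplified, untested sketch follows; the `PIT` wrapper, `cost` attribute, and `export()` call follow the library's documented pattern, but please check the PLiNIO README for the exact current API, and `MyCNN` / `train_loader` are placeholders for your own model and data:

```python
import torch
from plinio.methods import PIT  # see the README for the current import path

model = MyCNN()                                   # your PyTorch model
pit_model = PIT(model, input_shape=(3, 32, 32))   # wrap for the pruning search

optimizer = torch.optim.Adam(pit_model.parameters())
criterion = torch.nn.CrossEntropyLoss()
strength = 1e-6  # cost-regularization strength (task-dependent)

for inputs, targets in train_loader:
    optimizer.zero_grad()
    # task loss + weighted cost of the currently sampled architecture
    loss = criterion(pit_model(inputs), targets) + strength * pit_model.cost
    loss.backward()
    optimizer.step()

final_model = pit_model.export()  # plain nn.Module with the pruning applied
```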

Let us know if you have any further questions.