Closed TeerathChandani closed 5 years ago
There were many experiments done. For example, you can verify these (`->` means distillation):
Dataset: CIFAR-100
Accuracy of Resnet110: 72.38
Accuracy of Resnet20: 67.03
Accuracy of Resnet110 -> Resnet20: 68.11 [seed=55, lambda_student=0.5, T_student=5]
Accuracy of Resnet8: 61.37
Accuracy of Resnet110 -> Resnet8: 61.41 [seed=31, lambda_student=0.2, T_student=2]
Accuracy of Resnet110 -> Resnet20 -> Resnet8: 61.82 [seed=31, lambda_student=0.15, T_student=5]
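The `lambda_student` and `T_student` values above plug into the standard Hinton-style knowledge-distillation loss, where `lambda_student` weights the distillation term against the ordinary cross-entropy and `T_student` is the softmax temperature. A minimal NumPy sketch of that loss (the exact weighting convention in the repo's code may differ slightly; function and argument names here are illustrative):

```python
import numpy as np

def softmax(z, T=1.0):
    # Temperature-scaled softmax over the last axis
    z = z / T
    z = z - z.max(axis=-1, keepdims=True)  # for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def kd_loss(student_logits, teacher_logits, label, lam=0.5, T=5.0):
    """Hinton-style KD loss for a single example:
    (1 - lam) * CE(student, label) + lam * T^2 * KL(teacher || student),
    with both distributions softened by temperature T."""
    p_s = softmax(student_logits, T)
    p_t = softmax(teacher_logits, T)
    kl = np.sum(p_t * (np.log(p_t) - np.log(p_s)))
    ce = -np.log(softmax(student_logits)[label])
    return (1.0 - lam) * ce + lam * (T ** 2) * kl
```

Higher `T` softens the teacher's distribution so the student also learns from the relative probabilities of wrong classes, which is why it is tuned per distillation step.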
Thanks for your quick response.
First: it means you made multiple combinations of those three values for each hyper-parameter, and then found which combination is optimal?
Waiting for your response! I am asking for the values only because the nnictl command does not work; that is why I need your help.
Thanks
@TeerathKumar142 The results I mentioned were not for this table; they were for the CIFAR-100 dataset (table 3, row 2 in the paper). For your first question ("It means, you made multiple combinations of those 3 values of each hyper-parameter, and then you found which are optimal?"), the answer is yes and no. Teacher Assistant Knowledge Distillation is a multi-step process: the first step is to distill knowledge from the teacher to the TA, and the second step is to distill knowledge from the TA to the student. So it is basically doing knowledge distillation twice, and each step is run with its own hyper-parameters.
For your second question, I have to check them later(max in 2 days) because I'm currently busy doing another project.
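The two-step process described above can be sketched as a small driver loop. Here `train_kd` is an assumed helper (not a real function in the repo) that distills a given teacher into a fresh model of the requested architecture using one set of hyper-parameters:

```python
# Hypothetical driver for the two-step TAKD pipeline: each step is an
# ordinary knowledge-distillation run with its own hyper-parameters.

def takd(train_kd, teacher, ta_arch, student_arch, step1_hp, step2_hp):
    # Step 1: distill teacher -> teacher assistant (TA)
    ta = train_kd(teacher=teacher, arch=ta_arch, **step1_hp)
    # Step 2: distill TA -> student, with separately tuned hyper-parameters
    student = train_kd(teacher=ta, arch=student_arch, **step2_hp)
    return student
```

Because each step is tuned independently, the `seed`, `lambda_student`, and `T_student` values reported per step are not expected to match across steps.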
Can you share your email? I would like to discuss this project with you personally.
Can you share the weights of the teachers (CIFAR-10 and CIFAR-100) only, if possible?
Or just share all the hyper-parameter values that you used in the experiment.
For table 4 of the paper, here are the hyper-parameters for the experiment and the final weights.
First row (Resnet26 teacher, Resnet20 TA, Resnet14 student)
step 1: Resnet26 -> Resnet20
lambda_student: 0.95
seed: 31
T_student: 5
step 2: Resnet20 -> Resnet14
lambda_student: 0.97
seed: 55
T_student: 10
Weight file of the final student (Resnet14, accuracy 91.23):
https://www.dropbox.com/s/41f4eamebqhh68b/resnet_26_20_14_2572043ab83f4878bb3c17560fa45f3d_best.pth.tar?dl=0
Second row (Resnet26 teacher, Resnet14 TA, Resnet8 student)
step 1: Resnet26 -> Resnet14
lambda_student: 0.95
seed: 31
T_student: 5
step 2: Resnet14 -> Resnet8
lambda_student: 0.49
seed: 55
T_student: 10
Weight file of the final student (Resnet8, accuracy 88.01):
https://www.dropbox.com/s/ulbdftjf43p21c5/resnet_26_14_8_4c3b4d0befca4633bb1265023d14e1ff_best.pth.tar?dl=0
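To use the shared `*_best.pth.tar` files, something like the following loader should work. The checkpoint layout (a dict with a `state_dict` key) is an assumption based on common PyTorch training scripts; inspect `checkpoint.keys()` on the real file first:

```python
import torch

def load_student_state(path):
    # map_location="cpu" lets the checkpoint load on a machine without a GPU
    checkpoint = torch.load(path, map_location="cpu")
    if isinstance(checkpoint, dict) and "state_dict" in checkpoint:
        return checkpoint["state_dict"]
    return checkpoint  # some scripts save the state_dict directly
```

The returned state dict can then be passed to `model.load_state_dict(...)` on a matching ResNet definition.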
Thank you so much!
Hello Imirzadeh,
Thanks for your help and quick response. Can you share the weights of table 3?
Thanks, Teerath Kumar
Secondly, I cannot extract the weight files. Please send me a .zip instead of a .rar; it is showing me an "operation break" error.
Thanks,
OK, solved. Thanks!
Can you tell me the optimized values of T_student, lambda_student, and seed? I am asking because I am facing an issue while running the nnictl command.
Thanks