kjunelee / MetaOptNet

Meta-Learning with Differentiable Convex Optimization (CVPR 2019 Oral)

The results on miniImageNet #22

Closed AceChuse closed 5 years ago

AceChuse commented 5 years ago

Thanks for your code! It is really good work.

I have tried almost all of the experiments in the paper, and most of them reach the accuracy reported there, except for the results on miniImageNet.

The results of MetaOptNet-RR and MetaOptNet-SVM, with and without label smoothing, are close to 60.57 ± 0.44 (1-shot) and 77.44 ± 0.33 (5-shot).

My setup:

- Embedding: ResNet-12 without global average pooling after the last residual block.
- Drop_rate: 0.1; DropBlock in the last two ResNet blocks with block size 5.
- Training shot: 15; training query: 6.
- Nesterov momentum: 0.9; weight decay: 0.0005.
- Mini-batch: 8; each epoch: 1000 episodes. (So that is 8000 tasks per epoch, right?)
- Initial learning rate: 0.1, changed to 0.006, 0.0012, and 0.00024 at epochs 20, 40, and 50, respectively.
- Label smoothing: 0.1.
- C of the SVM: 0.1; regularization of ridge regression: 50.0.
- QP solver iterations: 15 (training), 3 (testing).
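For concreteness, here is a minimal PyTorch sketch of how I set up the optimizer and learning-rate schedule listed above (a placeholder module stands in for the ResNet-12 embedding; this is an illustration, not the repo's exact code):

```python
import torch

# Placeholder for the ResNet-12 embedding network described above.
model = torch.nn.Linear(3 * 84 * 84, 640)

# SGD with Nesterov momentum 0.9, weight decay 0.0005, initial lr 0.1.
optimizer = torch.optim.SGD(model.parameters(), lr=0.1, momentum=0.9,
                            weight_decay=5e-4, nesterov=True)

# lr changes to 0.006, 0.0012, 0.00024 at epochs 20, 40, 50,
# expressed as multiplicative factors of the initial lr 0.1.
lambda_epoch = lambda e: 1.0 if e < 20 else 0.06 if e < 40 else 0.012 if e < 50 else 0.0024
scheduler = torch.optim.lr_scheduler.LambdaLR(optimizer, lr_lambda=lambda_epoch)
```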

ProtoNets with ResNet-12 can get a result close to the one in your paper if we don't use label smoothing. Is there something I missed in MetaOptNet-RR and MetaOptNet-SVM? And is label smoothing used in MetaOptNet-RR, MetaOptNet-SVM, and ProtoNets?

kjunelee commented 5 years ago

Thanks for your interest in our work!

Your hyperparameters look okay to me. I believe label smoothing was applied to all miniImageNet experiments with ResNet-12. As mentioned in #8, each meta-training run can produce slightly different results; I have experienced similar issues with many other few-shot learning algorithms. Package versions may also matter. At least in my environment, I never got below 61% accuracy on MetaOptNet-SVM when label smoothing was applied.
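For reference, label smoothing with eps = 0.1 mixes the one-hot target with a uniform distribution over the classes. A minimal PyTorch sketch of that generic formulation (the repo's implementation may differ in detail):

```python
import torch
import torch.nn.functional as F

def smoothed_cross_entropy(logits, targets, eps=0.1):
    # Generic label-smoothed cross-entropy: mix the one-hot target with a
    # uniform distribution over classes; eps = 0.1 as in the thread above.
    num_classes = logits.size(-1)
    log_probs = F.log_softmax(logits, dim=-1)
    one_hot = F.one_hot(targets, num_classes).float()
    smooth_targets = one_hot * (1.0 - eps) + eps / num_classes
    return -(smooth_targets * log_probs).sum(dim=-1).mean()

# Example: 5-way logits for 4 query examples.
logits = torch.randn(4, 5)
targets = torch.tensor([0, 2, 1, 4])
loss = smoothed_cross_entropy(logits, targets, eps=0.1)
```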

AceChuse commented 5 years ago

I see. Thank you for your response!
