haebeom-lee / l2b

Tensorflow implementation of "Learning to Balance: Bayesian Meta-learning for Imbalanced and Out-of-distribution Tasks" (ICLR 2020 oral)

The meaning of "--alpha_on --omega_on --gamma_on --z_on" in command line #1

Closed Hugo101 closed 4 years ago

Hugo101 commented 4 years ago

In main.py, these four flags default to False. So do we still need to add "--alpha_on --omega_on --gamma_on --z_on" to the command?

Also, I could not find where "--alpha_on" and "--z_on" are used in main.py.

When I follow the exact instructions for the experiment (CIFAR, SVHN), I get the following results: they are similar to those of MAML, rather than matching Bayesian TAML. Do you have any ideas?

[attached screenshot: experimental results]

haebeom-lee commented 4 years ago

Yes, you should turn on all four variables for both meta-training and meta-testing. Please see the instructions below (they are also in README.md):

Please let me know if you still get the same results after turning on all the variables correctly.

Also, the flags for those four variables are used in model.py (lines 108, 126, 138, and 144).
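
For reference, flags declared with argparse's action='store_true' default to False and become True only when passed explicitly, which is why all four must appear on the command line. A minimal sketch of this pattern (the flag names follow the repo, but the gating code below is schematic, not the repo's actual main.py/model.py):

$ cat sketch.py
import argparse

parser = argparse.ArgumentParser()
# store_true flags are False by default; they become True only when
# passed explicitly, e.g. `python sketch.py --alpha_on --z_on`.
parser.add_argument('--alpha_on', action='store_true')
parser.add_argument('--omega_on', action='store_true')
parser.add_argument('--gamma_on', action='store_true')
parser.add_argument('--z_on', action='store_true')
args = parser.parse_args()

# Schematic gating, analogous to the per-flag checks in model.py:
for name in ('alpha_on', 'omega_on', 'gamma_on', 'z_on'):
    print(name, '->', getattr(args, name))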

Meta-training

$ python main.py \
  --gpu_id 0 \
  --savedir "./results/cifar/taml" --id_dataset 'cifar' --ood_dataset 'svhn' \
  --mode 'meta_train' --metabatch 4 --n_steps 5 --way 5 --max_shot 50 --query 15 \
  --n_train_iters 50000 --meta_lr 1e-3 \
  --alpha_on --omega_on --gamma_on --z_on

Meta-testing

$ python main.py \
  --gpu_id 0 \
  --savedir "./results/cifar/taml" --id_dataset 'cifar' --ood_dataset 'svhn' \
  --mode 'meta_test' --metabatch 4 --n_steps 10 --way 5 --max_shot 50 --query 15 \
  --n_test_episodes 1000 \
  --alpha_on --omega_on --gamma_on --z_on --n_mc_samples 10
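
For context, --n_mc_samples sets the number of Monte Carlo samples drawn at meta-test time. A minimal sketch of MC-averaged prediction under that idea, with hypothetical names (model, sample_latents) that are not the repo's actual API:

import tensorflow as tf

def mc_predict(model, x, n_mc_samples=10):
    # Average class probabilities over independent draws of the
    # latent variables (hypothetical interface; not the repo's code).
    probs = [tf.nn.softmax(model(x, sample_latents=True), axis=-1)
             for _ in range(n_mc_samples)]
    return tf.reduce_mean(tf.stack(probs, axis=0), axis=0)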
Hugo101 commented 4 years ago

Thanks so much for your quick reply! That makes sense. Sorry for such a simple question!