Sha-Lab / FEAT

The code repository for "Few-Shot Learning via Embedding Adaptation with Set-to-Set Functions"

pretraining hyper-parameters for tiered imagenet #47

Closed · Tsingularity closed this issue 3 years ago

Tsingularity commented 3 years ago

I went through all the issue posts but couldn't find the pretraining hyper-parameters for tieredImageNet; I could only find the ones for miniImageNet. Could you also release them for tieredImageNet?

Thanks!

Han-Jia commented 3 years ago

Hi,

I use the following command to pretrain on TieredImageNet:

```
python3.7 pretrain.py --lr 0.1 --batch_size 128 --max_epoch 600 --backbone_class Res12 --schedule 400 500 550 580 --ngpu 1 --gamma 0.1 --dataset TieredImagenet --query 10
```
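
For reference, here is a minimal sketch of how these flags typically map onto a standard PyTorch SGD + MultiStepLR setup; it is not the repo's `pretrain.py`, and the momentum/weight-decay values and the placeholder backbone below are assumptions, not taken from the FEAT code.

```python
# Sketch only: how --lr, --schedule, --gamma, and --max_epoch usually translate
# into an SGD optimizer with a MultiStepLR schedule. Values marked "assumed" are
# not from the repository.
import torch
from torch import nn, optim

backbone = nn.Linear(640, 351)            # placeholder for Res12; 351 = TieredImageNet train classes
optimizer = optim.SGD(backbone.parameters(),
                      lr=0.1,             # --lr 0.1
                      momentum=0.9,       # assumed value
                      weight_decay=5e-4)  # assumed value
# --schedule 400 500 550 580 with --gamma 0.1: multiply the LR by 0.1 at each milestone epoch.
scheduler = optim.lr_scheduler.MultiStepLR(optimizer, milestones=[400, 500, 550, 580], gamma=0.1)

for epoch in range(600):                  # --max_epoch 600
    # ... standard cross-entropy pretraining over mini-batches of 128 images (--batch_size 128) ...
    optimizer.step()                      # placeholder for the per-epoch training loop
    scheduler.step()
```

Under this schedule the learning rate stays at 0.1 for the first 400 epochs and, after the four decays, ends at 0.1 × 0.1⁴ = 1e-5 for the final 20 epochs.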