yaoyao-liu / meta-transfer-learning

TensorFlow and PyTorch implementation of "Meta-Transfer Learning for Few-Shot Learning" (CVPR2019)
https://lyy.mpi-inf.mpg.de/mtl/
MIT License

question about dataset #53

Open changykang opened 3 years ago

changykang commented 3 years ago

Hi, I downloaded the dataset from the link, but it does not include label files. Can you help me solve this problem? Thank you.

yaoyao-liu commented 3 years ago

Thanks for your interest in our work.

You don't need the label files. The images for one class are saved in a corresponding folder. You may use the provided dataloader to generate episodes for few-shot learning.
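
For illustration, here is a minimal sketch of how episodes can be sampled from a layout with one folder per class. The folder names, the 5-way 1-shot / 15-query setting, and the function itself are only an example, not the repository's actual dataloader:

```python
import os
import random

def sample_episode(data_root, n_way=5, k_shot=1, q_query=15):
    """Sample one few-shot episode from a directory with one sub-folder per class."""
    classes = random.sample(os.listdir(data_root), n_way)
    support, query = [], []
    for label, cls in enumerate(classes):
        cls_dir = os.path.join(data_root, cls)
        images = [os.path.join(cls_dir, f) for f in os.listdir(cls_dir)]
        picked = random.sample(images, k_shot + q_query)
        support += [(path, label) for path in picked[:k_shot]]   # k_shot support images per class
        query += [(path, label) for path in picked[k_shot:]]     # q_query query images per class
    return support, query
```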

changykang commented 3 years ago

Thank you, that solves my problem. Can I directly run run_experiment.py in TensorFlow?

yaoyao-liu commented 3 years ago

Yes. After you install all the requirements and put the data in the proper directory, you may run the commands following the README.md file.

changykang commented 3 years ago

OK, thank you for your reply. When I install tensorflow-gpu==1.3.0 under Python 2.7, it shows PackageNotFoundError. Can you help me?

yaoyao-liu commented 3 years ago

If the packages cannot be directly downloaded, you may build TensorFlow 1.3.0 from source according to this link.

changykang commented 3 years ago

Thank you for patiently solving my problem.

changykang commented 3 years ago

I have another question about the dataset: which data are used for meta-train and meta-test, and how did you split the support set and query set in the miniImageNet dataset? Thank you.

yaoyao-liu commented 3 years ago

The splits of miniImageNet are available here. There are two different splits for miniImageNet. Most papers follow the split provided by Vinyals et al.

During the meta-training phase, the support and query sets of each episode/task are randomly sampled from the training set of miniImageNet. During the meta-test phase, the support and query sets of each episode/task are randomly sampled from the test set of miniImageNet. You may easily find the generation protocol in any few-shot learning paper, e.g., MAML.
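
As a rough illustration of that protocol, reusing the hypothetical `sample_episode` sketch above (the directory names are placeholders, not the repository's actual paths):

```python
# Meta-training episodes are drawn only from the meta-train classes,
# meta-test episodes only from the held-out meta-test classes.
train_support, train_query = sample_episode("miniimagenet/train", n_way=5, k_shot=1, q_query=15)
test_support, test_query = sample_episode("miniimagenet/test", n_way=5, k_shot=1, q_query=15)
```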

changykang commented 3 years ago

That is, in the large-scale (pre-train) phase, 64 classes are used for training; in the meta-training phase, episodes are randomly sampled; and the meta-test stage uses the test data of miniImageNet. Am I right?

yaoyao-liu commented 3 years ago

During the pre-train phase, we train the encoder on a 64-way classification task using the meta-train set.

During the meta-train phase, we train the meta model on many 5-way small classification tasks generated from the meta-train set.

During the meta-test phase, we evaluate the model on many 5-way small classification tasks generated from the meta-test set.
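
Putting the three phases together, a schematic outline (all helper names, episode counts, and the `meta_model` interface below are hypothetical placeholders for illustration, not the repository's API):

```python
# Schematic outline of the three phases (hypothetical helpers, illustration only).

def pretrain(encoder, meta_train_set):
    # Phase 1: train the encoder on a standard 64-way classification task
    # over all meta-train classes.
    train_classifier(encoder, meta_train_set, num_classes=64)

def meta_train(meta_model, meta_train_set, num_tasks=10000):
    # Phase 2: optimize the meta model on many 5-way episodes
    # sampled from the meta-train classes.
    for _ in range(num_tasks):
        support, query = sample_episode(meta_train_set, n_way=5)
        meta_model.adapt_and_update(support, query)

def meta_test(meta_model, meta_test_set, num_tasks=600):
    # Phase 3: evaluate on 5-way episodes sampled from the unseen meta-test classes.
    accuracies = []
    for _ in range(num_tasks):
        support, query = sample_episode(meta_test_set, n_way=5)
        accuracies.append(meta_model.adapt_and_evaluate(support, query))
    return sum(accuracies) / len(accuracies)
```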