mboudiaf / pytorch-meta-dataset

A non-official 100% PyTorch implementation of the META-DATASET benchmark for few-shot classification

Meta-batch size hard coded to 1 #8

Closed sudarshan1994 closed 3 years ago

sudarshan1994 commented 3 years ago

Hi, thank you for your implementation.

The meta-training dataloader seems to have a batch size of 1 hard-coded in it. I would like to train MAML on this, and the default meta-batch size there is 4. So I would like to know whether there is any particular reason the meta-batch size is hard-coded to 1.

Thank you!

mboudiaf commented 3 years ago

Hi,

Happy it can help :) There is no particular reason for the batch size of 1. The script is provided only as an example of how to use the loader; the specific hyperparameters are for you to set.
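For anyone landing here with the same question: since the loader yields one episode per iteration, one straightforward option is to accumulate episodes into meta-batches yourself before each MAML meta-update. Below is a minimal, hedged sketch (not from the repo): `meta_batches` and the dict-based episodes are hypothetical stand-ins; with the real loader, each item would be the (support, query) tensors it produces.

```python
from itertools import islice

def meta_batches(episode_iter, meta_batch_size):
    """Group single episodes from an episodic loader into meta-batches.

    `episode_iter` is any iterable that yields one task (episode) per
    iteration, as the repo's loader does. For MAML-style training with
    meta_batch_size > 1, several tasks are collected before each
    meta-update. The final batch may be smaller if the iterator runs out.
    """
    it = iter(episode_iter)
    while True:
        batch = list(islice(it, meta_batch_size))
        if not batch:
            return
        yield batch

# Usage: episodes are plain dicts here for illustration; in practice they
# would be the tensors yielded by the meta-training dataloader.
episodes = [{"task": i} for i in range(10)]
batches = list(meta_batches(episodes, 4))
# 10 episodes with meta_batch_size=4 -> batches of sizes 4, 4, 2
```

The same effect could be had by wrapping the loader in a generator at the top of the training loop, leaving the repo's code untouched.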

sudarshan1994 commented 3 years ago

Gotcha thanks!