AntreasAntoniou / HowToTrainYourMAMLPytorch

The original code for the paper "How to train your MAML" along with a replication of the original "Model Agnostic Meta Learning" (MAML) paper in Pytorch.
https://arxiv.org/abs/1810.09502

regarding google colab #15

Closed vainaixr closed 5 years ago

vainaixr commented 5 years ago

Hello,

I made a Google Colab notebook and did the following:

1) ngrok visualization of TensorBoard: using SummaryWriter, I added two scalars, the best validation accuracy and the epoch summary loss (a rough sketch of the logging is below this list).

2) Files in Google Colab do not persist, so if we want to keep them for a long time we have to save them to Google Drive and retrieve them from there. I added cells showing how to move files from Colab to Google Drive and from Google Drive back to Colab (see the Drive sketch below the list).

3) I have only run the experiment for one Omniglot json file, by changing the default in the argument parser (a sketch of that change is below the list), so other datasets will need more modification. We also need to add a cross-domain scenario, i.e. meta-training on Omniglot and meta-testing on another dataset such as MNIST; for example, https://github.com/google-research/meta-dataset does cross-domain meta-learning.

4) I added a comment for the TPU imports (sketched below), but I am not using a TPU currently; we need to figure out how to use a TPU to accelerate training.
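
For point 1, a minimal sketch of the two scalars being logged; the variable names and values are placeholders, not the exact ones in the notebook, and TensorBoard is then exposed through the ngrok tunnel:

```python
from torch.utils.tensorboard import SummaryWriter

# Placeholder logging loop: best_val_acc and epoch_loss would come from the
# actual MAML training run; here they are dummy values.
writer = SummaryWriter(log_dir="logs/maml_colab")

for epoch in range(3):
    best_val_acc = 0.90 + 0.01 * epoch
    epoch_loss = 1.0 / (epoch + 1)
    writer.add_scalar("best_val_accuracy", best_val_acc, epoch)
    writer.add_scalar("epoch_summary_loss", epoch_loss, epoch)

writer.close()
```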
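
For point 2, the Colab-to-Drive copying is roughly this (the paths and file names are illustrative, not the exact ones in the notebook):

```python
import os
import shutil
from google.colab import drive  # only available inside a Colab runtime

drive.mount('/content/drive')

backup_dir = '/content/drive/My Drive/maml_backups'
os.makedirs(backup_dir, exist_ok=True)

# Colab -> Drive: persist a checkpoint before the runtime is recycled.
shutil.copy('/content/HowToTrainYourMAMLPytorch/saved_model.pth',
            os.path.join(backup_dir, 'saved_model.pth'))

# Drive -> Colab: restore it in a fresh session.
shutil.copy(os.path.join(backup_dir, 'saved_model.pth'),
            '/content/HowToTrainYourMAMLPytorch/saved_model.pth')
```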
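
For point 3, the default change is of this form; the argument name and json path below are placeholders, and the repository's own parser may name them differently:

```python
import argparse

parser = argparse.ArgumentParser(description="experiment launcher (sketch)")
parser.add_argument(
    '--experiment_config', type=str,
    # Pointing this default at the Omniglot json is the kind of change I made;
    # other datasets would need their own config files and further edits.
    default='experiment_config/omniglot_config.json')

args = parser.parse_args([])  # pass [] so this also runs inside a notebook cell
print(args.experiment_config)
```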
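
For point 4, the commented-out TPU imports look like this (standard torch_xla entry points, currently unused):

```python
# TPU support is left as comments for now; uncommenting would require a TPU
# runtime with torch_xla installed.
# import torch_xla.core.xla_model as xm
#
# device = xm.xla_device()  # would replace the CUDA device in the training code
```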

I added imports from one of my saved snippets, and I replaced `transforms.` with ''.

https://colab.research.google.com/drive/1szW-I-EkjHHI6aDoNZYaAMtrdjsjUb2w#scrollTo=CXSGKa4l-RKj&uniqifier=2

Please have a look at it.

I also have a question about how to use meta-learning for few-shot segmentation: since MAML is model-agnostic it could be combined with Mask R-CNN, but we need to figure out how.