AntreasAntoniou / HowToTrainYourMAMLPytorch

The original code for the paper "How to train your MAML", along with a replication of the original "Model-Agnostic Meta-Learning" (MAML) paper, in PyTorch.
https://arxiv.org/abs/1810.09502

regarding google colab #13

Closed vainaixr closed 5 years ago

vainaixr commented 5 years ago

Hello, is it possible to convert the code into Jupyter notebook form, so that it can be run in Google Colab?

That would make it possible to run the code without a local GPU, using the Google Colab GPU instead.

Thanks

AntreasAntoniou commented 5 years ago

Converting the running script into a Jupyter notebook should be trivial. Unfortunately, I don't have any spare time at the moment to work on this. However, feel free to develop such a notebook and make a pull request to add it to this repo.
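Roughly, a single Colab cell along these lines should be enough (an untested sketch; the entry-point script, config file, and requirements file names below are assumptions, so check the repo for the actual ones):

```python
# Rough Colab cell sketch (untested). The script name and config path are
# assumptions -- substitute the actual entry point and config used in this repo.

# Clone the repository and install its dependencies.
!git clone https://github.com/AntreasAntoniou/HowToTrainYourMAMLPytorch.git
%cd HowToTrainYourMAMLPytorch
!pip install -r requirements.txt

# Confirm the Colab GPU is visible to PyTorch.
import torch
print(torch.cuda.is_available())

# Launch training (hypothetical script/config names shown).
!python train_maml_system.py --name_of_args_json_file experiment_config/some_experiment.json
```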

vainaixr commented 5 years ago

Hello,

Is it possible to add visualization that plots the support-set and query-set images to TensorBoard? It would make it easier to understand which images are being used in the support set and in the query set.

This visualization would also be useful when combining MAML with GANs, to visualize the inputs and outputs.
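Something like the following sketch is what I have in mind (it assumes the support/query images arrive as [N, C, H, W] float tensors; the actual batch layout in this repo may differ and might need flattening first):

```python
import torch
import torchvision
from torch.utils.tensorboard import SummaryWriter

def log_task_images(writer, support_images, query_images, step):
    # Build one image grid per set and write both to TensorBoard.
    support_grid = torchvision.utils.make_grid(support_images, nrow=5, normalize=True)
    query_grid = torchvision.utils.make_grid(query_images, nrow=5, normalize=True)
    writer.add_image("support_set", support_grid, global_step=step)
    writer.add_image("query_set", query_grid, global_step=step)

# Example with random stand-in data (hypothetical shapes for 5-way Omniglot):
writer = SummaryWriter(log_dir="runs/maml_vis")
support = torch.rand(5, 1, 28, 28)    # 5-way, 1 shot per class
query = torch.rand(15, 1, 28, 28)     # 3 query images per class
log_task_images(writer, support, query, step=0)
writer.close()
```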

Google Colab also has TPU support, but the only example I could find was training MNIST in the XLA repository, and it would be different for MAML++. Can those TPUs be used to accelerate training?
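For reference, the device plumbing from the torch_xla examples looks roughly like this (only a generic sketch with a stand-in model; whether MAML++'s inner loop and second-order gradients run efficiently under XLA is exactly what I'm unsure about):

```python
# Generic torch_xla pattern for a Colab TPU (sketch only, stand-in model).
import torch
import torch_xla.core.xla_model as xm

device = xm.xla_device()                        # the TPU core exposed by torch_xla

model = torch.nn.Linear(784, 5).to(device)      # stand-in for the meta-learner
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

def train_step(batch_x, batch_y):
    batch_x, batch_y = batch_x.to(device), batch_y.to(device)
    optimizer.zero_grad()
    loss = torch.nn.functional.cross_entropy(model(batch_x), batch_y)
    loss.backward()
    xm.optimizer_step(optimizer)                # replaces optimizer.step() and syncs the XLA graph
    return loss
```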

Also, I found only JSON files in the datasets folder, but the README says "We provide the omniglot dataset in the datasets folder directly in this repo." and the argument parser's default directory is "datasets/omniglot". How do I get the Omniglot dataset?

Thanks

AntreasAntoniou commented 5 years ago

As I previously said, you are welcome to implement those features and issue a PR.