AntreasAntoniou / HowToTrainYourMAMLPytorch

The original code for the paper "How to train your MAML" along with a replication of the original "Model Agnostic Meta Learning" (MAML) paper in Pytorch.
https://arxiv.org/abs/1810.09502

Improved syntax and usability #21

Closed jaywonchung closed 2 years ago

jaywonchung commented 5 years ago

Hi,

I forked this repository and read through it, and found quite a few un-Pythonic or inefficient pieces of code, uninformative object names, and unused arguments and variables. Examples include patterns like

    if condition is True:
    for k, v in zip(list(d.keys()), list(d.values())):
    var = True if a==b else False
    num_steps

which can safely (and preferably) be replaced with

    if condition:
    for k, v in d.items():
    var = (a==b)
    num_inner_steps

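To make these concrete, here is a small self-contained before/after sketch (a hypothetical helper, not code taken from this repository) that applies the same clean-ups together:

    # Before: a hypothetical helper written in the style described above.
    def filter_flags(flag_dict, target):
        result = {}
        for k, v in zip(list(flag_dict.keys()), list(flag_dict.values())):
            match = True if v == target else False
            if match is True:
                result[k] = v
        return result

    # After: iterate the dict items directly and let the comparison produce the bool.
    def filter_flags(flag_dict, target):
        """Return the entries of flag_dict whose value equals target."""
        return {k: v for k, v in flag_dict.items() if v == target}

The behaviour is identical; the second version just avoids building intermediate key/value lists and reads closer to how the intent would be stated in plain English.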

I have fixed this repository for my own research, but I figured others may benefit from this too.

Thus I have done the following:

  1. Fix less-pythonic syntax
  2. Change to more informative variable names
  3. Remove unused variables and arguments
  4. Add comments and docstrings
  5. Remove continue_from_epoch, evalute_on_test_set_only, and gpu_to_use from the JSON config. These are better as command-line arguments (see the argparse sketch after this list).
  6. Test proper functionality of
    • experiment config/script generation
    • execution of experiment scripts, including training, saving, continuing, validation, and testing on top_n models
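
As a rough illustration of item 5, the three removed config keys could be exposed as command-line flags along these lines (a minimal argparse sketch using the names above; the defaults and help text are my assumptions, not the exact code in the branch):

    import argparse

    # Minimal sketch: command-line flags replacing the three keys removed from the JSON config.
    # Defaults and help strings are assumptions, not the exact code in the branch.
    parser = argparse.ArgumentParser(description="Experiment runner")
    parser.add_argument("--continue_from_epoch", type=int, default=-1,
                        help="epoch to resume training from; -1 starts a fresh run")
    parser.add_argument("--evalute_on_test_set_only", action="store_true",
                        help="skip training and only evaluate the saved models on the test set")
    parser.add_argument("--gpu_to_use", type=int, default=0,
                        help="index of the GPU to run on")
    args = parser.parse_args()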

Functionality changes are as follows:

  1. Experiment results are now saved in the Experiments folder. This folder is not tracked by git (ignore rules are sketched after this list).
  2. Python caches and mini-imagenet data are also not tracked by git.
  3. Training progress messages (print, tqdm, ...) are improved.

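For reference, the ignore rules behind items 1 and 2 look roughly like this (a sketch; the exact cache and dataset paths used in the branch are my assumptions):

    # .gitignore (sketch)
    Experiments/
    __pycache__/
    *.pyc
    mini_imagenet/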

Happy research!

P.S. Big thanks to @AntreasAntoniou for sharing such a great repository!