sgugger / Adam-experiments

Experiments with Adam/AdamW/amsgrad

Amsgrad not implemented in fastai master #1

Closed adambielski closed 6 years ago

adambielski commented 6 years ago

Just cloned and installed fastai library. Where is the version with amsgrad?


  File "fit_stanford_cars.py", line 71, in <module>
    if __name__ == '__main__': fire.Fire(train_lm)
  File "/miniconda/envs/fastai/lib/python3.6/site-packages/fire/core.py", line 127, in Fire
    component_trace = _Fire(component, args, context, name)
  File "/miniconda/envs/fastai/lib/python3.6/site-packages/fire/core.py", line 366, in _Fire
    component, remaining_args)
  File "/miniconda/envs/fastai/lib/python3.6/site-packages/fire/core.py", line 542, in _CallCallable
    result = fn(*varargs, **kwargs)
  File "fit_stanford_cars.py", line 69, in train_lm
    main_train(lr, moms, wd, wd_loss, opt_fn, bs, cyc_len, beta2, amsgrad, div, pct, lin_end, tta, div_lr, fname)
  File "fit_stanford_cars.py", line 37, in main_train
    learn.fit_opt_sched(phases, callbacks=[LogResults(learn, fname)])
  File "/home/abielski/repo/fastai/fastai/learner.py", line 427, in fit_opt_sched
    layer_opt = LayerOptimizer(phases[0].opt_fn, self.get_layer_groups(), 1e-2, phases[0].wds)
  File "/home/abielski/repo/fastai/fastai/layer_optimizer.py", line 15, in __init__
    self.opt = opt_fn(self.opt_params())
TypeError: __init__() got an unexpected keyword argument 'amsgrad'
```
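
For context, the optimizer function in the script is presumably built along these lines (a hypothetical sketch, not the actual `fit_stanford_cars.py` code); on PyTorch versions before 0.4.0 the final call fails exactly as above, because `Adam.__init__` has no `amsgrad` parameter:

```python
# Hypothetical sketch of how opt_fn might be constructed in the training
# script; not the actual fit_stanford_cars.py code.
from functools import partial
import torch.optim as optim

beta2, amsgrad = 0.99, True
opt_fn = partial(optim.Adam, betas=(0.9, beta2), amsgrad=amsgrad)

# fastai's LayerOptimizer later calls opt_fn(param_groups); on PyTorch < 0.4.0
# Adam.__init__ does not accept `amsgrad`, hence the TypeError above.
```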
sgugger commented 6 years ago

Ah, you may need to update PyTorch to 0.4.0 to get amsgrad; I'll add a note in the README.md.
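
For reference, a minimal sketch to check that the installed PyTorch exposes the `amsgrad` flag on `torch.optim.Adam` (added in PyTorch 0.4.0); the model and data below are placeholders, not code from this repo:

```python
# Minimal sketch: verify that torch.optim.Adam accepts the amsgrad flag
# (available from PyTorch 0.4.0 onward). The model and data are placeholders.
import torch
import torch.nn as nn

print(torch.__version__)  # should be >= 0.4.0 for amsgrad support

model = nn.Linear(10, 2)  # dummy module, just to have parameters
opt = torch.optim.Adam(model.parameters(), lr=1e-3, betas=(0.9, 0.99), amsgrad=True)

x, y = torch.randn(4, 10), torch.randn(4, 2)
loss = nn.functional.mse_loss(model(x), y)
loss.backward()
opt.step()  # raises no TypeError when amsgrad is supported
```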