mgrankin / over9000

Over9000 optimizer
Apache License 2.0

Port to fastai v2? #19

Closed oguiza closed 4 years ago

oguiza commented 4 years ago

Hi Mikhail, I wanted to thank you for developing this repo. I've been using RangerLars for some time now and have achieved pretty good results on my datasets. I wanted to ask if you have any plans to port it to fastai v2. Thanks again

mgrankin commented 4 years ago

Hi, it doesn't have a fastai dependency, so it should work. Have you tried it with fastai v2?
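For context, the direct approach looks something like this (a sketch only; `dls` and `model` are placeholder names, and the import path depends on where the repo files sit):

# Hypothetical direct use of RangerLars as a fastai v2 opt_func.
from fastai2.learner import Learner
from rangerlars import RangerLars  # import path assumed

learn = Learner(dls, model, opt_func=RangerLars)
learn.fit_flat_cos(5, lr=1e-3)  # raises AttributeError on fastai v2, see the traceback below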

oguiza commented 4 years ago

Thanks for your quick reply. Yes, I've tried it, but unfortunately I get an error:

---------------------------------------------------------------------------
AttributeError                            Traceback (most recent call last)
<ipython-input-60-8f14bb6c31df> in <module>()
    149         else:
    150             print(f'training without warmup lr={max_lr}, lr decay start={pct_start}, wd={wd}\n')
--> 151             learn.fit_flat_cos(epochs, lr=max_lr, pct_start=pct_start, div_final=1e5, wd=wd)
    152 
    153         if i == 0:

1 frames
/usr/local/lib/python3.6/dist-packages/fastcore/utils.py in _f(*args, **kwargs)
    428         init_args.update(log)
    429         setattr(inst, 'init_args', init_args)
--> 430         return inst if to_return else f(*args, **kwargs)
    431     return _f
    432 

/usr/local/lib/python3.6/dist-packages/fastai2/callback/schedule.py in fit_flat_cos(self, n_epoch, lr, div_final, pct_start, wd, cbs, reset_opt)
    132     "Fit `self.model` for `n_epoch` at flat `lr` before a cosine annealing."
    133     if self.opt is None: self.create_opt()
--> 134     self.opt.set_hyper('lr', self.lr if lr is None else lr)
    135     lr = np.array([h['lr'] for h in self.opt.hypers])
    136     scheds = {'lr': combined_cos(pct_start, lr, lr, lr/div_final)}

AttributeError: 'Lookahead' object has no attribute 'set_hyper'

I've seen they've created a new Optimizer class and implemented Lookahead, Lamb, RAdam, and ranger on top of it, but not RangerLars.
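For reference, fastai v2's built-in ranger (RAdam + Lookahead rebuilt on its Optimizer class) can be passed straight to a Learner; a minimal sketch, assuming `dls` and `model` exist:

from fastai2.optimizer import ranger

# Works out of the box, but it is plain ranger, not RangerLars
# (which adds the LARS-style layer-wise trust ratio via Ralamb).
learn = Learner(dls, model, opt_func=ranger)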

mgrankin commented 4 years ago

I'm not sure it will ever be included, but you can use their OptimWrapper to make it compatible:

from functools import partial
from fastai2.optimizer import OptimWrapper
from rangerlars import RangerLars  # import path assumed

def opt(*args, **kwargs):
    return OptimWrapper(RangerLars(*args, **kwargs))  # wrap so fastai v2's Optimizer API works
opt_func = partial(opt, eps=1e-5)
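Usage would then look something like this (`dls` and `model` are placeholders):

learn = Learner(dls, model, opt_func=opt_func)
learn.fit_flat_cos(5, lr=1e-3)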

oguiza commented 4 years ago

Thanks again, mgrankin. Unfortunately, your solution doesn't work. I still get an error:

---------------------------------------------------------------------------
AttributeError                            Traceback (most recent call last)
/usr/local/lib/python3.6/dist-packages/fastai2/learner.py in fit(self, n_epoch, lr, wd, cbs, reset_opt)
    199                         self.epoch=epoch;          self('begin_epoch')
--> 200                         self._do_epoch_train()
    201                         self._do_epoch_validate()

24 frames
AttributeError: 'RALAMB' object has no attribute 'param_lists'

During handling of the above exception, another exception occurred:

IndexError                                Traceback (most recent call last)
/usr/local/lib/python3.6/dist-packages/fastcore/foundation.py in _get(self, i)
    329 
    330     def _get(self, i):
--> 331         if is_indexer(i) or isinstance(i,slice): return getattr(self.items,'iloc',self.items)[i]
    332         i = mask2idxs(i)
    333         return (self.items.iloc[list(i)] if hasattr(self.items,'iloc')

IndexError: list index out of range

mgrankin commented 4 years ago

Hm, I've tested the code with fastai2. Can you show me your code?

oguiza commented 4 years ago

Sorry, I made a mistake. My bad. The solution you described works great!! Thank you very much. I'll close the issue. PS: as a suggestion, you may want to add this solution to the front page, as others may also want to use your optimizers with fastai v2.