timeseriesAI / tsai

State-of-the-art Deep Learning library for Time Series and Sequences in Pytorch / fastai
https://timeseriesai.github.io/tsai/
Apache License 2.0
5.16k stars · 644 forks

TSBERT not working with lr_find #48

Closed. friedenbergd closed this issue 3 years ago.

friedenbergd commented 3 years ago

When running the TSBERT notebook I tried adding an lr_find call. lr_find ran and the output looked reasonable, but the subsequent fit_one_cycle raised the error below.

learn = ts_learner(udls100, InceptionTimePlus, cbs=[ShowGraph(), TSBERT(target_dir='./data/TSBERT', fname=f'{dsid}')])
lr_min, lr_steep = learn.lr_find()
learn.fit_one_cycle(200, 1e-2)
ValueError                                Traceback (most recent call last)
<ipython-input-8-91d50d80d902> in <module>()
      2 learn = ts_learner(udls100, InceptionTimePlus, cbs=[ShowGraph(), TSBERT(target_dir='./data/TSBERT', fname=f'{dsid}')])
      3 lr_min, lr_steep = learn.lr_find()
----> 4 learn.fit_one_cycle(200, 1e-2)

11 frames
/usr/local/lib/python3.6/dist-packages/fastai/callback/schedule.py in fit_one_cycle(self, n_epoch, lr_max, div, div_final, pct_start, wd, moms, cbs, reset_opt)
    110     scheds = {'lr': combined_cos(pct_start, lr_max/div, lr_max, lr_max/div_final),
    111               'mom': combined_cos(pct_start, *(self.moms if moms is None else moms))}
--> 112     self.fit(n_epoch, cbs=ParamScheduler(scheds)+L(cbs), reset_opt=reset_opt, wd=wd)
    113 
    114 # Cell

/usr/local/lib/python3.6/dist-packages/fastai/learner.py in fit(self, n_epoch, lr, wd, cbs, reset_opt)
    209             self.opt.set_hypers(lr=self.lr if lr is None else lr)
    210             self.n_epoch = n_epoch
--> 211             self._with_events(self._do_fit, 'fit', CancelFitException, self._end_cleanup)
    212 
    213     def _end_cleanup(self): self.dl,self.xb,self.yb,self.pred,self.loss = None,(None,),(None,),None,None

/usr/local/lib/python3.6/dist-packages/fastai/learner.py in _with_events(self, f, event_type, ex, final)
    158 
    159     def _with_events(self, f, event_type, ex, final=noop):
--> 160         try: self(f'before_{event_type}');  f()
    161         except ex: self(f'after_cancel_{event_type}')
    162         self(f'after_{event_type}');  final()

/usr/local/lib/python3.6/dist-packages/fastai/learner.py in _do_fit(self)
    200         for epoch in range(self.n_epoch):
    201             self.epoch=epoch
--> 202             self._with_events(self._do_epoch, 'epoch', CancelEpochException)
    203 
    204     def fit(self, n_epoch, lr=None, wd=None, cbs=None, reset_opt=False):

/usr/local/lib/python3.6/dist-packages/fastai/learner.py in _with_events(self, f, event_type, ex, final)
    160         try: self(f'before_{event_type}');  f()
    161         except ex: self(f'after_cancel_{event_type}')
--> 162         self(f'after_{event_type}');  final()
    163 
    164     def all_batches(self):

/usr/local/lib/python3.6/dist-packages/fastai/learner.py in __call__(self, event_name)
    139 
    140     def ordered_cbs(self, event): return [cb for cb in self.cbs.sorted('order') if hasattr(cb, event)]
--> 141     def __call__(self, event_name): L(event_name).map(self._call_one)
    142 
    143     def _call_one(self, event_name):

/usr/local/lib/python3.6/dist-packages/fastcore/foundation.py in map(self, f, gen, *args, **kwargs)
    152     def range(cls, a, b=None, step=None): return cls(range_of(a, b=b, step=step))
    153 
--> 154     def map(self, f, *args, gen=False, **kwargs): return self._new(map_ex(self, f, *args, gen=gen, **kwargs))
    155     def argwhere(self, f, negate=False, **kwargs): return self._new(argwhere(self, f, negate, **kwargs))
    156     def filter(self, f=noop, negate=False, gen=False, **kwargs):

/usr/local/lib/python3.6/dist-packages/fastcore/basics.py in map_ex(iterable, f, gen, *args, **kwargs)
    664     res = map(g, iterable)
    665     if gen: return res
--> 666     return list(res)
    667 
    668 # Cell

/usr/local/lib/python3.6/dist-packages/fastcore/basics.py in __call__(self, *args, **kwargs)
    649             if isinstance(v,_Arg): kwargs[k] = args.pop(v.i)
    650         fargs = [args[x.i] if isinstance(x, _Arg) else x for x in self.pargs] + args[self.maxi+1:]
--> 651         return self.func(*fargs, **kwargs)
    652 
    653 # Cell

/usr/local/lib/python3.6/dist-packages/fastai/learner.py in _call_one(self, event_name)
    143     def _call_one(self, event_name):
    144         if not hasattr(event, event_name): raise Exception(f'missing {event_name}')
--> 145         for cb in self.cbs.sorted('order'): cb(event_name)
    146 
    147     def _bn_bias_state(self, with_bias): return norm_bias_params(self.model, with_bias).map(self.opt.state)

/usr/local/lib/python3.6/dist-packages/fastai/callback/core.py in __call__(self, event_name)
     42                (self.run_valid and not getattr(self, 'training', False)))
     43         res = None
---> 44         if self.run and _run: res = getattr(self, event_name, noop)()
     45         if event_name=='after_fit': self.run=True #Reset self.run to True at each end of fit
     46         return res

/usr/local/lib/python3.6/dist-packages/tsai/callback/core.py in after_epoch(self)
     69         val_losses = [v[1] for v in rec.values]
     70         x_bounds = (0, (self.n_epoch - len(self.nb_batches)) * self.nb_batches[0] + len(rec.losses))
---> 71         y_min = min((min(rec.losses), min(val_losses)))
     72         y_max = max((max(rec.losses), max(val_losses)))
     73         margin = (y_max - y_min) * .05

ValueError: min() arg is an empty sequence
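The failing line in the traceback calls min() on rec.losses and val_losses, and Python's min() raises ValueError when given an empty sequence, which is what happens here because lr_find leaves the recorder without validation losses. A minimal, self-contained sketch of the kind of defensive guard that avoids this (safe_bounds is a hypothetical helper, not a tsai function):

```python
def safe_bounds(train_losses, val_losses):
    """Compute y-axis plot bounds, tolerating empty loss lists.

    Illustrates the guard needed around the failing code in
    ShowGraph.after_epoch: min()/max() on an empty sequence raises
    ValueError, so we bail out early when nothing has been recorded.
    """
    values = [v for vs in (train_losses, val_losses) for v in vs]
    if not values:
        return None  # nothing recorded yet; skip plotting
    y_min, y_max = min(values), max(values)
    margin = (y_max - y_min) * 0.05  # 5% padding, as in the traceback
    return y_min - margin, y_max + margin

safe_bounds([1.0, 0.5], [])  # works even with no validation losses
safe_bounds([], [])          # returns None instead of raising
```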
oguiza commented 3 years ago

Hi @friedenbergd, thanks for using tsai and for raising this issue. I've now fixed it on GitHub, so if you are working with the bleeding-edge version of tsai, installed with

!pip install -Uqq git+https://github.com/timeseriesAI/tsai.git@master

you can re-run the code and it should work. If you need to install it from pip, you'll need to wait until later today or tomorrow, when I plan to make a new release.

friedenbergd commented 3 years ago

Awesome, thanks! I'm really enjoying the library and learning a lot of new ideas!

dnth commented 3 years ago

lr_find doesn't seem to work with MVP in the recent tsai version. However, this time the error is

AttributeError: 'InceptionTimePlus' object has no attribute 'smooth_loss'

Versions

tsai       : 0.2.18
fastai     : 2.4.1
fastcore   : 1.3.20
torch      : 1.9.0+cu102

Link to colab

https://colab.research.google.com/drive/1lVgDLSv03YQuIdF7Q8SSaf5xRMlcdMeY?usp=sharing
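Until a fix lands, one common pattern in fastai is to keep training-only callbacks off the learner and pass them to the fit call instead (the fit and fit_one_cycle signatures in the traceback above both accept a cbs argument), so lr_find never triggers them. The toy class below only mimics that idea; it is not the tsai API:

```python
class ToyLearner:
    """Toy model of fastai's callback timing, for illustration only.

    Callbacks attached at construction fire during lr_find too, while
    callbacks passed to fit() are active only for that fit.
    """
    def __init__(self, cbs=None):
        self.cbs = list(cbs or [])   # always-on callbacks
        self.events = []

    def lr_find(self):
        # lr_find runs a short fit; construction-time callbacks fire here
        for cb in self.cbs:
            self.events.append(('lr_find', cb))

    def fit(self, cbs=None):
        # fit-time callbacks are active only during this call
        for cb in self.cbs + list(cbs or []):
            self.events.append(('fit', cb))

learn = ToyLearner()                 # no training-only callbacks attached
learn.lr_find()                      # nothing fires during lr_find
learn.fit(cbs=['ShowGraph', 'MVP'])  # callbacks active only while fitting
```

Applied to the snippet in this thread, that would mean creating the learner without ShowGraph/MVP, running lr_find, then passing those callbacks to fit_one_cycle via cbs=.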

oguiza commented 3 years ago

Hi, I think I've fixed the issue now. It'd be great if you could check that it works.

dnth commented 3 years ago

I confirm that it works now! Thanks, @oguiza!

oguiza commented 3 years ago

Thank you @dnth for creating this issue!