titu1994 / keras-one-cycle

Implementation of the One-Cycle learning rate policy (adapted from the fast.ai library)
MIT License

strange LR vs loss plot #4

Closed by austinmw 5 years ago

austinmw commented 5 years ago

Hi, thanks for creating this repo! This is likely not a bug, but could you help me interpret what this LR vs. loss plot means? It looks quite different from the plots I've gotten in the past from the fast.ai library.

titu1994 commented 5 years ago

Probably a bug. What parameters were used in the callback?

austinmw commented 5 years ago

Thanks for your reply. Here are my callback args:

LRFinder(num_samples=train_generator.n,
         batch_size=batch_size,
         minimum_lr=1e-5,
         maximum_lr=10.,
         lr_scale='exp',
         validation_data=None,
         validation_sample_rate=5,
         stopping_criterion_factor=4.,
         loss_smoothing_beta=0.98,
         save_dir=out_data_path,
         verbose=True)
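For context, a range-test callback like this is normally attached to a short, dedicated fit call of its own. A minimal sketch, assuming a compiled Keras model named model and the generator-based fit API that was current at the time; the finder-only, single-epoch setup is what the thread converges on below:

    # Attach ONLY the finder and run a single epoch for the range test
    lr_finder = LRFinder(num_samples=train_generator.n,
                         batch_size=batch_size,
                         minimum_lr=1e-5,
                         maximum_lr=10.,
                         lr_scale='exp',
                         save_dir=out_data_path,
                         verbose=True)
    model.fit_generator(train_generator, epochs=1, callbacks=[lr_finder])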
titu1994 commented 5 years ago

Are you using the cyclic rate callback while you are finding the LR? They are two different steps.

From the plot it looks like the LR begins to increase and then decreases again halfway through.

austinmw commented 5 years ago

Ah, yes, I had both callbacks on at the same time. I should turn the one-cycle policy off while running the LR finder for a single epoch, right?

titu1994 commented 5 years ago

Didn't get a notification for the last update. Yes, you need to perform the LR-finding stage first, without the cyclic rate callback.
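To make the split concrete: run the finder stage shown earlier on its own, then start a fresh training run with only the one-cycle callback attached. A minimal sketch of the second stage; the OneCycleLR constructor arguments and the max_lr value here are assumptions, so check the repo's clr.py for the exact signature:

    from clr import OneCycleLR  # assumes the repo's clr.py is importable

    # Stage 2: after reading max_lr off the finder's loss vs. LR plot,
    # train with ONLY the one-cycle callback attached.
    one_cycle = OneCycleLR(max_lr=0.01,  # hypothetical value taken from the plot
                           maximum_momentum=0.95,
                           minimum_momentum=0.85)
    model.fit_generator(train_generator, epochs=25, callbacks=[one_cycle])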