Closed austinmw closed 5 years ago
Probably a bug. What parameters were used in the callback?
Thanks for your reply. Here's my callback args:
LRFinder(num_samples=train_generator.n,
         batch_size=batch_size,
         minimum_lr=1e-5,
         maximum_lr=10.,
         lr_scale='exp',
         validation_sample_rate=5,
         stopping_criterion_factor=4.,
         loss_smoothing_beta=0.98,
         save_dir=out_data_path,
         verbose=True)
Are you using the cyclic rate callback while you are finding the LR? They are two different steps.
From the plot it looks like the LR is beginning to increase and then decreases halfway through again.
Ah, yes, I had both callbacks on at the same time. I should turn the one-cycle policy callback off while running the LR finder for a single epoch, right?
Didn't get a notification for the last update. Yes, you need to perform the LR-finding stage first, without the cyclic rate callback.
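For anyone landing here later: the schedule behind an LR range test like this can be sketched in plain Python. With `lr_scale='exp'`, the learning rate is swept exponentially from `minimum_lr` to `maximum_lr` over one epoch's batches while the (smoothed) loss is recorded at each step. This is a minimal sketch of the schedule and smoothing math only, under my own reading of those parameters, not the repo's actual implementation; function names here are illustrative.

```python
import math

def lr_schedule(step, total_steps, minimum_lr=1e-5, maximum_lr=10.0):
    """Exponential LR sweep for an LR range test (lr_scale='exp').

    Step 0 yields minimum_lr; the final step yields maximum_lr.
    """
    frac = step / max(total_steps - 1, 1)
    return minimum_lr * (maximum_lr / minimum_lr) ** frac

def smooth_losses(losses, beta=0.98):
    """Exponentially smoothed, bias-corrected losses (loss_smoothing_beta)."""
    avg, out = 0.0, []
    for i, loss in enumerate(losses, start=1):
        avg = beta * avg + (1.0 - beta) * loss
        out.append(avg / (1.0 - beta ** i))  # bias correction for early steps
    return out

# The sweep spans the whole LR range within a single epoch:
total_steps = 100
lrs = [lr_schedule(s, total_steps) for s in range(total_steps)]
```

The key point for this issue: run this sweep on its own epoch first, plot smoothed loss against LR, and only afterwards train with the cyclic/one-cycle callback; mixing the two schedules produces the rising-then-falling LR curve seen in the plot.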
Hi, thanks for creating this repo! This is likely not a bug, but I'm curious whether you could interpret what this LR vs. loss plot means. It looks quite different from the plots I've gotten in the past from the fastai library.