If you are in a hurry, you can work around this by rewriting the following lines so that IS_AMP_AVAILABLE = False:
https://github.com/davidtvs/pytorch-lr-finder/blob/acc5e7ee7711a460bf3e1cc5c5f05575ba1e1b4b/torch_lr_finder/lr_finder.py#L14-L19
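If you would rather not edit the installed package, here is a minimal monkey-patch sketch of the same workaround; it assumes the module-level flag is named IS_AMP_AVAILABLE, as in the linked file:

```python
# Minimal sketch: force-disable apex AMP by overriding the module-level
# flag before using LRFinder. Assumes the flag is named IS_AMP_AVAILABLE,
# as in the linked lr_finder.py.
import torch_lr_finder.lr_finder as lr_finder_module

lr_finder_module.IS_AMP_AVAILABLE = False  # behave as if apex were absent

from torch_lr_finder import LRFinder
# lr_finder = LRFinder(model, optimizer, criterion, device="cuda")
```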
I'll submit a PR in the next few days, because I need to take a few items into consideration, including the earlier issue #67 in which you also commented. Sorry for the inactivity and inconvenience.
Hi @carbocation. A question just came up: are you trying to use LRFinder with AMP disabled while wrapping the model and optimizer with apex.amp.initialize()? I'm curious about this use case, because apex.amp.scale_loss() should do nothing if the optimizer is not wrapped by apex.amp.initialize() (see also the code below).
https://github.com/davidtvs/pytorch-lr-finder/blob/acc5e7ee7711a460bf3e1cc5c5f05575ba1e1b4b/torch_lr_finder/lr_finder.py#L383-L387
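For reference, the linked lines boil down to roughly the following (a paraphrase of the source at that commit, not a verbatim copy):

```python
# Paraphrase of the backward pass in lr_finder.py at the linked commit.
if IS_AMP_AVAILABLE:
    # apex.amp.scale_loss() is effectively a no-op context manager when
    # the optimizer was never wrapped by apex.amp.initialize().
    with amp.scale_loss(loss, self.optimizer) as scaled_loss:
        scaled_loss.backward()
else:
    loss.backward()
```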
Anyway, I'm working on integrating torch.amp and apex.amp, so that we can determine which one to use by explicitly specifying the backend. And thanks for bringing this issue up.
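Roughly along these lines (a hypothetical sketch of the planned interface; the amp_backend parameter name and its values are assumptions, not the final API):

```python
from torch_lr_finder import LRFinder

# Hypothetical: explicitly pick the AMP backend, or disable AMP entirely.
# The "amp_backend" parameter is an assumed name for illustration only.
lr_finder = LRFinder(
    model,
    optimizer,
    criterion,
    device="cuda",
    amp_backend=None,  # None -> plain FP32; "torch" or "apex" otherwise
)
```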
Apologies, I might have misunderstood the code. I am not wrapping my model or optimizer with any apex.amp initializer. (I sometimes use the built-in PyTorch AMP, but that does not require wrapping the model.) So perhaps I interpreted this completely backwards: I thought AMP was always enabled by default and was hoping for a way to disable it, but perhaps it was never enabled for me since I don't use apex.
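For context, the built-in AMP mentioned above follows the standard torch.cuda.amp pattern, which never wraps the model (model, optimizer, loader, and criterion below are placeholders):

```python
import torch

scaler = torch.cuda.amp.GradScaler()

for inputs, targets in loader:
    optimizer.zero_grad()
    # autocast covers only the forward pass; the model itself is untouched,
    # unlike apex.amp.initialize(), which rewraps model and optimizer.
    with torch.cuda.amp.autocast():
        outputs = model(inputs)
        loss = criterion(outputs, targets)
    scaler.scale(loss).backward()
    scaler.step(optimizer)
    scaler.update()
```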
It's okay. Hopefully this helps clarify things if you run into problems while using AMP.
Closing; I believe #91 addressed this issue.
I'd like to be able to use this library without AMP, even when AMP is available. From what I can tell, if AMP is available this library uses it. It would be nice to have an override in the constructor so I can disable AMP even though it is available on my system.