Closed Nineves closed 1 year ago
Which version of pytorch and our library are you using?
> Which version of pytorch and our library are you using?

PyTorch version: 1.13.1
pytorch-widedeep version: 1.3.2
Yeah I thought so :)
Could you please update to PyTorch version >= 2? That will solve the problem.
I have also encountered this issue, since requirements.txt does not pin torch>=2.0 (which would be quite a strict constraint). Should it be restricted?
An alternative could be a global modification in `__init__.py` (which I have used for compatibility with torch==1.13.x, and it works well):

```python
import torch
from packaging import version

# On PyTorch < 2.0 the public name LRScheduler does not exist yet,
# so alias the private _LRScheduler to it.
if version.parse(torch.__version__) < version.parse("2.0.0"):
    from torch.optim.lr_scheduler import _LRScheduler

    torch.optim.lr_scheduler.LRScheduler = _LRScheduler
```
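For context on why `packaging.version.parse` is used here rather than comparing the version strings directly: version strings do not order correctly as plain strings (lexicographically, "1.9" sorts after "1.13"). A minimal sketch of the idea, with a hypothetical `parse_version` helper that is illustrative only and not PEP 440 compliant:

```python
def parse_version(v: str) -> tuple:
    # Hypothetical minimal parser: turns "1.13.1" into (1, 13, 1).
    # It drops local version suffixes such as "+cu118"; real code should
    # use packaging.version, which also handles pre-releases etc.
    return tuple(int(part) for part in v.split("+")[0].split(".") if part.isdigit())

# String comparison orders versions incorrectly; numeric tuples do not.
assert "1.9.0" > "1.13.1"  # lexicographic comparison: wrong ordering
assert parse_version("1.9.0") < parse_version("1.13.1")
assert parse_version("1.13.1") < parse_version("2.0.0")
```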
My understanding was that this is fixed internally in PyTorch, since it should be fully backwards compatible. In fact, I believe there is a line in the library where they address this _LRScheduler vs LRScheduler "thing".

Let me check; otherwise, open a PR with that change and I will review and merge it.

And thanks for reopening!
I am not confident about the API change (and its reason), so I think it's better to leave the decision up to you @jrzaurin. By the way, a PR (https://github.com/pytorch/ignite/pull/2780) of pytorch/ignite uses a different way from mine to import the class
So basically they do:

```python
try:
    from torch.optim.lr_scheduler import LRScheduler as PyTorchLRScheduler
except ImportError:
    from torch.optim.lr_scheduler import _LRScheduler as PyTorchLRScheduler
```
and I guess we could do something like:

```python
try:
    # PyTorch >= 2.0 exposes the public LRScheduler name
    from torch.optim.lr_scheduler import LRScheduler
except ImportError:
    # PyTorch < 2.0 only has the private _LRScheduler
    from torch.optim.lr_scheduler import _LRScheduler as LRScheduler
```
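As an aside, this try/except fallback-import pattern is easy to check in isolation with a stdlib name (the example below is purely illustrative and does not come from pytorch-widedeep or torch):

```python
# Same shape as the torch shim above, using a stdlib name so it runs
# without torch installed: prefer the newer name, and derive it
# ourselves when the import fails.
try:
    from math import tau as TWO_PI  # available on Python >= 3.6
except ImportError:
    from math import pi
    TWO_PI = 2 * pi  # fallback for very old interpreters

# Either branch leaves TWO_PI bound to the same value, just as
# LRScheduler ends up bound on both sides of the torch 2.0 boundary.
```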
I will add it and test it with older PyTorch versions ASAP.
hi @LuoXueling and @Nineves
Due to a series of issues encountered while testing (like the fact that newer versions of PyTorch require different NVIDIA drivers), and the fact that, in principle, PyTorch 2.0+ ensures full backwards compatibility, I am going to restrict the torch version to be equal to or greater than 2.0 in the next release.

I hope this is not a major drawback, and sorry if it is an inconvenience.
After I changed `from torch.optim.lr_scheduler import LRScheduler` to `from torch.optim.lr_scheduler import _LRScheduler as LRScheduler`, the error disappeared.