jrzaurin / pytorch-widedeep

A flexible package for multimodal-deep-learning to combine tabular data with text and images using Wide and Deep models in Pytorch
Apache License 2.0

ImportError: cannot import name 'LRScheduler' from 'torch.optim.lr_scheduler' #192

Closed Nineves closed 1 year ago

Nineves commented 1 year ago
File c:\Users\evely\anaconda3\envs\viper\lib\site-packages\pytorch_widedeep\wdtypes.py:61
     23 from torchvision.transforms import (
     24     Pad,
     25     Lambda,
   (...)
     58     RandomAdjustSharpness,
     59 )
     60 from torchvision.models._api import WeightsEnum
---> 61 from torch.optim.lr_scheduler import LRScheduler
     62 from torch.utils.data.dataloader import DataLoader
     64 from pytorch_widedeep.models import (
     65     SAINT,
     66     TabMlp,
   (...)
     78     ContextAttentionMLP,
     79 )

After I changed from torch.optim.lr_scheduler import LRScheduler to from torch.optim.lr_scheduler import _LRScheduler as LRScheduler, the error disappeared.
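
In other words, the local workaround is editing the import at line 61 of pytorch_widedeep/wdtypes.py shown in the traceback above:

# pytorch_widedeep/wdtypes.py, line 61 (local workaround)
# from torch.optim.lr_scheduler import LRScheduler            # original line, fails on torch < 2.0
from torch.optim.lr_scheduler import _LRScheduler as LRScheduler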

jrzaurin commented 1 year ago

Which version of pytorch and our library are you using?

Nineves commented 1 year ago

Pytorch version: 1.13.1
Pytorch-widedeep version: 1.3.2

jrzaurin commented 1 year ago

Yeah I thought so :)

Could you please update to PyTorch version >= 2? That will solve the problem.

LuoXueling commented 1 year ago

I have also encountered this issue because requirements.txt does not restrict torch>=2.0 (which would admittedly be quite a strict requirement). Should it be restricted?

An alternative could be a global modification in __init__ (which I have used for compatibility, and it works well with torch==1.13.x):

import torch
from packaging import version

if version.parse(torch.__version__) < version.parse("2.0.0"):
    from torch.optim.lr_scheduler import _LRScheduler

    # expose the old private class under the public name used by torch >= 2.0,
    # so that "from torch.optim.lr_scheduler import LRScheduler" keeps working
    torch.optim.lr_scheduler.LRScheduler = _LRScheduler

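Note that the patch only helps if it runs before pytorch_widedeep is imported, because the failing import in wdtypes.py executes at import time; a minimal sketch of the ordering in an entry script (the model import is just an example):

import torch
from packaging import version

# compatibility shim: must come before importing pytorch_widedeep
if version.parse(torch.__version__) < version.parse("2.0.0"):
    from torch.optim.lr_scheduler import _LRScheduler

    torch.optim.lr_scheduler.LRScheduler = _LRScheduler

from pytorch_widedeep.models import TabMlp  # noqa: E402  (import now succeeds on torch 1.13.x)
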
jrzaurin commented 1 year ago

My understanding was that this is fixed internally in PyTorch, as it should be fully backwards compatible. In fact, I believe there is a line in the library where they address this _LRScheduler vs LRScheduler "thing".

Let me check; otherwise, open a PR with that change and I will review and merge it.
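
For reference, a quick way to check what a given torch build actually exposes, whichever way we go, is a plain attribute lookup:

import torch
from torch.optim import lr_scheduler

print(torch.__version__)
print(hasattr(lr_scheduler, "LRScheduler"))   # True only on torch >= 2.0
print(hasattr(lr_scheduler, "_LRScheduler"))  # True on 1.x, and kept on 2.x for backwards compatibility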

and thanks for reopening!

LuoXueling commented 1 year ago

I am not confident about the API change (and the reason behind it), so I think it's better to leave the decision up to you @jrzaurin. By the way, a pytorch/ignite PR (https://github.com/pytorch/ignite/pull/2780) uses a different approach from mine to import the class.

jrzaurin commented 1 year ago

So basically they do:

try:
    from torch.optim.lr_scheduler import LRScheduler as PyTorchLRScheduler  # torch >= 2.0
except ImportError:
    from torch.optim.lr_scheduler import _LRScheduler as PyTorchLRScheduler  # torch < 2.0

and I guess we could do something like:

try:
    from torch.optim.lr_scheduler import LRScheduler  # public name, torch >= 2.0
except ImportError:
    from torch.optim.lr_scheduler import _LRScheduler as LRScheduler  # fallback for torch < 2.0

I will add it and test it with an older PyTorch version asap.
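
Either way, code that only uses the name for type hints or isinstance checks should behave the same on both sides of the version boundary; a minimal sketch with a concrete scheduler (StepLR is just an example):

import torch

try:
    from torch.optim.lr_scheduler import LRScheduler  # torch >= 2.0
except ImportError:
    from torch.optim.lr_scheduler import _LRScheduler as LRScheduler  # torch < 2.0

model = torch.nn.Linear(4, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=10)

# StepLR derives from the (public or private) base class on both torch versions
assert isinstance(scheduler, LRScheduler)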

jrzaurin commented 1 year ago

hi @LuoXueling and @Nineves

Due to a number of issues encountered while testing (such as the fact that newer versions of PyTorch require different NVIDIA drivers), and given that, in principle, PyTorch 2.0+ ensures full backwards compatibility, I am going to restrict the torch version to be greater than or equal to 2.0 in the next release.

I hope this is not a major drawback, and sorry if it is an inconvenience.
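
For anyone following along, a sketch of what such a pin might look like in the packaging metadata (an illustrative setup.py fragment only, not the project's actual file; the other dependencies are omitted):

# illustrative fragment; pytorch-widedeep's real packaging files may differ
from setuptools import find_packages, setup

setup(
    name="pytorch-widedeep",
    packages=find_packages(),
    install_requires=[
        "torch>=2.0",  # LRScheduler is importable under its public name from 2.0 onwards
        # ... remaining dependencies unchanged
    ],
)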