sktime / pytorch-forecasting

Time series forecasting with PyTorch
https://pytorch-forecasting.readthedocs.io/
MIT License

Cannot create a consistent method resolution order (MRO) for bases Callback, PyTorchLightningPruningCallback #1468

Open ZhangYH2020 opened 11 months ago

ZhangYH2020 commented 11 months ago

I want to import `optimize_hyperparameters` using:

```python
from pytorch_forecasting.models.temporal_fusion_transformer.tuning import optimize_hyperparameters
```

But it raises an error: `Cannot create a consistent method resolution order (MRO) for bases Callback, PyTorchLightningPruningCallback`
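For context, here is a generic sketch of how this error arises (placeholder class names, not the actual pytorch-forecasting/optuna classes): Python raises this `TypeError` whenever a class's bases cannot be linearized consistently, for example when a base class is listed before its own subclass:

```python
# Generic illustration of how Python's C3 linearization produces this error;
# Base/Callback/Pruning here are placeholders, not the real library classes.
class Base:
    pass

class Callback(Base):
    pass

try:
    # Listing Base before its own subclass Callback makes a consistent
    # MRO impossible, so Python raises TypeError at class-creation time.
    class Pruning(Base, Callback):
        pass
except TypeError as exc:
    print(exc)  # Cannot create a consistent method resolution order (MRO) ...
```

The library hits the same situation because the two callback base classes end up in an order the C3 algorithm cannot reconcile.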

sunjin7725 commented 11 months ago

Same problem in Google Colab! Did you solve it, @ZhangYH2020?

Logfather commented 11 months ago

Today I had the same issue.

kkckk1110 commented 10 months ago

I have the same error. Did you fix it? @ZhangYH2020

Logfather commented 10 months ago

Hi Zhang, sorry, no fix yet.

ZhangYH2020 commented 10 months ago

> Same problem in Google Colab! Did you solve it, @ZhangYH2020?

No, in the end I deleted this code and changed the hyperparameters manually.

ZhangYH2020 commented 10 months ago

> I have the same error. Did you fix it? @ZhangYH2020

No, in the end I deleted this code and changed the hyperparameters manually.

ruuttt commented 10 months ago

For me it worked to downgrade to optuna 3.4 and PyTorch 2.0.1:

```shell
pip install torch==2.0.1 pytorch-lightning==2.0.2 pytorch_forecasting==1.0.0 torchaudio==2.0.2 torchdata==0.6.1 torchtext==0.15.2 torchvision==0.15.2 optuna==3.4
```

AjinkyaBankar commented 10 months ago

It worked for me after downgrading to optuna==3.4.0 and torch==2.1.0.

XinyuWuu commented 7 months ago

The import below gave me the same error:

```python
from pytorch_forecasting.models.temporal_fusion_transformer.tuning import (
    optimize_hyperparameters,
)
```

Packages (Python 3.10.14 (main, Mar 21 2024, 16:24:04) [GCC 11.2.0] on Linux):

```
optuna                       3.6.0
optuna-integration           3.6.0
pytorch-forecasting          1.0.0
pytorch-lightning            2.2.1
pytorch_optimizer            2.12.0
torch                        2.2.1
```

narencastellon commented 4 months ago

Hello! I also have the same problem; it still hasn't been resolved. Is there a way to solve it?

ari62 commented 4 months ago

I didn't need to downgrade torch; downgrading optuna worked for me:

```shell
pip install optuna==3.4.0
```

My torch is at 2.3.1.
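If downgrading isn't an option, one defensive pattern (my own sketch, not something from this thread) is to guard the import and fall back to manually chosen hyperparameters, as @ZhangYH2020 ended up doing:

```python
def safe_import_tuning():
    """Return optimize_hyperparameters, or None when the import fails
    (either the MRO TypeError from this issue or a missing package)."""
    try:
        from pytorch_forecasting.models.temporal_fusion_transformer.tuning import (
            optimize_hyperparameters,
        )
        return optimize_hyperparameters
    except (TypeError, ImportError):
        return None

tuner = safe_import_tuning()
if tuner is None:
    # Fall back to hand-picked hyperparameters instead of automated tuning.
    pass
```

This keeps the rest of a training script usable even in environments where the callback classes conflict.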

aleksmaksimovic commented 3 months ago

> I didn't need to downgrade torch; downgrading optuna worked for me: `pip install optuna==3.4.0` — my torch is at 2.3.1.

Thank you, that worked for me!