Nixtla / neuralforecast

Scalable and user-friendly neural :brain: forecasting algorithms.
https://nixtlaverse.nixtla.io/neuralforecast
Apache License 2.0

TypeError: Trainer.__init__() got an unexpected keyword argument 'input_size_multiplier' #930

Closed · pranavvp16 closed this 6 months ago

pranavvp16 commented 6 months ago

What happened + What you expected to happen

I tried to update the default config for the Auto model AutoLSTM as described in issue #924, but the model fails to train when model.fit is called. It seems the config is passed straight through to the Trainer from PyTorch Lightning. I got the same error when I updated max_steps, so I suspect that updating any key of the default config and passing it back to the model does not work correctly. Here is a notebook illustrating the issue.

Versions / Dependencies

neuralforecast == 1.6.4

Reproduction script

from neuralforecast.auto import AutoLSTM
from neuralforecast.tsdataset import TimeSeriesDataset
from neuralforecast.utils import AirPassengersDF as Y_df
from ray import tune

# Start from the default search space and override a single key.
config = AutoLSTM.default_config
config['learning_rate'] = tune.choice([1e-2, 1e-3])

model = AutoLSTM(h=1, config=config, num_samples=10)

# AirPassengers: 132 monthly observations for training, 12 held out.
Y_train_df = Y_df[Y_df.ds <= '1959-12-31']
Y_test_df = Y_df[Y_df.ds > '1959-12-31']
dataset, *_ = TimeSeriesDataset.from_df(Y_train_df)

# Fails with: TypeError: Trainer.__init__() got an unexpected keyword argument 'input_size_multiplier'
model.fit(dataset=dataset)

Issue Severity

High: It blocks me from completing my task.

elephaint commented 6 months ago

Thanks for reporting. I think input_size_multiplier currently doesn't work as a configurable hyperparameter in any case, so that is something that needs to be fixed too.

elephaint commented 6 months ago

You can fix this (monkey patch; a sketch follows after this list) by:

  1. Getting rid of input_size_multiplier in the config, i.e. del config['input_size_multiplier']
  2. Getting rid of inference_input_size_multiplier in the config, i.e. del config['inference_input_size_multiplier']
  3. Adding input_size and inference_input_size to the config with your values of choice.
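
A minimal sketch of those three steps, assuming the same neuralforecast 1.6.4 setup as the reproduction script above (the concrete input_size values are placeholders, not recommendations):

from neuralforecast.auto import AutoLSTM
from ray import tune

config = AutoLSTM.default_config

# Steps 1 and 2: drop the *_multiplier keys the underlying model can't handle.
del config['input_size_multiplier']
del config['inference_input_size_multiplier']

# Step 3: supply their 'normal' counterparts directly (placeholder values).
config['input_size'] = tune.choice([4, 8, 16])
config['inference_input_size'] = tune.choice([4, 8, 16])

model = AutoLSTM(h=1, config=config, num_samples=10)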

It's a bit involved, unfortunately. The default_config takes the user-supplied value for the horizon and uses it to calculate default values for some parameters. It makes use of these *_multiplier params for that, but they are not params that the underlying model can handle. So when you extract the default_config, these *_multiplier params need to be removed from the config and replaced by their 'normal' counterparts. Not ideal; it's something we have to think about how/if to fix.
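
For illustration, the same translation could be done generically. The sanitize_config helper below is hypothetical (it is not part of neuralforecast), and it assumes the multipliers are stored as plain lists of ints meant to scale the horizon h, which is my reading of the explanation above:

from ray import tune

def sanitize_config(config: dict, h: int) -> dict:
    # Hypothetical helper: replace every *_multiplier key with its
    # 'normal' counterpart, derived from the user-supplied horizon h.
    # Assumes each multiplier entry is a plain int in a plain list.
    out = dict(config)
    for key in [k for k in out if k.endswith('_multiplier')]:
        multipliers = out.pop(key)
        # e.g. 'input_size_multiplier' -> 'input_size'
        out[key[:-len('_multiplier')]] = tune.choice([m * h for m in multipliers])
    return out

With that, config = sanitize_config(AutoLSTM.default_config, h=1) would yield a config free of the offending keys.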

yarnabrina commented 6 months ago

I think this issue is related to the other documentation issue I created this morning: #929. That one is blocking me for the Auto* adapter, and I guess this one will block @pranavvp16 from inheriting it for the AutoLSTM interface in sktime.

Regarding the default config update, would it help to have a separate dataclass/pydantic.BaseModel etc. per model to validate and update hyperparameters?
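
As a sketch of that suggestion (a hypothetical class with a hypothetical field set, not an actual neuralforecast API; pydantic v2 syntax), a per-model schema could validate a resolved config and reject unknown keys like input_size_multiplier before they ever reach the Trainer:

from pydantic import BaseModel, ConfigDict, PositiveInt

class LSTMSearchSpace(BaseModel):
    # extra='forbid' makes any key the LSTM model does not accept fail
    # fast with a ValidationError, instead of surfacing later as a
    # TypeError inside pytorch-lightning's Trainer.__init__.
    model_config = ConfigDict(extra='forbid')

    input_size: PositiveInt
    inference_input_size: PositiveInt
    learning_rate: float
    max_steps: PositiveInt = 1000

This would validate sampled values rather than the ray.tune search space itself, but the same idea could be extended to search-space objects.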