unit8co / darts

A python library for user-friendly forecasting and anomaly detection on time series.
https://unit8co.github.io/darts/
Apache License 2.0

AttributeError: 'PastCyclicEncoder' object has no attribute 'tz' #2144

Closed · hberande closed this issue 8 months ago

hberande commented 8 months ago

I am trying to load a saved model in the latest Darts version (0.27.1) that was trained with version 0.25.0 or 0.26.0, and I am getting the following error. Please help me resolve it.

AttributeError                            Traceback (most recent call last)
Cell In[17], line 180
    176 cov_scaled    = scaler_WS.fit_transform(WS_series)       
    178 ###########################################(Forecasting) #########################################
--> 180 pred          = model_nbeats.predict(series = series_scaled, n = 18, past_covariates = cov_scaled)
    181 pred          = scaler_power.inverse_transform(pred)
    182 df_pred       = pred.pd_dataframe()

File ~\AppData\Local\anaconda3\envs\Darts\Lib\site-packages\darts\utils\torch.py:112, in random_method.<locals>.decorator(self, *args, **kwargs)
    110 with fork_rng():
    111     manual_seed(self._random_instance.randint(0, high=MAX_TORCH_SEED_VALUE))
--> 112     return decorated(self, *args, **kwargs)

File ~\AppData\Local\anaconda3\envs\Darts\Lib\site-packages\darts\models\forecasting\torch_forecasting_model.py:1344, in TorchForecastingModel.predict(self, n, series, past_covariates, future_covariates, trainer, batch_size, verbose, n_jobs, roll_size, num_samples, num_loader_workers, mc_dropout, predict_likelihood_parameters, show_warnings)
   1340 # encoders are set when calling fit(), but not when calling fit_from_dataset()
   1341 # when covariates are loaded from model, they already contain the encodings: this is not a problem as the
   1342 # encoders regenerate the encodings
   1343 if self.encoders is not None and self.encoders.encoding_available:
-> 1344     past_covariates, future_covariates = self.generate_predict_encodings(
   1345         n=n,
   1346         series=series,
   1347         past_covariates=past_covariates,
   1348         future_covariates=future_covariates,
   1349     )
   1350 super().predict(
   1351     n,
   1352     series,
   (...)
   1357     show_warnings=show_warnings,
   1358 )
   1360 dataset = self._build_inference_dataset(
   1361     target=series,
   1362     n=n,
   (...)
   1366     bounds=None,
   1367 )

File ~\AppData\Local\anaconda3\envs\Darts\Lib\site-packages\darts\models\forecasting\forecasting_model.py:1778, in ForecastingModel.generate_predict_encodings(self, n, series, past_covariates, future_covariates)
   1748 """Generates covariate encodings for the inference/prediction set and returns a tuple of past, and future
   1749 covariates series with the original and encoded covariates stacked together. The encodings are generated by the
   1750 encoders defined at model creation with parameter `add_encoders`. Pass the same `series`, `past_covariates`,
   (...)
   1770     encoded covariates.
   1771 """
   1772 raise_if(
   1773     self.encoders is None or not self.encoders.encoding_available,
   1774     "Encodings are not available. Consider adding parameter `add_encoders` at model creation and fitting the "
   1775     "model with `model.fit()` before.",
   1776     logger=logger,
   1777 )
-> 1778 return self.encoders.encode_inference(
   1779     n=n,
   1780     target=series,
   1781     past_covariates=past_covariates,
   1782     future_covariates=future_covariates,
   1783 )

File ~\AppData\Local\anaconda3\envs\Darts\Lib\site-packages\darts\dataprocessing\encoders\encoders.py:1073, in SequentialEncoder.encode_inference(self, n, target, past_covariates, future_covariates, encode_past, encode_future)
   1042 """Returns encoded index for all past and/or future covariates for inference/prediction.
   1043 Which covariates are generated depends on the parameters used at model creation.
   1044 
   (...)
   1065     for the {x}_covariates.
   1066 """
   1067 raise_if(
   1068     not self.fit_called and self.requires_fit,
   1069     f"`{self.__class__.__name__}` contains encoders or transformers which must be trained before inference. "
   1070     "Call method `encode_train()` before `encode_inference()`.",
   1071     logger=logger,
   1072 )
-> 1073 return self._launch_encoder(
   1074     target=target,
   1075     past_covariates=past_covariates,
   1076     future_covariates=future_covariates,
   1077     encoder_method=_EncoderMethod("inference"),
   1078     n=n,
   1079     encode_past=encode_past,
   1080     encode_future=encode_future,
   1081 )

File ~\AppData\Local\anaconda3\envs\Darts\Lib\site-packages\darts\dataprocessing\encoders\encoders.py:1165, in SequentialEncoder._launch_encoder(self, target, past_covariates, future_covariates, encoder_method, n, encode_past, encode_future)
   1163 # generate past covariates encodings
   1164 if self.past_encoders and encode_past:
-> 1165     past_covariates = self._encode_sequence(
   1166         encoders=self.past_encoders,
   1167         transformer=self.past_transformer,
   1168         target=target,
   1169         covariates=past_covariates,
   1170         covariates_type=PAST,
   1171         encoder_method=encoder_method,
   1172         n=n,
   1173     )
   1175 # generate future covariates encodings
   1176 if self.future_encoders and encode_future:

File ~\AppData\Local\anaconda3\envs\Darts\Lib\site-packages\darts\dataprocessing\encoders\encoders.py:1223, in SequentialEncoder._encode_sequence(self, encoders, transformer, target, covariates, covariates_type, encoder_method, n)
   1216 for ts, covs in zip(target, covariates):
   1217     # drop encoder components if they are in input covariates
   1218     covs = self._drop_encoded_components(
   1219         covariates=covs,
   1220         components=getattr(self, f"{covariates_type}_components"),
   1221     )
   1222     encoded = concatenate(
-> 1223         [
   1224             getattr(enc, encode_method)(
   1225                 target=ts, covariates=covs, merge_covariates=False, n=n
   1226             )
   1227             for enc in encoders
   1228         ],
   1229         axis=DIMS[1],
   1230     )
   1231     encoded_sequence.append(
   1232         self._merge_covariates(encoded=encoded, covariates=covs)
   1233     )
   1235 if transformer is not None:

File ~\AppData\Local\anaconda3\envs\Darts\Lib\site-packages\darts\dataprocessing\encoders\encoders.py:1224, in <listcomp>(.0)
   1216 for ts, covs in zip(target, covariates):
   1217     # drop encoder components if they are in input covariates
   1218     covs = self._drop_encoded_components(
   1219         covariates=covs,
   1220         components=getattr(self, f"{covariates_type}_components"),
   1221     )
   1222     encoded = concatenate(
   1223         [
-> 1224             getattr(enc, encode_method)(
   1225                 target=ts, covariates=covs, merge_covariates=False, n=n
   1226             )
   1227             for enc in encoders
   1228         ],
   1229         axis=DIMS[1],
   1230     )
   1231     encoded_sequence.append(
   1232         self._merge_covariates(encoded=encoded, covariates=covs)
   1233     )
   1235 if transformer is not None:

File ~\AppData\Local\anaconda3\envs\Darts\Lib\site-packages\darts\dataprocessing\encoders\encoder_base.py:739, in SingleEncoder.encode_inference(self, n, target, covariates, merge_covariates, **kwargs)
    735 # generate index and encodings
    736 index, target_end = self.index_generator.generate_inference_idx(
    737     n, target, covariates
    738 )
--> 739 encoded = self._encode(index, target_end, target.dtype)
    741 # optionally, merge encodings with original `covariates` series
    742 encoded = (
    743     self._merge_covariates(encoded, covariates=covariates)
    744     if merge_covariates
    745     else encoded
    746 )

File ~\AppData\Local\anaconda3\envs\Darts\Lib\site-packages\darts\dataprocessing\encoders\encoders.py:245, in CyclicTemporalEncoder._encode(self, index, target_end, dtype)
    234 """applies cyclic encoding from `datetime_attribute_timeseries()` to `self.attribute` of `index`."""
    235 super()._encode(index, target_end, dtype)
    236 return datetime_attribute_timeseries(
    237     index,
    238     attribute=self.attribute,
    239     cyclic=True,
    240     dtype=dtype,
    241     with_columns=[
    242         self.base_component_name + self.attribute + "_sin",
    243         self.base_component_name + self.attribute + "_cos",
    244     ],
--> 245     tz=self.tz,
    246 )

AttributeError: 'PastCyclicEncoder' object has no attribute 'tz'
dennisbader commented 8 months ago

Hi @hberande, we can't always guarantee backwards compatibility for stored models (see here).
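The failure mode here is not specific to darts: when a class gains a new attribute between library versions, instances pickled under the old version are restored with their old `__dict__` and never acquire the new attribute, because `__init__` is not re-run on unpickling. A minimal stand-alone sketch (the `CyclicEncoder` class and `encode` method below are hypothetical stand-ins, not darts internals):

```python
import pickle

class CyclicEncoder:
    """Hypothetical stand-in for an encoder class in an 'old' library version."""
    def __init__(self, attribute):
        self.attribute = attribute  # the only attribute the old version sets

# serialize an instance created by the "old" version of the class
blob = pickle.dumps(CyclicEncoder("hour"))

# simulate a library upgrade: new code reads a `tz` attribute
# that newer __init__ would set, but old pickles never stored
def encode(self):
    return (self.attribute, self.tz)  # assumes `tz` exists

CyclicEncoder.encode = encode

# unpickling restores the saved __dict__ directly; __init__ is NOT
# called again, so the old instance has no `tz` attribute
old = pickle.loads(blob)
try:
    old.encode()
except AttributeError as e:
    print(e)  # 'CyclicEncoder' object has no attribute 'tz'
```

This is why loading the raw weights into a freshly constructed model (which creates its encoders with the current class definitions) can work when loading the whole pickled model does not.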

You could try to load only the weights from your checkpoint or manual save:

For this to work, you first need to create a new model object with the same hyperparameters as the stored one, so that the architecture can be reproduced. If you don't remember them, the error messages should help you set them properly.

# recreate the model from scratch, with the same
# hyperparameters as the stored model
model_new = NBEATSModel(
   ... # same parameters
)

# load only the weights from the automatic checkpoint
# (or from a manual save, with `model_new.load_weights(...)`)
model_new.load_weights_from_checkpoint(...)

You might have to play around with the load_encoders parameter when calling the load_weights_*() methods.

Hope it helps.

dennisbader commented 8 months ago

Hi @hberande, did the proposed solution resolve your issue?