sktime / pytorch-forecasting

Time series forecasting with PyTorch
https://pytorch-forecasting.readthedocs.io/
MIT License

Is there a reason that pytorch-forecasting seemingly unnecessarily saves loss and logging_metrics multiple times? #1700

Open dr-upsilon opened 1 month ago

dr-upsilon commented 1 month ago

C:\...miniconda3\envs\envpt\Lib\site-packages\lightning\pytorch\utilities\parsing.py:208: Attribute 'logging_metrics' is an instance of `nn.Module` and is already saved during checkpointing. It is recommended to ignore them using `self.save_hyperparameters(ignore=['logging_metrics'])`.

This is caused by `self.save_hyperparameters()` in the `__init__` method of `TemporalFusionTransformer`: `save_hyperparameters()` uses `inspect` and the caller's frame to identify all the hyperparameters, so `nn.Module` arguments such as `logging_metrics` are captured as well.
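To illustrate the mechanism (this is a minimal pure-Python sketch, not Lightning's actual implementation): a frame-based `save_hyperparameters()` grabs every local of the calling `__init__`, which is why `nn.Module` arguments like `loss` and `logging_metrics` end up in the hyperparameters unless explicitly ignored.

```python
import inspect

class HParamsMixinSketch:
    """Hypothetical stand-in for Lightning's HyperparametersMixin,
    showing the frame-inspection idea only."""

    def save_hyperparameters(self, ignore=()):
        # Look at the caller's frame (the subclass __init__) and collect
        # its local variables, which at that point are the init arguments.
        frame = inspect.currentframe().f_back
        self.hparams = {
            name: value
            for name, value in frame.f_locals.items()
            if name != "self" and name not in ignore
        }

class Model(HParamsMixinSketch):
    def __init__(self, hidden_size, loss, logging_metrics):
        # Without ignore=..., loss and logging_metrics would be stored
        # both as hyperparameters and in the module's state dict.
        self.save_hyperparameters(ignore=["loss", "logging_metrics"])

m = Model(hidden_size=16, loss="loss-module", logging_metrics=[])
print(sorted(m.hparams))  # only hidden_size survives the ignore list
```

Passing `ignore=[...]` as in the warning message would be the corresponding fix in the real `__init__`.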

What is the reason to keep this behaviour, or shall we add handling in `__init__`?

fkiraly commented 1 month ago

Maybe a question for @jdb78.

@dr-upsilon, is this happening in a way that is duplicative with lightning core?

dr-upsilon commented 1 month ago

> Maybe a question for @jdb78.
>
> @dr-upsilon, is this happening in a way that is duplicative with lightning core?

Yes, I think so. `BaseModel` inherits from `HyperparametersMixin` in `lightning.pytorch.core.mixins.hparams_mixin`.

fkiraly commented 2 weeks ago

I have looked through this multiple times and cannot come up with a good reason (without hearing @jdb78's original rationale).

I suppose it is something that changed between versions of lightning, so it made sense once but is now duplicative.

Would you like to contribute a PR?