Open · dr-upsilon opened 1 month ago
Maybe a question for @jdb78.
@dr-upsilon, is this happening in a way that is duplicative with lightning core?
Yes, I think so. BaseModel inherits from HyperparametersMixin from lightning.pytorch.core.mixins.hparams_mixin.
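A quick sanity check (a minimal sketch, assuming the unified `lightning >= 2.x` import paths) shows that LightningModule already derives from that mixin, so inheriting from it again in BaseModel adds nothing:

```python
# Minimal check, assuming the unified `lightning` package (lightning >= 2.x):
# LightningModule already has HyperparametersMixin among its base classes,
# so re-inheriting it in BaseModel is redundant.
from lightning.pytorch import LightningModule
from lightning.pytorch.core.mixins.hparams_mixin import HyperparametersMixin

# Prints True: the mixin is already part of LightningModule's MRO.
print(issubclass(LightningModule, HyperparametersMixin))
```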
Looked through this multiple times and I cannot come up with a good reason (without hearing @jdb78's original rationale).
I suppose it is something that changed between different versions of lightning, so it made sense once but is now duplicative.
Would you like to contribute a PR?
```
C:\...miniconda3\envs\envpt\Lib\site-packages\lightning\pytorch\utilities\parsing.py:208: Attribute 'logging_metrics' is an instance of `nn.Module` and is already saved during checkpointing. It is recommended to ignore them using `self.save_hyperparameters(ignore=['logging_metrics'])`.
```
This is caused by `self.save_hyperparameters()` in the `__init__` method of TemporalFusionTransformer, because `save_hyperparameters()` uses `inspect` and the calling frame to identify all the hyperparameters. What's the reason to keep it, or shall we add handling in `__init__`?
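For illustration, a hedged sketch of the kind of `__init__` handling the warning itself recommends (`MyModel` is hypothetical here, not the actual TemporalFusionTransformer code): exclude `nn.Module` attributes such as `logging_metrics` from the saved hyperparameters.

```python
from typing import Optional

import torch.nn as nn
from lightning.pytorch import LightningModule


class MyModel(LightningModule):  # hypothetical model, for illustration only
    def __init__(self, hidden_size: int = 16,
                 logging_metrics: Optional[nn.ModuleList] = None):
        super().__init__()
        self.logging_metrics = (
            nn.ModuleList() if logging_metrics is None else logging_metrics
        )
        # nn.Module attributes are already persisted via the checkpoint's
        # state_dict, so exclude them from hparams to avoid duplication and
        # silence the warning quoted above.
        self.save_hyperparameters(ignore=["logging_metrics"])
```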