Closed ArikReuter closed 3 months ago
This: "Trainer already configured with model summary callbacks: [<class 'lightning.pytorch.callbacks.model_summary.ModelSummary'>]. Skipping setting a default ModelSummary callback."
is expected. It just states that we use a ModelSummary with depth=2 to account for our class inheritance.
Pruning at epoch 0 is unusual. Probably the current score returned None for that specific configuration. Could this be a problem with the suggested hyperparameters, i.e. do the suggested hparams lead to None values?
Could you also give details on the dataset? Was it performed on the DummyDataset?
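To illustrate the hypothesis above: a pruning callback typically checks the monitored metric each epoch, and if the metric is missing or NaN (e.g. because a suggested hyperparameter combination makes the loss diverge), the trial can be pruned right away at epoch 0. This is a hypothetical helper, not our actual callback:

```python
import math

def should_prune(current_score):
    """Mimic the per-epoch check of a pruning callback.

    A missing (None) or NaN score leaves nothing valid to report to the
    tuner, so the trial is pruned immediately, even at epoch 0.
    """
    if current_score is None or math.isnan(current_score):
        return True  # no valid score to report -> prune
    return False

print(should_prune(float("nan")))  # -> True (diverging config, pruned at epoch 0)
print(should_prune(0.42))          # -> False (valid score, training continues)
```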
The error does occur on the DummyDataset, but the configuration itself does not seem to be the cause. Even if a configuration leads to pruning, it can still be used to fit the model on its own.
I will close this issue, since I did not experience any issues with any of the models on a real dataset. Let's reopen it if we run into these problems again.
When tuning TNTM, some trials are pruned with, for example, the messages "[I 2024-08-06 18:41:49,776] Trial 14 pruned. Trial was pruned at epoch 0." and "Trainer already configured with model summary callbacks: [<class 'lightning.pytorch.callbacks.model_summary.ModelSummary'>]. Skipping setting a default ModelSummary callback." Is this expected?