Lightning-AI / pytorch-lightning

Pretrain, finetune and deploy AI models on multiple GPUs, TPUs with zero code changes.
https://lightning.ai
Apache License 2.0

WandbLogger warning not logging logs. #2015

Closed rohitgr7 closed 4 years ago

rohitgr7 commented 4 years ago

🐛 Bug

WandbLogger gives the warning `WARNING Adding to old History rows isn't currently supported. Step 25 < 38` and stops logging when I use it with k-fold cross-validation: I reuse the same `wandb_logger` instance but call `trainer.fit` multiple times with different `train_dl` and `valid_dl`. Since the step restarts in each fold, nothing is logged after the 1st fold completes, even though the log keys are completely different. This worked perfectly with pytorch-lightning v0.7.4. For now, I have to create a separate experiment for each fold, which is hard to analyze on wandb.
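The warning comes from wandb's requirement that history rows be appended with non-decreasing steps. A toy stand-in for that check (illustrative only, not wandb's actual implementation) shows why a second `trainer.fit` call that restarts `global_step` at 0 gets its rows dropped:

```python
class ToyHistory:
    """Minimal, hypothetical stand-in for wandb's run history."""

    def __init__(self):
        self.rows = []
        self.last_step = -1

    def log(self, metrics, step):
        if step < self.last_step:
            # Mirrors: "Adding to old History rows isn't currently supported. Step X < Y"
            print(f"WARNING Step {step} < {self.last_step}; row dropped")
            return False
        self.last_step = step
        self.rows.append({"step": step, **metrics})
        return True


history = ToyHistory()
history.log({"fold0/val_loss": 0.9}, step=38)             # end of fold 0
accepted = history.log({"fold1/val_loss": 0.8}, step=25)  # fold 1 restarts global_step
print(accepted)  # → False: fold 1's row is skipped even though the key differs
```

Note that the row is rejected purely on the step value; the metric keys are never consulted, which is why using different keys per fold does not help.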

To Reproduce

Code sample

Colab Notebook

Expected behavior

It should log even when the global_step repeats, as long as the log keys are different.
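Until the logger handles repeated steps, one workaround is to offset each fold's step by the steps already consumed by earlier folds, so the step wandb sees stays monotonic. A minimal sketch, assuming every fold runs the same number of steps (`global_wandb_step` is a hypothetical helper, not part of pytorch-lightning or wandb):

```python
def global_wandb_step(fold_idx, steps_per_fold, local_step):
    """Map a per-fold step to a monotonically increasing wandb step.

    Hypothetical helper; assumes each fold runs exactly
    `steps_per_fold` optimizer steps.
    """
    return fold_idx * steps_per_fold + local_step


# Fold 1 restarting at local step 0 now lands just after fold 0's last step.
print(global_wandb_step(0, 40, 39))  # → 39 (last step of fold 0)
print(global_wandb_step(1, 40, 0))   # → 40 (first step of fold 1)
```

With uneven fold lengths, the same idea works by accumulating the actual step count after each `trainer.fit` call instead of multiplying by a fixed `steps_per_fold`.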

Environment

Additional context

rohitgr7 commented 4 years ago

@borisdayma