Lightning-AI / pytorch-lightning

Pretrain, finetune ANY AI model of ANY size on multiple GPUs, TPUs with zero code changes.
https://lightning.ai
Apache License 2.0
28.37k stars 3.38k forks

add "no logging" option #645

Closed CarloLucibello closed 4 years ago

CarloLucibello commented 4 years ago

I may be wrong, but I see no way to entirely disable logging during training, which would sometimes be convenient for quick exploratory experiments.

I suggest having

trainer = Trainer(logger=None)

construct a trainer that does no logging at all.

CarloLucibello commented 4 years ago

Wait, actually this avoids logging:

trainer = Trainer(logger=False, checkpoint_callback=False)

If checkpoint_callback is not set, a checkpoints folder is created in the root directory.

Two suggestions:

If there is no support for these suggestions, the issue can be closed.

awaelchli commented 4 years ago

> Setting logger=False should also trigger checkpoint_callback=False.

Maybe not. The user may want to save checkpoints without any logging turned on. The logger is for experiment tracking (loss curves and other visualizations) and is independent of checkpointing.