Description
The ability to set the default_root_dir for checkpointing during train() of PyTorch Lightning GluonTS models seems to be missing. Being able to set default_root_dir arbitrarily would allow training multiple models within the same root path and would simplify resource sharing. Currently all models place their checkpoints and logs inside lightning_logs, making it unnecessarily cumbersome to manage the trained models programmatically.
References
https://lightning.ai/docs/pytorch/stable/common/checkpointing_basic.html
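To illustrate the bookkeeping the shared lightning_logs layout currently forces, here is a minimal stdlib-only sketch (the function name and directory fixtures are hypothetical, not part of GluonTS or Lightning): because every run lands in lightning_logs/version_N/checkpoints/, finding a given model's latest checkpoint means parsing version directories by number, whereas a settable default_root_dir per model would reduce this to a direct path lookup.

```python
from pathlib import Path
from typing import Optional


def latest_checkpoint(root: str = "lightning_logs") -> Optional[Path]:
    """Return the newest .ckpt file under a shared lightning_logs tree.

    PyTorch Lightning's default layout is lightning_logs/version_N/checkpoints/,
    so distinguishing runs requires sorting version_* directories numerically.
    """
    versions = sorted(
        Path(root).glob("version_*"),
        key=lambda p: int(p.name.split("_")[-1]),
    )
    # Walk versions newest-first and return the last checkpoint found.
    for version_dir in reversed(versions):
        ckpts = sorted(version_dir.glob("checkpoints/*.ckpt"))
        if ckpts:
            return ckpts[-1]
    return None
```

With an arbitrary default_root_dir per model, each trained model could instead own a dedicated directory under one root, and no version parsing would be needed.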