Dora is an experiment management framework. It expresses grid searches as pure Python files inside your repo and identifies each experiment with a unique hash signature, so you can scale up to hundreds of experiments without losing your sanity.
When calling `trainer = dora.lightning.get_trainer()` I get:

```
Traceback (most recent call last):
  File "/private/home/louismartin/dev/tap/tap/train.py", line 60, in main
    train(cfg=cfg)
  File "/private/home/louismartin/dev/tap/tap/train.py", line 26, in train
    trainer = dora.lightning.get_trainer()
  File "/private/home/louismartin/dev/dora/dora/lightning.py", line 175, in get_trainer
    callbacks.append(DoraCheckpointSync())
AttributeError: 'NoneType' object has no attribute 'append'
```
This is because the default value for `callbacks` in the `Trainer.__init__` signature (retrieved here: `kwargs = inspect.getcallargs(init, [None] + list(args), **kwargs)`) is `None`. Hence `callbacks = kwargs.pop("callbacks", [])` returns `None` instead of `[]`.
https://github.com/facebookresearch/dora/blob/a23ce113d4e85fe1d7d40d82faea8013d2bc44a8/dora/lightning.py#L173
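The failure mode can be reproduced outside Dora with a toy signature (the `init` stub below is hypothetical, standing in for `Trainer.__init__`): `inspect.getcallargs` fills in defaults for every parameter the caller did not pass, so the key `"callbacks"` is present with value `None`, and the fallback of `dict.pop` never fires.

```python
import inspect

def init(self, callbacks=None):
    """Hypothetical stand-in for Trainer.__init__ with callbacks defaulting to None."""

# getcallargs fills in defaults for parameters that were not supplied,
# so the returned dict maps "callbacks" to None rather than omitting the key.
kwargs = inspect.getcallargs(init, None)
print(kwargs["callbacks"])  # None

# dict.pop's fallback only applies when the key is absent; here the key
# exists (with value None), so [] is never used and None is returned.
callbacks = kwargs.pop("callbacks", [])
print(callbacks)  # None

# One possible fix: coalesce None to an empty list before appending,
# e.g. `callbacks = kwargs.pop("callbacks", None) or []`.
callbacks = callbacks or []
callbacks.append("DoraCheckpointSync")
print(callbacks)  # ['DoraCheckpointSync']
```

This suggests the fix on the Dora side is to normalize the popped value (`or []`) rather than relying on `pop`'s default.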
Version: pytorch-lightning==1.5.10