Closed: hsulab closed this issue 3 years ago
Hi Jiayan,

Thanks for catching this. This portion was recently modified, and I must've missed testing the debug case. I have updated the code and the respective tests to account for this - #97. Hope this resolves it for you!

FWIW - since debug mode prevents writing checkpoints, etc. to disk, the trained model will load the parameters of the last epoch rather than the best epoch (which would have been saved to disk if debug=False).
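To illustrate, here is a minimal, generic sketch of that checkpointing behavior (an assumption-level illustration, not amptorch's actual implementation):

```python
# A minimal, generic sketch of the checkpointing behavior described above --
# an assumption, not amptorch's actual code. With debug=True no checkpoint
# reaches disk, so prediction ends up using the in-memory (last-epoch)
# parameters; with debug=False the best-epoch weights are saved and reloaded.
import copy
import torch
import torch.nn as nn

def train_with_checkpointing(model, epoch_losses, debug=False, path="best.pt"):
    best_loss = float("inf")
    for loss in epoch_losses:  # stand-in for a real training loop
        if loss < best_loss:
            best_loss = loss
            if not debug:
                # Best-epoch weights are written to disk only when debug is off.
                torch.save(copy.deepcopy(model.state_dict()), path)
    if not debug:
        # Restore the best-epoch parameters before prediction.
        model.load_state_dict(torch.load(path))
    # In debug mode nothing was written, so the model keeps its last-epoch state.
    return model

model = nn.Linear(4, 1)
train_with_checkpointing(model, epoch_losses=[0.9, 0.4, 0.6], debug=True)
```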
Thanks again! This helps me a lot. I will download the new code and the issue can be closed.
Dear Developers and Users,
When I set debug to true in the config, training worked well, but an error occurred during prediction.
After checking the source code, I found that the following lines gave rise to this issue.
https://github.com/ulissigroup/amptorch/blob/bd8af57cdfbe323f08b9480585904c52dc3821f8/amptorch/trainer.py#L128
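In rough paraphrase, the relevant logic looks something like the sketch below (my own reconstruction of the pattern, not the exact code in trainer.py):

```python
# A rough, self-contained paraphrase of the pattern in question -- my own
# reconstruction, not the exact trainer.py code. The descriptor is only
# written back into the config when debug is off, so in this sketch
# predict() cannot find it when debug=True.
class TrainerSketch:
    def __init__(self, config, debug=False):
        self.config = config
        self.debug = debug
        if not self.debug:
            # run directories, checkpoints, etc. are prepared here ...
            self.config["dataset"]["descriptor"] = "descriptor object"  # only set here

    def predict(self):
        # Raises KeyError in debug mode because the descriptor was never stored.
        return self.config["dataset"]["descriptor"]

trainer = TrainerSketch({"dataset": {}}, debug=True)
# trainer.predict()  # -> KeyError: 'descriptor' when debug=True
```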
I don't understand why self.config["dataset"]["descriptor"] is only set when not self.debug. Shouldn't this be set regardless of whether debug is on?

Many thanks,
Jiayan