choderalab / modelforge

Infrastructure to implement and train NNPs
https://modelforge.readthedocs.io/en/latest/
MIT License

Models have unused parameters in forward pass #131

Open wiederm opened 4 months ago

wiederm commented 4 months ago

Training with PyTorch Lightning and the distributed data parallel (DDP) strategy requires that every parameter used in the forward pass also receives a gradient in the backward pass. While there is a keyword argument (`find_unused_parameters=True`) that allows unused parameters, enabling it slows down training significantly. This is an issue for ANI2x and SAKE.
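A quick way to locate the offending parameters is to run a single forward/backward pass outside of DDP and list every parameter whose `.grad` is still `None`. This is a minimal sketch with a hypothetical toy model (not modelforge code); in practice you would run it against the ANI2x or SAKE potential:

```python
import torch
import torch.nn as nn

# Hypothetical model illustrating the problem: `self.unused` is registered
# as a submodule but never called in forward(), so its parameters receive
# no gradients -- exactly what DDP complains about.
class ToyModel(nn.Module):
    def __init__(self):
        super().__init__()
        self.used = nn.Linear(4, 4)
        self.unused = nn.Linear(4, 4)  # registered but never used

    def forward(self, x):
        return self.used(x)

model = ToyModel()
loss = model(torch.randn(2, 4)).sum()
loss.backward()

# Parameters that took no part in the backward pass have grad == None.
unused = [name for name, p in model.named_parameters() if p.grad is None]
print(unused)  # → ['unused.weight', 'unused.bias']
```

Once the dead parameters are identified, the clean fix is to remove them (or stop registering them) rather than paying the overhead of `find_unused_parameters=True` on every step.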