As I could not figure out how `nnmodel: torch.nn.Module` is defined, I left the NN definitions inside the class: the NN is defined in the latter part of `__init__`, together with `shifted_softplus0` (the activation function), and is called from `forward` and `forward_c`. You may change those parts when you train the NN weights.
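For illustration, here is a minimal sketch of what "the NN defined in `__init__` with a `shifted_softplus0` activation" might look like. This is an assumption, not the actual class: the layer sizes, weight shapes, and the `TinyXCNet` name are hypothetical, and NumPy is used in place of torch to keep the sketch self-contained. The shifted softplus itself, log(1 + exp(x)) - log 2, is a standard choice that vanishes at x = 0.

```python
import numpy as np

def shifted_softplus0(x):
    # Shifted softplus: log(1 + exp(x)) - log(2), chosen so that f(0) = 0.
    # np.logaddexp(0, x) evaluates log(1 + exp(x)) in a numerically stable way.
    return np.logaddexp(0.0, x) - np.log(2.0)

class TinyXCNet:
    """Hypothetical stand-in for the NN built in the latter part of __init__."""

    def __init__(self, n_in=2, n_hidden=8, seed=0):
        rng = np.random.default_rng(seed)
        # In the real class these would be trainable torch parameters.
        self.w1 = rng.normal(scale=0.1, size=(n_in, n_hidden))
        self.b1 = np.zeros(n_hidden)
        self.w2 = rng.normal(scale=0.1, size=(n_hidden, 1))
        self.b2 = np.zeros(1)

    def forward(self, x):
        # Two-layer MLP with the shifted softplus activation,
        # as called from forward / forward_c.
        h = shifted_softplus0(x @ self.w1 + self.b1)
        return h @ self.w2 + self.b2

net = TinyXCNet()
out = net.forward(np.ones((5, 2)))
```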
"load" reads parameters from external numpy savedata, but this is not required for the training.
Thanks, the `nnmodel` is not a required part in your case. I used it because I modeled the xc functional as a hybrid of a NN and an ordinary xc functional. As long as your class defines the abstract methods, that should be fine.
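To make the hybrid idea concrete, here is a sketch of an "ordinary xc + NN correction" combination. The `hybrid_xc` name and the callable correction term are assumptions for illustration; the base term uses the standard LDA exchange energy density per particle, e_x(rho) = -(3/4)(3/pi)^(1/3) rho^(1/3), which is a real formula, while the NN part is just a placeholder for the trained network.

```python
import numpy as np

def lda_exchange(rho):
    # Standard LDA exchange energy density per particle:
    # e_x(rho) = -(3/4) * (3/pi)**(1/3) * rho**(1/3)
    return -0.75 * (3.0 / np.pi) ** (1.0 / 3.0) * rho ** (1.0 / 3.0)

def hybrid_xc(rho, nn_correction):
    # Hybrid xc: ordinary functional plus an NN correction term.
    # nn_correction is a callable standing in for the trained network.
    return lda_exchange(rho) + nn_correction(rho)

# With a zero correction, the hybrid reduces to the base functional.
rho = np.array([0.1, 1.0, 10.0])
base = hybrid_xc(rho, lambda r: np.zeros_like(r))
```

The point of the hybrid form is that the NN only has to learn a correction on top of a physically sensible baseline, rather than the whole functional.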
I added the xc model of the constrained meta-GGA.