openkim / kliff

KIM-based Learning-Integrated Fitting Framework for interatomic potentials.
https://kliff.readthedocs.io
GNU Lesser General Public License v2.1

Is it possible to specify a different lr for different layers, or to fix some layers? #8

Closed: funan-jhc-lee closed this issue 3 years ago

funan-jhc-lee commented 3 years ago

Hello! As stated in the title, is it possible to specify a different learning rate for different layers, or to fix (freeze) some layers, when retraining a model? I've read the documentation, but it seems this is not possible. Thanks a lot for taking the time to answer this question!

mjwen commented 3 years ago

```python
# do the fitting
...

# save the model
...

# load the model back
model.load(model_saved_path)

# fix (freeze) the layers you do not want to retrain;
# `keyword` is some keyword in the name of the layer you want to fix, e.g. "layer1"
for name, p in model.named_parameters():
    if keyword in name:
        p.requires_grad = False
```
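As a small follow-up to the snippet above: once some parameters are frozen, it is common to hand the optimizer only the parameters that still require gradients. A minimal sketch using plain PyTorch (the choice of `Adam` and the learning rate here are illustrative, not necessarily what KLIFF uses internally):

```python
import torch

# Collect only the parameters left trainable after freezing.
trainable = [p for p in model.parameters() if p.requires_grad]

# Optimize just those; frozen parameters keep their loaded values.
optimizer = torch.optim.Adam(trainable, lr=1e-3)
```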



- Different learning rates for different layers would require multiple optimizers; unfortunately, that is not supported out of the box right now.
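For reference, plain PyTorch can also express per-layer learning rates within a single optimizer via parameter groups. A minimal sketch, assuming the model exposes its parameters through `named_parameters()` as above (the keyword `"layer1"` is just an illustrative layer name, and whether this can be wired into KLIFF's training loop depends on how KLIFF constructs its optimizer, so treat it as a sketch rather than supported usage):

```python
import torch

# Split parameters into two groups based on a (hypothetical) layer-name keyword.
slow_params = [p for name, p in model.named_parameters() if "layer1" in name]
fast_params = [p for name, p in model.named_parameters() if "layer1" not in name]

# One optimizer, two parameter groups with different learning rates.
optimizer = torch.optim.Adam(
    [
        {"params": slow_params, "lr": 1e-4},  # e.g. early layers: small lr
        {"params": fast_params, "lr": 1e-3},  # e.g. later layers: larger lr
    ]
)
```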