CederGroupHub / chgnet

Pretrained universal neural network potential for charge-informed atomistic modeling https://chgnet.lbl.gov
https://doi.org/10.1038/s42256-023-00716-3

[FineTuning] #165

Closed sunjiwon closed 1 month ago

sunjiwon commented 1 month ago

Hello everyone,

I have a question regarding the fine-tuning process of CHGNet.

If I retrain CHGNet using only energy, force, and stress data, will the model still make use of magnetic moments (magmoms) after retraining?

I would appreciate your insights on this matter.

Thank you and have a nice day!

BowenD-UCB commented 1 month ago

The model will still give a MAGMOM prediction, but please be cautious with the predicted MAGMOMs if you fine-tune without MAGMOM labels. I would recommend benchmarking against DFT, since neural networks are known to 'forget' after training on new labels.
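For reference, here is a minimal sketch of what that workflow could look like, loosely following the fine-tuning example in this repo. The dataset lists are placeholders you would fill with your own DFT data; with `targets="efs"` the magmom head gets no loss signal, so the model still returns an `"m"` prediction afterwards, but it is worth comparing against DFT magmoms on a held-out set.

```python
from chgnet.data.dataset import StructureData, get_train_val_test_loader
from chgnet.model import CHGNet
from chgnet.trainer import Trainer

# Placeholder lists: your DFT structures and labels.
# stresses and magmoms are optional in StructureData, so a dataset
# with only energy/force/stress labels is fine.
structures = [...]   # list of pymatgen Structure objects
energies = [...]     # eV/atom
forces = [...]       # (n_atoms, 3) arrays, eV/Å
stresses = [...]     # 3x3 arrays

dataset = StructureData(
    structures=structures,
    energies=energies,
    forces=forces,
    stresses=stresses,   # magmoms omitted on purpose
)
train_loader, val_loader, test_loader = get_train_val_test_loader(
    dataset, batch_size=8, train_ratio=0.9, val_ratio=0.05
)

chgnet = CHGNet.load()  # pretrained weights

# targets="efs": only energy, force, and stress enter the loss,
# so the magmom head is not supervised by the new data.
trainer = Trainer(
    model=chgnet,
    targets="efs",
    optimizer="Adam",
    criterion="MSE",
    learning_rate=1e-3,
    epochs=10,
    use_device="cpu",
)
trainer.train(train_loader, val_loader, test_loader)

# Sanity check after fine-tuning: the model still outputs magmoms ("m");
# compare these against DFT magmoms on structures not used in training.
pred = chgnet.predict_structure(structures[0])
print(pred["m"])
```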

sunjiwon commented 1 month ago

Hello everyone,

Thank you for your previous help.

I have one more question. As mentioned before, neural networks are known to "forget" after training on new labels. Does this mean that fine-tuning CHGNet works like transfer learning, so that the model loses all memory of the MP data?

I appreciate your insights on this matter.

Thank you again!