materialsvirtuallab / megnet

Graph Networks as a Universal Machine Learning Framework for Molecules and Crystals
BSD 3-Clause "New" or "Revised" License

copy layers error? #286

Open tony-dd opened 3 years ago

tony-dd commented 3 years ago

Hello! I found that the 'formation_energy.hdf5' model downloaded from GitHub gives different prediction results from a model whose layers were copied from it. Very strange... any idea how to fix it?

Example code is below:

```python
from megnet.models import MEGNetModel
from megnet.data.graph import GaussianDistance
from megnet.data.crystal import CrystalGraph
import numpy as np
from monty.serialization import loadfn

# load data
data = loadfn('bulk_moduli.json')
structures = data['structures']

gc = CrystalGraph(bond_converter=GaussianDistance(np.linspace(0, 5, 100), 0.5),
                  cutoff=4)
model_replace_layers = MEGNetModel(100, 2, nblocks=3, nvocal=95,
                                   embedding_dim=16, graph_converter=gc)
model_formation_energy = MEGNetModel.from_file('formation_energy.hdf5')

# replace layers
for index, layer in enumerate(model_replace_layers.layers):
    weights_trained_model = model_formation_energy.layers[index].get_weights()
    model_replace_layers.layers[index].set_weights(weights_trained_model)

predicte_1 = model_formation_energy.predict_structure(structures[0])
print(predicte_1)  # [-0.27339065]
predicte_2 = model_replace_layers.predict_structure(structures[0])
print(predicte_2)  # [0.17409758]
```

chc273 commented 3 years ago

@tony-dd check whether all configurations are the same for the two models. For example, in the Set2Set layer the number of steps T may be different, which does not change the number of trainable weights but will give different results.
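
For illustration, a minimal sketch of such a check. It assumes the two models from your snippet are already loaded and that megnet's Set2Set layer reports its step count as `T` in `get_config()`; please verify both against your installed version:

```python
from megnet.layers import Set2Set

# Compare the Set2Set step count between the pretrained and the freshly built model.
for trained, fresh in zip(model_formation_energy.layers, model_replace_layers.layers):
    if isinstance(trained, Set2Set):
        print(trained.name, trained.get_config().get("T"),
              "vs", fresh.name, fresh.get_config().get("T"))
```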

tony-dd commented 3 years ago

Should I check the xxx.hdf5.json file, or somewhere else? Thanks.

chc273 commented 3 years ago

No. Once you have loaded the model, you can get the model config directly. Alternatively, you can loop over all layers in the model and use each layer's get_config method to get the layer config parameters.
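
For example, a minimal sketch of such a loop, using the two models from the snippet above (layer names are auto-generated by Keras, so they are excluded from the comparison):

```python
# Diff the config of each pair of layers between the pretrained and the new model.
for trained, fresh in zip(model_formation_energy.layers, model_replace_layers.layers):
    trained_cfg = trained.get_config()
    fresh_cfg = fresh.get_config()
    trained_cfg.pop("name", None)
    fresh_cfg.pop("name", None)
    if trained_cfg != fresh_cfg:
        print(f"Mismatch in layer {trained.name} ({type(trained).__name__}):")
        print("  pretrained:", trained_cfg)
        print("  new model: ", fresh_cfg)
```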

tony-dd commented 3 years ago

I have followed your suggestion. Indeed, the configurations of the two models are different. Could you please tell me how I can initialize a model that has the same configuration as the one in xxx.hdf5?

chc273 commented 3 years ago

You can check the MEGNetModel class constructor (the __init__(self, ...) method); all parameters should be there. You just have to figure out which parameters need to change. It should be relatively straightforward.
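
For illustration only, a hedged sketch of constructing a model with the architecture-related keywords written out explicitly. The values below are placeholders (not the ones used by formation_energy.hdf5), and keyword names such as `npass` should be checked against the constructor of your installed megnet version:

```python
# Hypothetical values for illustration -- read the real ones out of the
# pretrained model's config and substitute them here.
model_matched = MEGNetModel(
    nfeat_edge=100,       # length of the Gaussian-expanded bond feature
    nfeat_global=2,       # number of state (global) features
    nblocks=3,            # number of MEGNet blocks
    nvocal=95,            # size of the element vocabulary for the atom embedding
    embedding_dim=16,     # atom embedding dimension
    npass=3,              # Set2Set recurrent steps (assumed keyword name)
    graph_converter=gc,
)
```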

tony-dd commented 3 years ago

Thank you very much for your suggestions. Could you please tell me why, when I use the same method to initialize a model for 'G.hdf5', I get a ValueError?

Example code is below:

```python
from megnet.models import MEGNetModel
from megnet.data.graph import GaussianDistance
from ase.db import connect
from megnet.data.crystal import CrystalGraph
import numpy as np
from pymatgen.io.ase import AseAtomsAdaptor

aaa = AseAtomsAdaptor()

# load data
db = connect('qm9.db')
rows = list(db.select('id<5'))
structures = []
for row in rows:
    atoms = row.toatoms()
    struct = aaa.get_molecule(atoms, cls=None)
    structures.append(struct)

gc = CrystalGraph(bond_converter=GaussianDistance(np.linspace(0, 5, 100), 0.5),
                  cutoff=4)

model_replace_layers = MEGNetModel(100, 2, nvocal=95, graph_converter=gc)

model_free_energy = MEGNetModel.from_file('G.hdf5')

# replace layers
for index, layer in enumerate(model_replace_layers.layers):
    weights_trained_model = model_free_energy.layers[index].get_weights()
    model_replace_layers.layers[index].set_weights(weights_trained_model)
```

Err:

File "D:/megnet/test_model.py", line 28, in model_replace_layers.layers[index].set_weights(weights_trained_model) File "D:\Anaconda\envs\dd\lib\site-packages\tensorflow\python\keras\engine\base_layer.py", line 1538, in set_weights raise ValueError( ValueError: Layer weight shape (95, 16) not compatible with provided weight shape (9, 16)

chc273 commented 3 years ago

The QM9 model does not use 95 elements. Also, your code assumes that the layers of the two models are placed in the same order, which is not necessarily true.
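
If you do want to keep copying weights manually, here is a defensive sketch (reusing the variable names from your snippet) that checks layer class and weight shapes before copying instead of trusting the positional order:

```python
# Copy weights only when the paired layers have the same class and weight shapes;
# report any mismatch instead of raising a ValueError.
for idx, (src, dst) in enumerate(zip(model_free_energy.layers,
                                     model_replace_layers.layers)):
    src_shapes = [w.shape for w in src.get_weights()]
    dst_shapes = [w.shape for w in dst.get_weights()]
    if type(src) is not type(dst) or src_shapes != dst_shapes:
        print(f"Skipping layer {idx}: {type(src).__name__} {src_shapes} "
              f"vs {type(dst).__name__} {dst_shapes}")
        continue
    dst.set_weights(src.get_weights())
```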

Please just use https://github.com/materialsvirtuallab/megnet/blob/master/notebooks/qm9_pretrained.ipynb for pretrained qm9 models.