"Periodic Graph Transformers for Crystal Material Property Prediction" by Keqiang Yan and Yi Liu and Yuchao Lin and Shuiwang Ji (2022)
Abstract:
We consider representation learning on periodic graphs encoding crystal materials. Different from regular graphs, periodic graphs consist of a minimum unit cell repeating itself on a regular lattice in 3D space. How to effectively encode these periodic structures poses unique challenges not present in regular graph representation learning. In addition to being E(3) invariant, periodic graph representations need to be periodic invariant. That is, the learned representations should be invariant to shifts of cell boundaries as they are artificially imposed. Furthermore, the periodic repeating patterns need to be captured explicitly as lattices of different sizes and orientations may correspond to different materials. In this work, we propose a transformer architecture, known as Matformer, for periodic graph representation learning. Our Matformer is designed to be invariant to periodicity and can capture repeating patterns explicitly. In particular, Matformer encodes periodic patterns by efficient use of geometric distances between the same atoms in neighboring cells. Experimental results on multiple common benchmark datasets show that our Matformer outperforms baseline methods consistently. In addition, our results demonstrate the importance of periodic invariance and explicit repeating pattern encoding for crystal representation learning.
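As a concrete illustration of the mechanism the abstract describes (our sketch, not code from the paper or the repository): given the lattice matrix whose rows are the three lattice vectors, the distance from an atom to its own image in a neighboring cell is the norm of an integer combination of those vectors, independent of where the atom sits inside the cell. This is exactly why such self-distances are invariant to shifts of the (artificial) cell boundaries.

```python
import numpy as np

# Toy lattice matrix: rows are the three lattice vectors of the unit cell.
lattice = np.array([[4.0, 0.0, 0.0],
                    [0.0, 4.0, 0.0],
                    [0.0, 0.0, 4.0]])

def self_distances(lattice, max_shift=1):
    """Distances from an atom to its periodic images in neighboring cells.

    The image under the cell shift (i, j, k) is displaced by i*a + j*b + k*c,
    so the distance does not depend on the atom's position inside the cell --
    shifting the cell boundaries leaves these distances unchanged.
    """
    dists = {}
    for i in range(-max_shift, max_shift + 1):
        for j in range(-max_shift, max_shift + 1):
            for k in range(-max_shift, max_shift + 1):
                if (i, j, k) == (0, 0, 0):
                    continue  # skip the atom itself
                shift = np.array([i, j, k]) @ lattice
                dists[(i, j, k)] = np.linalg.norm(shift)
    return dists

print(self_distances(lattice)[(1, 0, 0)])  # 4.0: nearest image along a
```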
Brief description of your algorithm
This version was not submitted by the original authors and uses a modified training script, since the original code cannot train on the official matbench. All adjusted files are attached. The model and its parameters are taken from the original GitHub repository (https://github.com/YKQ98/Matformer), which itself builds on the codebase of ALIGNN.
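For context, the official matbench API prescribes a fixed load/train/record loop over tasks and folds, which is what the modified training script has to fit into. Below is a minimal sketch of that loop; `matformer_train_and_predict` is a hypothetical stand-in for the actual training function wired through the attached train.py / train_on_folder.py (here it just predicts the training mean so the skeleton runs end to end).

```python
from matbench.bench import MatbenchBenchmark


def matformer_train_and_predict(train_inputs, train_outputs, test_inputs):
    """Hypothetical placeholder for the Matformer training call provided by
    the attached train.py / train_on_folder.py; predicts the training-set
    mean so this sketch is runnable."""
    return [train_outputs.mean()] * len(test_inputs)


# Restrict to one task for illustration; a full run iterates all tasks.
mb = MatbenchBenchmark(autoload=False, subset=["matbench_mp_e_form"])

for task in mb.tasks:
    task.load()
    for fold in task.folds:
        # pymatgen Structures and targets for this fold's training split.
        train_inputs, train_outputs = task.get_train_and_val_data(fold)
        test_inputs = task.get_test_data(fold, include_target=False)

        predictions = matformer_train_and_predict(
            train_inputs, train_outputs, test_inputs
        )
        task.record(fold, predictions)

# matbench requires the serialized results to be named results.json.gz.
mb.to_file("results.json.gz")
```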
Included files
We had to modify a few files for training on matbench:
-- benchmarks
---- matbench_v0.1_matformer
------ results.json.gz # required filename
------ train_matbench.py # required filename
------ info.json # required filename
------ train.py # additional dependency for the training function
------ train_on_folder.py # additional dependency for the training function
------ data.py # additional dependency to load data
------ config.py # additional dependency to prepare config
------ config_example.json # example config
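The config files follow the ALIGNN-style flat JSON layout. The authoritative key set is whatever the attached config.py and config_example.json define; the sketch below only shows the generic shape with illustrative field names (all keys and values here are assumptions, not copied from the attached example).

```python
import json

# Illustrative config in an ALIGNN-style flat-JSON layout; field names are
# hypothetical stand-ins for the keys defined in the attached config.py.
example_config = {
    "epochs": 500,           # training length
    "batch_size": 64,
    "learning_rate": 0.001,
    "output_dir": "./out",   # where checkpoints and metrics are written
}

with open("config_example.json", "w") as f:
    json.dump(example_config, f, indent=2)
```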
Benchmark submissions
Submission of Matformer benchmark results.
"Periodic Graph Transformers for Crystal Material Property Prediction" by Keqiang Yan and Yi Liu and Yuchao Lin and Shuiwang Ji (2022)
Abstract: We consider representation learning on periodic graphs encoding crystal materials. Different from regular graphs, periodic graphs consist of a minimum unit cell repeating itself on a regular lattice in 3D space. How to effectively encode these periodic structures poses unique challenges not present in regular graph representation learning. In addition to being E(3) invariant, periodic graph representations need to be periodic invariant. That is, the learned representations should be invariant to shifts of cell boundaries as they are artificially imposed. Furthermore, the periodic repeating patterns need to be captured explicitly as lattices of different sizes and orientations may correspond to different materials. In this work, we propose a transformer architecture, known as Matformer, for periodic graph representation learning. Our Matformer is designed to be invariant to periodicity and can capture repeating patterns explicitly. In particular, Matformer encodes periodic patterns by efficient use of geometric distances between the same atoms in neighboring cells. Experimental results on multiple common benchmark datasets show that our Matformer outperforms baseline methods consistently. In addition, our results demonstrate the importance of periodic invariance and explicit repeating pattern encoding for crystal representation learning.
Brief description of your algorithm
This version has not been submitted by the original authors and has a modified training script, since the original version is not capable to train on the official matbench. All adjusted files are attached. The model and the parameters are from the original github repository (https://github.com/YKQ98/Matformer), which itself builds on the code basis of ALIGNN.
Included files
We had to modify a few files for training on matbench: