The Compositionally-Restricted Attention-Based Network (CrabNet), inspired by natural language processing transformers, uses compositional information to predict material properties.
<img src="https://user-images.githubusercontent.com/45469701/155030619-3a5f75e8-b28d-4801-a54c-58a800ee874c.png" width="150">
:warning: This is a fork of the original CrabNet repository :warning:
This is a refactored version of CrabNet, published to PyPI (`pip`) and Anaconda (`conda`). In addition to reading `.csv` files, it allows Pandas DataFrames to be passed directly as training and validation datasets, similar to automatminer. It also exposes many of the model parameters at the top level via the `CrabNet` class and follows the `sklearn`-like "instantiate, fit, predict" workflow. An `extend_features` option allows data other than the elemental composition (e.g. state variables such as temperature or applied load) to be used as model inputs. These changes make CrabNet more portable, extensible, and broadly applicable, and will be incorporated into the parent repository at a later date.
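As a rough illustration of the "instantiate, fit, predict" workflow, the sketch below assumes a DataFrame with `formula` and `target` columns; the import path, keyword arguments, and return values shown here are assumptions for illustration, not the authoritative API, so consult the documentation for exact usage.

```python
import pandas as pd

# Illustrative only: this import path and the keyword arguments below are
# assumptions; see the CrabNet documentation for the authoritative API.
from crabnet.crabnet_ import CrabNet

# Toy training/validation data: chemical formulas plus a target property.
train_df = pd.DataFrame(
    {"formula": ["Al2O3", "SiO2", "MgO"], "target": [300.0, 220.0, 250.0]}
)
val_df = pd.DataFrame({"formula": ["TiO2", "ZnO"], "target": [280.0, 140.0]})

# Instantiate, fit, predict -- the sklearn-like workflow described above.
cb = CrabNet(epochs=40)      # top-level hyperparameters exposed as keyword arguments
cb.fit(train_df)
y_pred = cb.predict(val_df)  # predictions for the validation formulas
```

Columns beyond `formula` and `target` could, in principle, be supplied as extended features via the `extend_features` option; the exact keyword spelling and expected DataFrame layout are assumptions here, so check the documentation before relying on them.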
Please refer to the CrabNet documentation for details on installation and usage. If you find CrabNet useful, please consider citing the following publication in *npj Computational Materials*:
```bibtex
@article{Wang2021crabnet,
  author = {Wang, Anthony Yu-Tung and Kauwe, Steven K. and Murdock, Ryan J. and Sparks, Taylor D.},
  year = {2021},
  title = {Compositionally restricted attention-based network for materials property predictions},
  pages = {77},
  volume = {7},
  number = {1},
  doi = {10.1038/s41524-021-00545-1},
  publisher = {{Nature Publishing Group}},
  shortjournal = {npj Comput. Mater.},
  journal = {npj Computational Materials}
}
```