ACEsuit / mace

MACE - Fast and accurate machine learning interatomic potentials with higher order equivariant message passing.

MACE LAMMPS in plugin mode #271

Open saltball opened 10 months ago

saltball commented 10 months ago

Hi there,

In addition to the progress made on adapting the LAMMPS code to operate in plugin mode, I’ve also attempted to modify the plugin to function as a true plugin and have successfully released it via conda-build. This effort aimed to demonstrate the feasibility and benefits of a Conda release alongside the previously discussed PyPI packaging (#258). The modified plugin code, now in plugin mode, is available at https://github.com/XJTU-ICP/mace_lammps_plugin.

To facilitate easy installation, I’ve provided details for creating a Conda package for the plugin. You can find the recipe at https://github.com/XJTU-ICP/mace_lammps_plugin_recipe/. This Conda release provides an alternative distribution method for users who prefer Conda environments over PyPI (LAMMPS part only).

This complements the existing PyPI discussion and could potentially widen the reach of the MACE project to users who predominantly use Conda for package management.

Currently, these Conda packages can be installed using the following command:

micromamba install lammps-precxx11 mace_lammps_plugin -c xjtuicp -c conda-forge
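
After installation, a quick way to confirm that the installed LAMMPS binary is plugin-capable (a rough check; the exact help output can differ between builds):

# the PLUGIN package should appear in the list of installed packages
lmp -h | grep -i plugin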

I’ve conducted testing exclusively with PyTorch 2.1, which is somewhat surprising given that the MACE documentation appears to indicate that PyTorch 2.1 is not supported; as a result, I’m uncertain about the accuracy of the results.

I hope this showcases the flexibility of the integration and its potential to accommodate various user preferences. Please let me know if there’s anything else you’d like to discuss regarding this integration or the Conda release.

ilyes319 commented 10 months ago

Hey,

Thank you very much for your help with that. @wcwitt How do you think this should interact with the current lammps documentation?

Side note, upon further investigation, I think PyTorch 2.1 is supported.

wcwitt commented 10 months ago

Hi - can you explain what this means?

In addition to the progress made on adapting the LAMMPS code to operate in plugin mode

saltball commented 10 months ago

I have laid out the specific principles below (for developers), trying to cover as much detail as possible.

1 About the plugin

The plugin code (https://github.com/XJTU-ICP/mace_lammps_plugin) is modified from https://github.com/ACEsuit/lammps, but it uses the PLUGIN package feature of LAMMPS.

For users, the plugin relies on the LAMMPS PLUGIN package (see https://docs.lammps.org/plugin.html). This means they only need to download and update a library of roughly tens of kilobytes, as opposed to several tens of megabytes of binary files or a comparatively complex source-code compilation process.

For developers, it confines the code that needs maintenance to a few files within the plugin, avoiding the need to handle the entire LAMMPS source tree. Some functionality may not be suitable for this approach (e.g., the more complex GPU and Kokkos packages), but for many novice users this approach is sufficiently quick and convenient.
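
To make the user-facing side concrete, a minimal LAMMPS input using plugin mode would look roughly like the sketch below. The library file name, the pair_style arguments, and the model file are illustrative assumptions (following the ACEsuit/lammps fork), not necessarily the exact interface of my plugin:

# load the MACE pair style from a shared library at runtime (LAMMPS PLUGIN package)
plugin load mace_lammps_plugin.so          # hypothetical library file name

units metal
atom_style atomic
read_data system.data

# afterwards the pair style is used exactly as with a statically linked MACE build
pair_style mace no_domain_decomposition    # assumed arguments, as in the ACEsuit/lammps fork
pair_coeff * * my_mace_model-lammps.pt O H # hypothetical model file and element mapping

run 0

With this pattern the stock plugin-enabled LAMMPS binary stays untouched, and only the small plugin library has to be rebuilt or updated when the pair style changes.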

2 About why I offer a lammps-precxx11 package

In this plugin and conda environment, I compile against the dynamic libraries included in the PyTorch package from the pytorch channel on Anaconda, instead of using the standalone LibTorch distribution, which amounts to several gigabytes.

The LAMMPS package on the conda-forge channel is in fact built with the cxx11 ABI, which is also the default when compiling LAMMPS from source. Even when linking against only a portion of LibTorch, there is still a choice to make between the pre-cxx11 ABI and the cxx11 ABI. By building LAMMPS with the pre-cxx11 ABI, I can conveniently link against the dynamic libraries shipped with PyTorch, which keeps the packages users need to install small. There is still room for optimization, but at least manually downloading LibTorch is no longer necessary (users just need to install PyTorch from the pytorch channel, as the MACE documentation describes).
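
As a concrete check, the ABI of an installed PyTorch can be queried from Python, and LAMMPS plus the plugin then have to be compiled with the matching setting; this is only a sketch of the idea:

# prints True for the cxx11 ABI, False for the pre-cxx11 ABI
python -c "import torch; print(torch.compiled_with_cxx11_abi())"

# when compiling LAMMPS or the plugin against those libraries, select the same ABI,
# e.g. add -D_GLIBCXX_USE_CXX11_ABI=0 to the C++ flags for a pre-cxx11 libtorch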

3 About the recipe repos I offer

The recipe repository I provide (covering LAMMPS and the plugin) consists of the files used by conda-build, the most important ones being meta.yaml and build.sh (see https://docs.conda.io/projects/conda-build/en/latest/resources/define-metadata.html for the conda-build documentation).
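
For anyone unfamiliar with conda-build, the recipe for the plugin has roughly the following shape; the version, pins, and dependency list below are illustrative placeholders rather than the actual contents of my recipe:

# meta.yaml (illustrative sketch only)
package:
  name: mace_lammps_plugin
  version: "0.1.0"                 # placeholder version
source:
  git_url: https://github.com/XJTU-ICP/mace_lammps_plugin
requirements:
  build:
    - cmake
    - {{ compiler('cxx') }}
  host:
    - pytorch                      # supplies the libtorch shared libraries
    - lammps-precxx11              # ABI-matched LAMMPS from the same channel
  run:
    - pytorch
    - lammps-precxx11

build.sh would then essentially run the CMake configure/build/install steps against those host packages.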

If the MACE developers find this information and these repositories useful, I suggest they refer to the conda-build configuration and the Anaconda package-upload documentation (https://docs.anaconda.com/free/anacondaorg/user-guide/packages/conda-packages/) to release the packages I mentioned themselves. Of course, subsequent maintenance would then need to be carried out under the developers' name.
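
The build-and-upload workflow itself is short, roughly the following (assuming the anaconda-client tool and an Anaconda.org account; <channel> and the output path are placeholders):

# build from the recipe directory, then upload the resulting package to a custom channel
conda build ./mace_lammps_plugin_recipe
anaconda upload <conda-bld-path>/linux-64/mace_lammps_plugin-*.tar.bz2 --user <channel>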

Releasing on PyPI and releasing on conda do not conflict, and some minor modifications may be enough to publish the MACE package and the LAMMPS plugin in a custom channel on Anaconda. This could be another solution for conda users, similar to #258.


The two packages I provide can be installed and used directly by users who have already configured their PyTorch environment (currently limited to CUDA 12.1) according to the documentation. For instance, a user who has installed PyTorch following the conda install section of the MACE documentation (https://mace-docs.readthedocs.io/en/latest/guide/installation.html#conda-installation) can simply execute conda install lammps-precxx11 mace_lammps_plugin -c xjtuicp -c conda-forge to obtain a LAMMPS capable of running MACE models as lmp -in in.lammps.

This greatly simplifies the installation steps for LAMMPS users, particularly those outlined in https://mace-docs.readthedocs.io/en/latest/guide/lammps.html. However, the current version may only be suitable for some new users and may require subsequent maintenance by the developers, since I have removed the Kokkos part.
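
Put together, the full flow on a fresh machine would look roughly like this (the PyTorch line follows the CUDA 12.1 conda instructions in the MACE documentation and is only indicative):

# 1. PyTorch as in the MACE conda installation instructions (CUDA 12.1 build)
conda install pytorch pytorch-cuda=12.1 -c pytorch -c nvidia

# 2. plugin-enabled LAMMPS plus the MACE plugin
conda install lammps-precxx11 mace_lammps_plugin -c xjtuicp -c conda-forge

# 3. run a MACE-powered simulation as usual
lmp -in in.lammps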