Becksteinlab / GromacsWrapper

GromacsWrapper wraps system calls to GROMACS tools into thin Python classes (GROMACS 4.6.5 - 2024 supported).
https://gromacswrapper.readthedocs.org
GNU General Public License v3.0

How to run mdrun with mpiexec? #191

Closed: dpadula85 closed this issue 3 years ago

dpadula85 commented 3 years ago

Hello,

I went a bit through past issues and the code, but I can't seem to figure out how to change the configuration file so that mdrun gets called with an mpiexec -n prefix.

Can someone help me?

Thanks, D.

orbeckst commented 3 years ago

You can use gromacs.run.MDrunner. This code is not used that often, and the implementation and documentation are not great, I am afraid. But perhaps it's useful for you:

You'll have to derive your own class from MDrunner. There isn't a lot of documentation, so you have to look at the source code. There are some examples at https://gromacswrapper.readthedocs.io/en/latest/gromacs/blocks/simulation.html#example-implementations (again, look at the source), for example MDrunnerOpenMP https://github.com/Becksteinlab/GromacsWrapper/blob/76ac5093be0031d1b451982bf5e72dc7b5fa15b1/gromacs/run.py#L283-L289

You can provide your full path to mpiexec in https://github.com/Becksteinlab/GromacsWrapper/blob/76ac5093be0031d1b451982bf5e72dc7b5fa15b1/gromacs/run.py#L289

Put the names of your mpi-enabled mdrun binaries in https://github.com/Becksteinlab/GromacsWrapper/blob/76ac5093be0031d1b451982bf5e72dc7b5fa15b1/gromacs/run.py#L288

If you need to run some code before or after mdrun, you can use the prehook() and posthook() methods, as shown in MDrunnerMpich2Smpd https://github.com/Becksteinlab/GromacsWrapper/blob/76ac5093be0031d1b451982bf5e72dc7b5fa15b1/gromacs/run.py#L291 (but you might not need such a complicated setup).
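To illustrate the hook idea: MDrunner calls prehook() before launching mdrun and posthook() afterwards, so they are a natural place to start/stop an MPI daemon or stage files. The sketch below uses a made-up stand-alone class (it does not inherit from MDrunner, so it runs on its own); in real code you would put these methods on your gromacs.run.MDrunner subclass and MDrunner.run() would invoke them for you.

```python
# Minimal sketch of the prehook()/posthook() protocol.  The class name
# and the event list are made up for illustration only; in real code,
# subclass gromacs.run.MDrunner instead.
class MDrunnerWithHooks:
    def __init__(self):
        self.events = []

    def prehook(self, **kwargs):
        # runs before mdrun starts: e.g. launch an MPI daemon, stage files
        self.events.append("pre")

    def posthook(self, **kwargs):
        # runs after mdrun finishes: e.g. shut the daemon down, clean up
        self.events.append("post")
```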

Maybe define a class in your own code like this

import gromacs.run
class MDrunnerMPI(gromacs.run.MDrunner):
    """Manage running :program:`mdrun` as an MPI multiprocessor job.
    """
    mdrun = ("mdrun_mpi", "gmx_mpi mdrun")
    mpiexec = "/opt/local/bin/mpiexec"

and then use it as

mdrun_mpi = MDrunnerMPI(s="md.tpr", deffnm="md")     # add mdrun commands here
rc = mdrun_mpi.run(ncores=16)

The above basic MDrunnerMPI only supports running mpiexec -n ncores gmx mdrun ..., i.e., only the -n ncores arguments for mpiexec are supported. If you need more, you need to write your own mpicommand() method and add it to your own MDrunnerMPI class. If that is the case, just ask here if you need help.
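As an illustration of what such an override might assemble, here is a sketch of the command-building logic, written as a stand-alone function for clarity; in real code it would be an mpicommand() method on your MDrunnerMPI subclass. The --hostfile flag is shown as a hypothetical extra option (Open MPI spelling); adapt it to your MPI implementation.

```python
# Sketch of custom mpiexec command assembly (hypothetical extra flags).
# In real code this logic would live in an mpicommand() override on a
# gromacs.run.MDrunner subclass.
def mpicommand(mpiexec="/opt/local/bin/mpiexec", ncores=8, hostfile=None):
    """Build the mpiexec portion of the command line as a list of strings."""
    cmd = [mpiexec, "-n", str(ncores)]
    if hostfile is not None:
        # Open MPI-style host file option; other MPIs spell this differently.
        cmd.extend(["--hostfile", hostfile])
    return cmd

print(mpicommand(ncores=16, hostfile="hosts.txt"))
# → ['/opt/local/bin/mpiexec', '-n', '16', '--hostfile', 'hosts.txt']
```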

dpadula85 commented 3 years ago

Thanks,

I had figured out which classes to use to override the default one! Your examples are a great help; maybe this could be added to the docs!

Thanks again, D.

orbeckst commented 3 years ago

Glad that this was helpful.

I will re-open your issue as a reminder for documentation.