protocaller / ProtoCaller

Full automation of relative protein-ligand binding free energy calculations in GROMACS
http://protocaller.readthedocs.io
GNU General Public License v3.0

How to specify GPU ID? #9

Closed niko97320 closed 4 years ago

niko97320 commented 4 years ago

Dear all,

I run the code on a machine that has 3 GPUs. By default GROMACS runs on all GPUs, which in my case is far from ideal and degrades performance. Is there any way to specify which GPU should be used?

Cheers

msuruzhon commented 4 years ago

Hi there,

I will admit that I have never run GROMACS on GPUs, and right now you can only control the number of MPI nodes / CPU cores using ProtoCaller. I suppose an immediate alternative for you (although I appreciate it is not painless) is to write a GROMACS bash script that runs the files output by ProtoCaller (this is in essence what ProtoCaller does, but in Python). I will look into this when I have more time, but unfortunately I don't have access to many systems with several GPUs, so this might take some time.
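In case it helps, here is a minimal sketch of that workaround, written with Python's subprocess module rather than as a bash script. The run prefixes and the GPU index are placeholders for whatever files ProtoCaller has written out for you; -deffnm, -gpu_id, -ntmpi and -ntomp are standard gmx mdrun options:

```python
# Minimal sketch: run the GROMACS files written out by ProtoCaller on one
# explicitly chosen GPU. The prefixes below are placeholders for the actual
# output file prefixes in your working directory.
import subprocess

run_prefixes = ["lambda_0/md", "lambda_1/md"]  # one prefix per lambda window

for prefix in run_prefixes:
    subprocess.run(
        [
            "gmx", "mdrun",
            "-deffnm", prefix,  # use this prefix for all input/output files
            "-gpu_id", "0",     # restrict mdrun to GPU 0 instead of all three
            "-ntmpi", "1",      # one thread-MPI rank
            "-ntomp", "8",      # eight OpenMP threads per rank
        ],
        check=True,
    )
```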

msuruzhon commented 4 years ago

Update: I have added some experimental support for a gpu_id argument that is passed on to mdrun. You can also now make use of an additional gmx_kwargs dictionary parameter, which specifies any extra options you might want to use that are not yet supported by ProtoCaller. I think this makes it much more general than before.

The usage for gmx_kwargs is as follows:

gmx_kwargs : dict
    Additional arguments to be passed to mdrun. The keys of the dictionary need to be the names of the options, e.g. "cpt" for the checkpoint interval, while the values need to be the value of the option if it takes one, or None if it doesn't. If a value contains "{}" while the user is running the lambda windows in serial, "{}" will be replaced by the lambda number.
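For example, a dictionary along these lines should translate into the corresponding mdrun command-line flags. The particular options shown here are only illustrative, and the dictionary is passed to whichever ProtoCaller run call you are already using:

```python
# Illustrative gmx_kwargs dictionary: each key becomes an mdrun option and
# each value becomes its argument (None for flags that take no value).
gmx_kwargs = {
    "cpt": 15,             # -cpt 15: checkpoint interval in minutes
    "nb": "gpu",           # -nb gpu: run non-bonded interactions on the GPU
    "ntomp": 8,            # -ntomp 8: OpenMP threads per rank
    "v": None,             # -v: verbose output, takes no value
    "g": "lambda_{}.log",  # "{}" is replaced by the lambda number when the
                           # lambda windows are run in serial
}
```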

You can download the new development version of ProtoCaller from essexlab/label/dev and try it out. I haven't been able to test it on GPUs yet, but it seems that gpu_id should be enough? Let me know what you think / if there are any problems. Cheers.