Closed: Byun-jinyoung closed this issue 3 years ago
This error seems to be related to your GROMACS installation. Did you try to run GROMACS for your system outside of gmx_MMPBSA?
Yes. Does gmx_MMPBSA run GROMACS internally? The path /home/byun/apps/GROMACS/gromacs_cpu-2020.2_gcc-7.5.0_openmpi-4.0.2/bin/gmx_mpi is my GROMACS path.
No, gmx_MMPBSA works with the GROMACS version that you have available in the PATH.
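For context, "available in the PATH" means the first GROMACS executable found by a standard PATH lookup. A minimal sketch of that idea (a hypothetical helper for illustration, not gmx_MMPBSA's actual code; the `which` parameter is injectable only to make the sketch easy to test):

```python
import shutil

def find_gromacs(candidates=("gmx", "gmx_mpi", "gmx_mpi_d"), which=shutil.which):
    """Return the path of the first GROMACS executable found on PATH, or None."""
    for name in candidates:
        path = which(name)  # shutil.which does the PATH lookup
        if path:
            return path
    return None
```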
I mean running GROMACS for your system. Apparently, the error is related to the compilation of gmx_mpi. The error occurs when trying to process the index. Please run gmx_mpi make_ndx -n index.ndx and see if it works properly.
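For reference, a check like this can also be scripted so make_ndx does not block waiting for interactive input (a sketch, assuming gmx_mpi is on PATH; the file names are the ones used in this thread, and the helper itself is hypothetical):

```python
import subprocess

def check_make_ndx(gmx="gmx_mpi", structure="amber2gmx.pdb", out="index.ndx"):
    # Sending "q" on stdin tells make_ndx to save the current groups and
    # quit, so the command runs non-interactively.
    proc = subprocess.run(
        [gmx, "make_ndx", "-f", structure, "-o", out],
        input="q\n", text=True, capture_output=True,
    )
    return proc.returncode == 0
```

If this returns False, the failure is in the GROMACS installation itself rather than in gmx_MMPBSA.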
We have several users who use the MPI version and have had no problems. In any case, we will try to help you solve the problem
Below is the result of gmx_mpi make_ndx with the command gmx_mpi make_ndx -f amber2gmx.pdb -o index.ndx
:-) GROMACS - gmx make_ndx, 2020.2 (-:
GROMACS is written by:
Emile Apol Rossen Apostolov Paul Bauer Herman J.C. Berendsen
Par Bjelkmar Christian Blau Viacheslav Bolnykh Kevin Boyd
Aldert van Buuren Rudi van Drunen Anton Feenstra Alan Gray
Gerrit Groenhof Anca Hamuraru Vincent Hindriksen M. Eric Irrgang
Aleksei Iupinov Christoph Junghans Joe Jordan Dimitrios Karkoulis
Peter Kasson Jiri Kraus Carsten Kutzner Per Larsson
Justin A. Lemkul Viveca Lindahl Magnus Lundborg Erik Marklund
Pascal Merz Pieter Meulenhoff Teemu Murtola Szilard Pall
Sander Pronk Roland Schulz Michael Shirts Alexey Shvetsov
Alfons Sijbers Peter Tieleman Jon Vincent Teemu Virolainen
Christian Wennberg Maarten Wolf Artem Zhmurov
and the project leaders:
Mark Abraham, Berk Hess, Erik Lindahl, and David van der Spoel
Copyright (c) 1991-2000, University of Groningen, The Netherlands.
Copyright (c) 2001-2019, The GROMACS development team at
Uppsala University, Stockholm University and
the Royal Institute of Technology, Sweden.
check out http://www.gromacs.org for more information.
GROMACS is free software; you can redistribute it and/or modify it
under the terms of the GNU Lesser General Public License
as published by the Free Software Foundation; either version 2.1
of the License, or (at your option) any later version.
GROMACS: gmx make_ndx, version 2020.2
Executable: /home/byun/apps/GROMACS/gromacs_cpu-2020.2_gcc-7.5.0_openmpi-4.0.2/bin/gmx_mpi
Data prefix: /home/byun/apps/GROMACS/gromacs_cpu-2020.2_gcc-7.5.0_openmpi-4.0.2
Working dir: /home/byun/PROject/TUBULIN_MD/T-P23C07/CHARMM36m-ff/amber/gmx_MMPBSA
Command line:
gmx_mpi make_ndx -f amber2gmx.pdb -o index.ndx
Reading structure file
GROMACS reminds you: "Do You Have a Mind of Your Own ?" (Garbage)
Going to read 0 old index file(s)
Analysing residue names:
There are: 861 Protein residues
There are: 3 Other residues
There are: 4 Ion residues
Analysing Protein...
Analysing residues not classified as Protein/DNA/RNA/Water and splitting into groups...
0 System : 13444 atoms
1 Protein : 13307 atoms
2 Protein-H : 6758 atoms
3 C-alpha : 861 atoms
4 Backbone : 2583 atoms
5 MainChain : 3442 atoms
6 MainChain+Cb : 4238 atoms
7 MainChain+H : 4261 atoms
8 SideChain : 9046 atoms
9 SideChain-H : 3316 atoms
10 Prot-Masses : 13307 atoms
11 non-Protein : 137 atoms
12 Other : 133 atoms
13 GTP : 48 atoms
14 MG : 2 atoms
15 CAL : 2 atoms
16 GDP : 43 atoms
17 LIG : 42 atoms
18 Ion : 4 atoms
19 GTP : 48 atoms
20 MG : 2 atoms
21 CAL : 2 atoms
22 GDP : 43 atoms
23 LIG : 42 atoms
nr : group '!': not 'name' nr name 'splitch' nr Enter: list groups
'a': atom '&': and 'del' nr 'splitres' nr 'l': list residues
't': atom type '|': or 'keep' nr 'splitat' nr 'h': help
'r': residue 'res' nr 'chain' char
"name": group 'case': case sensitive 'q': save and quit
'ri': residue index
>
I would appreciate it if you could help me solve this problem.
After some tests, I have noticed that gmx_MMPBSA with MPI has problems. I will try to solve them as soon as possible.
Thank you for considering my issue :)
I already identified the problem, and I have a PR that I will merge as soon as I have finished testing it. The error is related to an incompatibility of gmx_mpi (the GROMACS executable) with mpi4py (the Python module that handles MPI). This incompatibility generates a segmentation fault. It could be solved in two ways; the simpler one is to avoid gmx_mpi and gmx_mpi_d when running gmx_MMPBSA in parallel. This is not a big change since gmx works perfectly.
Closing this issue since it was solved in the new release.
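The clash described above is a known hazard when a process launched with mpirun spawns another MPI-linked binary: the child inherits the parent's MPI environment and can crash. One workaround, sketched below (an illustrative assumption, not necessarily gmx_MMPBSA's actual fix), is to strip the MPI-related variables from the child's environment before spawning, or simply to call the serial gmx binary instead:

```python
import os
import subprocess

def run_outside_mpi(cmd):
    # Drop Open MPI / PMIx variables so the child process does not try to
    # join the parent's MPI job, which can cause a segmentation fault.
    env = {k: v for k, v in os.environ.items()
           if not k.startswith(("OMPI_", "PMIX_", "PMI_"))}
    return subprocess.run(cmd, env=env, capture_output=True, text=True)
```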
Describe the issue
I've tried to calculate Interaction Entropy with gmx_MMPBSA, but I got the error below.
To Reproduce
Steps to reproduce the behavior:
mpirun -np 4 gmx_MMPBSA MPI -O -i gmx_MMPBSA_ie.in -cs amber2gmx.pdb -ci index.ndx -cg 1 13 -ct nosolv_Nopbc_prod.xtc -cp amber2gmx.top