Valdes-Tresanco-MS / gmx_MMPBSA

gmx_MMPBSA is a new tool based on AMBER's MMPBSA.py aiming to perform end-state free energy calculations with GROMACS files.
https://valdes-tresanco-ms.github.io/gmx_MMPBSA/
GNU General Public License v3.0

gmx_mpi make_ndx failed when querying index file #26

Closed. Byun-jinyoung closed this issue 3 years ago.

Byun-jinyoung commented 3 years ago

Describe the issue
I have tried to calculate the interaction entropy with gmx_MMPBSA, but I got the error below.

If you are reporting an error, please complete this form:

To Reproduce
Steps to reproduce the behavior:

  1. Which system type are you running? Protein-ligand system
  2. The command line you are using: mpirun -np 4 gmx_MMPBSA MPI -O -i gmx_MMPBSA_ie.in -cs amber2gmx.pdb -ci index.ndx -cg 1 13 -ct nosolv_Nopbc_prod.xtc -cp amber2gmx.top

Additional context

[INFO   ] Started
[INFO   ] Loading and checking parameter files for compatibility...

[INFO   ] Building AMBER Topologies from GROMACS files...
[INFO   ] Checking if supported force fields exists in Amber data...
[INFO   ] Get PDB files from GROMACS structures files...
[INFO   ] Making gmx_MMPBSA index for complex...
*** An error occurred in MPI_Init_thread
*** on a NULL communicator
*** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
*** An error occurred in MPI_Init_thread
***    and potentially your MPI job)
*** on a NULL communicator
*** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
***    and potentially your MPI job)
[cpu5:239906] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
[cpu5:239905] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
*** An error occurred in MPI_Init_thread
*** on a NULL communicator
*** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
***    and potentially your MPI job)
[cpu5:239907] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
[ERROR  ] /home/byun/apps/GROMACS/gromacs_cpu-2020.2_gcc-7.5.0_openmpi-4.0.2/bin/gmx_mpi make_ndx failed when querying index.ndx
*** An error occurred in MPI_Init_thread
*** on a NULL communicator
*** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
***    and potentially your MPI job)
[cpu5:239908] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
Valdes-Tresanco-MS commented 3 years ago

This error seems to be related to your GROMACS installation. Did you try to run GROMACS for your system outside of gmx_MMPBSA?

Byun-jinyoung commented 3 years ago

Yes. Does gmx_MMPBSA run GROMACS internally? The path /home/byun/apps/GROMACS/gromacs_cpu-2020.2_gcc-7.5.0_openmpi-4.0.2/bin/gmx_mpi is my GROMACS path.

Valdes-Tresanco-MS commented 3 years ago

Yes. Does gmx_MMPBSA run GROMACS internally?

No, gmx_MMPBSA works with the GROMACS version that you have available in the PATH.
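
As an aside, a quick way to check which GROMACS executables are visible on your PATH is a short Python snippet (an illustration using only the standard library, not gmx_MMPBSA's own lookup code):

```python
# Illustration only: report which GROMACS executables are visible on the current PATH.
import shutil

for exe in ("gmx", "gmx_mpi", "gmx_d", "gmx_mpi_d"):
    path = shutil.which(exe)
    print(f"{exe:10s} -> {path or 'not found'}")
```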

The path /home/byun/apps/GROMACS/gromacs_cpu-2020.2_gcc-7.5.0_openmpi-4.0.2/bin/gmx_mpi is my GROMACS path.

I mean running GROMACS for your system. Apparently, the error is related to the compilation of gmx_mpi. The error occurs when trying to process the index. Please run gmx_mpi make_ndx -n index.ndx and see if it works properly.

We have several users who use the MPI version and have had no problems. In any case, we will try to help you solve the problem.

Byun-jinyoung commented 3 years ago

Below is the result of running gmx_mpi make_ndx with the command gmx_mpi make_ndx -f amber2gmx.pdb -o index.ndx:

                     :-) GROMACS - gmx make_ndx, 2020.2 (-:

                            GROMACS is written by:
     Emile Apol      Rossen Apostolov      Paul Bauer     Herman J.C. Berendsen
    Par Bjelkmar      Christian Blau   Viacheslav Bolnykh     Kevin Boyd    
 Aldert van Buuren   Rudi van Drunen     Anton Feenstra       Alan Gray     
  Gerrit Groenhof     Anca Hamuraru    Vincent Hindriksen  M. Eric Irrgang  
  Aleksei Iupinov   Christoph Junghans     Joe Jordan     Dimitrios Karkoulis
    Peter Kasson        Jiri Kraus      Carsten Kutzner      Per Larsson    
  Justin A. Lemkul    Viveca Lindahl    Magnus Lundborg     Erik Marklund   
    Pascal Merz     Pieter Meulenhoff    Teemu Murtola       Szilard Pall   
    Sander Pronk      Roland Schulz      Michael Shirts    Alexey Shvetsov  
   Alfons Sijbers     Peter Tieleman      Jon Vincent      Teemu Virolainen 
 Christian Wennberg    Maarten Wolf      Artem Zhmurov   
                           and the project leaders:
        Mark Abraham, Berk Hess, Erik Lindahl, and David van der Spoel

Copyright (c) 1991-2000, University of Groningen, The Netherlands.
Copyright (c) 2001-2019, The GROMACS development team at
Uppsala University, Stockholm University and
the Royal Institute of Technology, Sweden.
check out http://www.gromacs.org for more information.

GROMACS is free software; you can redistribute it and/or modify it
under the terms of the GNU Lesser General Public License
as published by the Free Software Foundation; either version 2.1
of the License, or (at your option) any later version.

GROMACS:      gmx make_ndx, version 2020.2
Executable:   /home/byun/apps/GROMACS/gromacs_cpu-2020.2_gcc-7.5.0_openmpi-4.0.2/bin/gmx_mpi
Data prefix:  /home/byun/apps/GROMACS/gromacs_cpu-2020.2_gcc-7.5.0_openmpi-4.0.2
Working dir:  /home/byun/PROject/TUBULIN_MD/T-P23C07/CHARMM36m-ff/amber/gmx_MMPBSA
Command line:
  gmx_mpi make_ndx -f amber2gmx.pdb -o index.ndx

Reading structure file

GROMACS reminds you: "Do You Have a Mind of Your Own ?" (Garbage)

Going to read 0 old index file(s)
Analysing residue names:
There are:   861    Protein residues
There are:     3      Other residues
There are:     4        Ion residues
Analysing Protein...
Analysing residues not classified as Protein/DNA/RNA/Water and splitting into groups...
Analysing residues not classified as Protein/DNA/RNA/Water and splitting into groups...

  0 System              : 13444 atoms
  1 Protein             : 13307 atoms
  2 Protein-H           :  6758 atoms
  3 C-alpha             :   861 atoms
  4 Backbone            :  2583 atoms
  5 MainChain           :  3442 atoms
  6 MainChain+Cb        :  4238 atoms
  7 MainChain+H         :  4261 atoms
  8 SideChain           :  9046 atoms
  9 SideChain-H         :  3316 atoms
 10 Prot-Masses         : 13307 atoms
 11 non-Protein         :   137 atoms
 12 Other               :   133 atoms
 13 GTP                 :    48 atoms
 14 MG                  :     2 atoms
 15 CAL                 :     2 atoms
 16 GDP                 :    43 atoms
 17 LIG                 :    42 atoms
 18 Ion                 :     4 atoms
 19 GTP                 :    48 atoms
 20 MG                  :     2 atoms
 21 CAL                 :     2 atoms
 22 GDP                 :    43 atoms
 23 LIG                 :    42 atoms

 nr : group      '!': not  'name' nr name   'splitch' nr    Enter: list groups
 'a': atom       '&': and  'del' nr         'splitres' nr   'l': list residues
 't': atom type  '|': or   'keep' nr        'splitat' nr    'h': help
 'r': residue              'res' nr         'chain' char
 "name": group             'case': case sensitive           'q': save and quit
 'ri': residue index

> 

If possible, could you help me solve the problem?

Valdes-Tresanco-MS commented 3 years ago

After some tests, I have noticed that gmx_MMPBSA has problems with MPI. I will try to solve them as soon as possible.

Byun-jinyoung commented 3 years ago

Thank you for considering my issue :)

Valdes-Tresanco-MS commented 3 years ago

I have already identified the problem and I have a PR that I will merge as soon as I have finished testing it. The error is related to an incompatibility between gmx_mpi (the GROMACS executable) and mpi4py (the Python module that handles MPI). This incompatibility generates a segmentation fault. This could be solved in two ways:

  1. Recompile GROMACS and mpi4py against the same version of OpenMPI, which is cumbersome, does not guarantee compatibility, and would not be done by most users.
  2. Remove support for gmx_mpi and gmx_mpi_d when running gmx_MMPBSA in parallel. This is not a big change since gmx works perfectly (see the sketch below this list).
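
For illustration only, here is a minimal Python sketch of the idea behind option 2, assuming a hypothetical find_serial_gmx helper (this is not gmx_MMPBSA's actual code): when the script itself already runs under mpi4py, GROMACS is called through the serial gmx binary as a subprocess, so the MPI-linked gmx_mpi never tries to initialize MPI a second time.

```python
# Hypothetical sketch of option 2; not gmx_MMPBSA's actual implementation.
import shutil
import subprocess

def find_serial_gmx():
    """Return the first serial GROMACS executable found on PATH, skipping MPI builds."""
    for exe in ("gmx", "gmx_d"):  # gmx_mpi / gmx_mpi_d are skipped on purpose
        path = shutil.which(exe)
        if path:
            return path
    raise RuntimeError("No serial GROMACS executable (gmx or gmx_d) found on PATH")

def make_ndx(structure, index_out):
    """Build an index file with make_ndx, sending 'q' to close the interactive prompt."""
    gmx = find_serial_gmx()
    subprocess.run(
        [gmx, "make_ndx", "-f", structure, "-o", index_out],
        input=b"q\n",
        check=True,
    )
```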
Valdes-Tresanco-MS commented 3 years ago

Closing this issue since it was solved in the new release.