TinkerTools / tinker-hp

Tinker-HP: High-Performance Massively Parallel Evolution of Tinker on CPUs & GPUs
http://tinker-hp.org/

v1.1: Regression of inducedc_pme2 #3

Closed: e-kwsm closed this issue 4 years ago

e-kwsm commented 4 years ago

(I'm not sure if this issue is fixed in v1.2)

Tinker-HP v1.1 does not run correctly. Running one of the cases in v1.1/example, e.g. `mpirun -np 8 ../bin/dynamic dhfr2 10 2.0 1.0 2 300.0`, raises an error like the following:

```
 Molecular Dynamics Trajectory via r-RESPA MTS Algorithm
Fatal error in PMPI_Iallreduce: Invalid communicator, error stack:
PMPI_Iallreduce(496):  MPI_Iallreduce(sendbuf=MPI_IN_PLACE, recvbuf=0x55d36247a1e8, count=1, datatype=dtype=0x4c000829, op=MPI_SUM, comm=comm=0x0, request=0x7fffd96c9c10)
PMPI_Iallreduce(415): Invalid communicator
Fatal error in PMPI_Iallreduce: Invalid communicator, error stack:
PMPI_Iallreduce(496):  MPI_Iallreduce(sendbuf=MPI_IN_PLACE, recvbuf=0x55c464454868, count=1, datatype=dtype=0x4c000829, op=MPI_SUM, comm=comm=0x0, request=0x7ffecabc93f0)
PMPI_Iallreduce(415): Invalid communicator
```

v1.0 does not have this issue.

The bug seems to be due to https://github.com/TinkerTools/Tinker-HP/blob/2da532e7f781306d168dd144f115e4f7192d684d/v1.1/source/dcinduce_pme2.f#L474-L475

Execution succeeds if `comm_dir` is replaced with `MPI_COMM_WORLD`.
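
For context, here is a minimal, self-contained sketch (not Tinker-HP code; the program and variable names are made up) of the non-blocking reduction pattern that fails above. The error stack shows `comm=comm=0x0`, which is what an unset communicator handle looks like to MPICH; with a valid communicator such as `MPI_COMM_WORLD` the same call completes:

```fortran
program iallreduce_sketch
  use mpi
  implicit none
  integer :: ierr, req
  integer :: n(1)

  call MPI_Init(ierr)
  n(1) = 1
  ! In-place sum of one integer across all ranks, mirroring the failing
  ! call (sendbuf=MPI_IN_PLACE, count=1, op=MPI_SUM).  Passing an
  ! uninitialized handle (comm=0x0 in the stack above) instead of
  ! MPI_COMM_WORLD is what triggers "Invalid communicator".
  call MPI_Iallreduce(MPI_IN_PLACE, n, 1, MPI_INTEGER, MPI_SUM, &
                      MPI_COMM_WORLD, req, ierr)
  call MPI_Wait(req, MPI_STATUS_IGNORE, ierr)
  call MPI_Finalize(ierr)
end program iallreduce_sketch
```

Presumably `comm_dir` is meant to be a sub-communicator set up elsewhere in the code, so the proper fix would be to ensure it is initialized before this call; substituting `MPI_COMM_WORLD` is only a workaround.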


Tinker-HP: 2da532e7f781306d168dd144f115e4f7192d684d
MPI: Intel MPI 2018U3, MPICH 3.3.2 (GCC 8.3.0)

louislagardere commented 4 years ago

Yes, this has been fixed in the 1.2 version.