Yi-FanLi / DP-PIMD

A LAMMPS fix module to perform path integral molecular dynamics (PIMD) tasks.

Running fix dp_pimd in LAMMPS gives MPI errors #1

Closed by appassionate 1 year ago

appassionate commented 1 year ago

Hi! I'm trying to compile the fix dp_pimd LAMMPS plugin mentioned in Overview.md. It seems that the beads branch of your forked LAMMPS has no dp_pimd implementation yet; I found it on the master branch. I successfully compiled the LAMMPS binary from master, but when I try to run it on our cluster, I get the following errors:

```
cd ./DP-PIMD/rho=1_nvt
mpirun -np 4 ~/software/lammps_master/src/lmp_mpi -p 4x1 -in in.lj_nvt -log log -screen screen
```
```
Setting up Path-Integral ...
Fatal error in MPI_Allreduce: Invalid buffer pointer, error stack:
MPI_Allreduce(1555): MPI_Allreduce(sbuf=0x3be8898, rbuf=0x3be8898, count=1, MPI_DOUBLE, MPI_SUM, MPI_COMM_WORLD) failed
MPI_Allreduce(1509): Buffers must not be aliased
Fatal error in MPI_Allreduce: Invalid buffer pointer, error stack:
MPI_Allreduce(1555): MPI_Allreduce(sbuf=0x36589e8, rbuf=0x36589e8, count=1, MPI_DOUBLE, MPI_SUM, MPI_COMM_WORLD) failed
MPI_Allreduce(1509): Buffers must not be aliased
```

We are using:

intel mpi: mpi/intel/2017.5
gcc: 9.3.0
deepmd-kit: 2.1.5
cuda: 11.3

Could this be an error in the MPI communication? That's my guess...
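For what it's worth, the error text itself hints at the cause: both MPI_Allreduce calls pass the same pointer as send and receive buffer (sbuf == rbuf), which the MPI standard forbids and which Intel MPI rejects at runtime. Below is a minimal standalone sketch of the aliased pattern and the usual MPI_IN_PLACE workaround; it is an illustration of the MPI rule only, not the actual DP-PIMD code.

```
// Illustration only: the aliased-buffer pattern that strict MPI
// implementations reject, and the in-place form that avoids it.
#include <mpi.h>

int main(int argc, char **argv) {
  MPI_Init(&argc, &argv);

  double energy = 1.0;

  // Invalid: send and receive buffers alias the same memory.
  // Intel MPI aborts with "Buffers must not be aliased".
  // MPI_Allreduce(&energy, &energy, 1, MPI_DOUBLE, MPI_SUM, MPI_COMM_WORLD);

  // Valid: reduce in place by passing MPI_IN_PLACE as the send buffer.
  MPI_Allreduce(MPI_IN_PLACE, &energy, 1, MPI_DOUBLE, MPI_SUM, MPI_COMM_WORLD);

  MPI_Finalize();
  return 0;
}
```

Whether the aliasing actually happens inside fix dp_pimd or elsewhere would need to be checked in the fix source.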

Many thanks! : )

Yi-FanLi commented 1 year ago

Hi Xiong, thanks for your interest. I have now renamed this fix style to pimd/langevin. You can use it by installing the develop branch of LAMMPS from its official repository. The documentation can be found here: https://docs.lammps.org/latest/fix_pimd.html
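For reference, a minimal sketch of what the updated input line might look like under the renamed style; the keyword/value pairs below are assumptions for illustration and should be checked against the linked fix_pimd documentation rather than copied as-is:

```
# Hypothetical replacement for the old dp_pimd fix line; keywords and
# values are placeholders, see https://docs.lammps.org/latest/fix_pimd.html
fix 1 all pimd/langevin ensemble nvt temp 25.0 thermostat PILE_L 1234 tau 1.0
```

As in the original command above, the run is still launched with one partition per bead, e.g. the -p 4x1 option for four beads.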

If you need further assistance, you can email me.

Best, Yifan

appassionate commented 1 year ago

@Yi-FanLi Many thanks for your hint! I have compiled the latest LAMMPS (the March 28 version) with deepmd-kit 2.2.0, and it runs now.