TJFord / palabos-lammps


The MPI_Finalize() function was called after MPI_FINALIZE was invoked; Attempting to use an MPI routine after finalizing MPI #1

Open nbuzwh opened 5 years ago

nbuzwh commented 5 years ago

Below I give my installation process and the commands I ran. Please help me check whether anything is wrong.

My Linux system is openSUSE Leap 42.3. The MPI that I installed is openmpi-3.1.3rc2.

  1. Installed OpenMPI first; the mpi_hello_world test code runs well.
  2. Downloaded the palabos-v2.0r0, lammps-5Sep18, and palabos-lammps-master source codes and unzipped them.
  3. Added the files from lammps-ext (fix_fcm.cpp and fix_fcm.h) to lammps/src. We also checked that one example of the Palabos software ran well. We ran: cd lammps/src; make yes-mc
  4. Compiled LAMMPS as a library using the command "make mode=lib mpi" in lammps/src. A file named liblammps_mpi.a was then generated.
  5. Modified the Makefile of the example in palabos-lammps-master/example, making sure the coupling src directory is in includePaths and that libraryPaths points to lammps/src. The specific changes are:

a. Leading directory of the Palabos source code: palabosRoot = /.../LAMMPS_and_Palabos/palabos-v2.0r0

b. Path to external libraries (other than Palabos): libraryPaths = /.../LAMMPS_and_Palabos/lammps-5Sep18/src

c. Path to include directories (other than Palabos): includePaths = /.../LAMMPS_and_Palabos/lammps-5Sep18/src /.../LAMMPS_and_Palabos/palabos-lammps-master/src

d. Dynamic and static libraries (other than Palabos): libraries = liblammps_mpi.a
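
For reference, the four settings above would look roughly like this inside palabos-lammps-master/example/Makefile. The /path/to/... values are placeholders for the local install locations elided above, and the exact layout may differ slightly in your copy of the Makefile:

```make
# Placeholder paths -- substitute your own install locations
palabosRoot  = /path/to/LAMMPS_and_Palabos/palabos-v2.0r0
libraryPaths = /path/to/LAMMPS_and_Palabos/lammps-5Sep18/src
includePaths = /path/to/LAMMPS_and_Palabos/lammps-5Sep18/src /path/to/LAMMPS_and_Palabos/palabos-lammps-master/src
libraries    = liblammps_mpi.a
```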

  6. Ran the "make" command to compile in the directory palabos-lammps-master/example; a file named embolism was generated. Then I ran "mpirun -np 2 ./embolism in.embolism" and got an error like: The MPI_Finalize() function was called after MPI_FINALIZE was invoked. This is disallowed by the MPI standard. *** Your MPI job will now abort.
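
Put together, the final compile-and-run step is roughly the following shell sequence (directory names as in the layout described above):

```sh
cd palabos-lammps-master/example
make                                  # builds the coupled executable "embolism"
mpirun -np 2 ./embolism in.embolism   # run the example on 2 MPI ranks
```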

When I installed the coupling software on another server using the above steps, a similar error occurred: "Attempting to use an MPI routine after finalizing MPI" (repeated).

The Linux system is CentOS Linux release 7.2.1511 (Core); the MPI version is Intel(R) MPI Library for Linux* OS, Version 2017 Update 2 Build 20170125.

Can you give me a step-by-step solution? Thank you.

TJFord commented 5 years ago

> 3. Added the files from lammps-ext (fix_fcm.cpp and fix_fcm.h) to lammps/src. We also checked that one example of the Palabos software ran well. We ran: cd lammps/src; make yes-mc

You also need to install the molecule package: make yes-molecule @nbuzwh
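
A minimal sketch of the package setup and library build in lammps/src, assuming the legacy make build used in the steps above (the yes-mc and mode=lib steps are already listed there; yes-molecule is the addition suggested here):

```sh
cd lammps/src
make yes-mc          # enable the MC package
make yes-molecule    # enable the MOLECULE package
make mode=lib mpi    # rebuild the static library liblammps_mpi.a
```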


As this is a complex code, you need to install the appropriate packages before using it. Besides the MC package, you also need to install the MOLECULE package; see the comment above. Let me know if you have more questions. @nbuzwh
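
If it helps to double-check which packages are enabled before rebuilding, the legacy LAMMPS make build (as in lammps-5Sep18) provides a package-status target; a quick sketch, assuming that build system is the one in use:

```sh
cd lammps/src
make package-status   # lists each package with its installed status; MC and MOLECULE should be installed
make yes-molecule     # enable MOLECULE if it is missing
make mode=lib mpi     # rebuild liblammps_mpi.a after changing packages, then relink the example
```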

czbebe commented 5 years ago

I met the same errors on another server (CentOS 7 / OpenMPI): "Attempting to use an MPI routine after finalizing MPI" (repeated). The program stops at: fix 3 all bond.create 1000 1 2 2.0 1 iparam 2 2 jparam 4 3 prob 0.5 23854

TJFord commented 5 years ago

> I met the same errors on another server (CentOS 7 / OpenMPI): "Attempting to use an MPI routine after finalizing MPI" (repeated). The program stops at: fix 3 all bond.create 1000 1 2 2.0 1 iparam 2 2 jparam 4 3 prob 0.5 23854

Did it help in fixing your error?

czbebe commented 5 years ago

The line fix 3 all bond.create 1000 1 2 2.0 1 iparam 2 2 jparam 4 3 prob 0.5 23854 seems to be recognized by LAMMPS, as shown by the standalone run below. The problem is MPI-related.

    [oka@localhost example]$ mpirun -np 1 /home/oka/Desktop/Scott/lammps-17Nov16/src/lmp_mpi < in.embolism
    LAMMPS (17 Nov 2016)
    Lattice spacing in x,y,z = 1 1 1
    Reading data file ...
    orthogonal box = (0 0 0) to (20 20 80)
    1 by 1 by 1 MPI processor grid
    reading atoms ...
    684 atoms
    Finding 1-2 1-3 1-4 neighbors ...
    Special bond factors lj:   0 0 0
    Special bond factors coul: 0 0 0
    0 = max # of 1-2 neighbors
    0 = max # of 1-3 neighbors
    0 = max # of 1-4 neighbors
    200 = max # of special neighbors
    652 atoms in group cells
    32 atoms in group wall_plt
    WARNING: One or more atoms are time integrated more than once (../modify.cpp:271)
    Neighbor list info ...
    4 neighbor list requests
    update every 1 steps, delay 0 steps, check no
    max neighbors/atom: 2000, page size: 100000
    master list distance cutoff = 3
    ghost atom cutoff = 12
    binsize = 1.5, bins = 14 14 54
    Setting up Verlet run ...
    Unit style   : lj
    Current step : 0
    Time step    : 1
    Memory usage per processor = 23.672 Mbytes
    Step Temp E_pair E_mol TotEng Press
    0 0 0 0 0 0
    1 4.3224392e-09 -0.00090497076 0 -0.00090496429 7.8103559e-10
    Loop time of 0.000505878 on 1 procs for 1 steps with 684 atoms

    Performance: 170792167.047 tau/day, 1976.761 timesteps/s
    62.5% CPU use with 1 MPI tasks x no OpenMP threads

    MPI task timing breakdown:
    Section |  min time  |  avg time  |  max time  |%varavg| %total
    ---------------------------------------------------------------
    Pair    | 7.7427e-05 | 7.7427e-05 | 7.7427e-05 |   0.0 | 15.31
    Bond    | 3.65e-07   | 3.65e-07   | 3.65e-07   |   0.0 |  0.07
    Neigh   | 0.00032482 | 0.00032482 | 0.00032482 |   0.0 | 64.21
    Comm    | 1.5579e-05 | 1.5579e-05 | 1.5579e-05 |   0.0 |  3.08
    Output  | 1.6603e-05 | 1.6603e-05 | 1.6603e-05 |   0.0 |  3.28
    Modify  | 5.93e-05   | 5.93e-05   | 5.93e-05   |   0.0 | 11.72
    Other   |            | 1.179e-05  |            |       |  2.33

    Nlocal:    684 ave 684 max 684 min
    Histogram: 1 0 0 0 0 0 0 0 0 0
    Nghost:    189 ave 189 max 189 min
    Histogram: 1 0 0 0 0 0 0 0 0 0
    Neighs:    3473 ave 3473 max 3473 min
    Histogram: 1 0 0 0 0 0 0 0 0 0

    Total # of neighbors = 3473
    Ave neighs/atom = 5.07749
    Ave special neighs/atom = 0
    Neighbor list builds = 1
    Dangerous builds not checked
    Total wall time: 0:00:00

TJFord commented 5 years ago

looks like it is working now @czbebe

ghost commented 4 years ago

Hello @TJFord @czbebe @nbuzwh

I am actually getting the exact same error as nbuzwh: The MPI_Finalize() function was called after MPI_FINALIZE was invoked. This is disallowed by the MPI standard. *** Your MPI job will now abort

I am wondering if there was ever a fix for this problem? I am using CentOS / OpenMPI 3. I believe I have installed everything correctly and have the correct directory paths. @czbebe, did you say you got this error fixed with fix 3 all bond.create 1000 1 2 2.0 1 iparam 2 2 jparam 4 3 prob 0.5 23854? Is this a Linux command?

Thanks.

TJFord commented 4 years ago

Have you installed the molecule package from LAMMPS?


ghost commented 4 years ago

Hello @TJFord ,

I have made sure I installed the molecule packages in LAMMPS by typing the make yes-molecule and make yes-mc commands in the lammps/src directory. I am actually now getting the following error:

    Nx,Ny,Nz 21 21 81 dx 1 dt 0.00075 tau 0.95
    Saving VTK file... could not open file ./tmp/vtk000010.vti
    Saving VTK file... could not open file ./tmp/vtk000020.vti
    Saving VTK file... could not open file ./tmp/vtk000030.vti
    Saving VTK file... could not open file ./tmp/vtk000040.vti
    Saving VTK file... could not open file ./tmp/vtk000050.vti
    Saving VTK file... could not open file ./tmp/vtk000060.vti
    Saving VTK file... could not open file ./tmp/vtk000070.vti
    Saving VTK file... could not open file ./tmp/vtk000080.vti
    Saving VTK file... could not open file ./tmp/vtk000090.vti
    total execution time 1.25534

Are you familiar with this error and what is happening here?

Thanks,

Karl Gardner

jrchang612 commented 1 year ago

Hello @TJFord, I encountered exactly the same error as ghost.

mpirun ./embolism in.embolism

    Nx,Ny,Nz 21 21 81 dx 1 dt 0.00075 tau 0.95
    Saving VTK file... could not open file ./tmp/vtk000010.vti
    Saving VTK file... could not open file ./tmp/vtk000020.vti
    Saving VTK file... could not open file ./tmp/vtk000030.vti
    Saving VTK file... could not open file ./tmp/vtk000040.vti
    Saving VTK file... could not open file ./tmp/vtk000050.vti
    Saving VTK file... could not open file ./tmp/vtk000060.vti
    Saving VTK file... could not open file ./tmp/vtk000070.vti
    Saving VTK file... could not open file ./tmp/vtk000080.vti
    Saving VTK file... could not open file ./tmp/vtk000090.vti
    total execution time 1.18833

Any thoughts? Many thanks!

Ray Chang.

jrchang612 commented 1 year ago

It turns out I just needed to create a tmp/ folder in the run directory, and then the files were saved.
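
For anyone else hitting the "could not open file ./tmp/..." messages above, a minimal fix along those lines, run from the example directory that contains the embolism executable:

```sh
mkdir -p tmp                    # output directory the example writes its VTK files into
mpirun ./embolism in.embolism   # rerun; vtk0000*.vti files should now appear under ./tmp/
```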