firemodels / fds

Fire Dynamics Simulator
https://pages.nist.gov/fds-smv/

Error while building FDS 6.2 with OpenMPI 1.10.0 and Intel Compiler (Parallel Studio XE 2016) #2670

Closed: alexgby closed this issue 9 years ago

alexgby commented 9 years ago

Hi,

I've followed the instructions from the wiki page to build FDS 6.2 on my Linux cluster (Ubuntu 13.04 and CentOS 6.6) with the latest Intel Compiler 2016 and OpenMPI 1.10.0.

I've installed OpenMPI in /shared/openmpi_64 and ran ./make_fds.sh under FDS_Compilation/mpi_intel_linux_64/, but I got these error messages:

Building FDS with the MPI distribution: /shared/openmpi_64
Building mpi_intel_linux_64
mpifort -c -m64 -O2 -ipo -traceback -fpp -DGITHASH_PP=\"Git-571-g01e82ba-dirty\" -DGITDATE_PP=\""Mon Aug 31 17:13:38 2015 -0400\"" -DBUILDDATE_PP=\""Sep 01, 2015  09:25:26\""  ../../FDS_Source/prec.f90
mpifort -c -m64 -O2 -ipo -traceback -fpp -DGITHASH_PP=\"Git-571-g01e82ba-dirty\" -DGITDATE_PP=\""Mon Aug 31 17:13:38 2015 -0400\"" -DBUILDDATE_PP=\""Sep 01, 2015  09:25:26\""  ../../FDS_Source/cons.f90
mpifort -c -m64 -O2 -ipo -traceback -fpp -DGITHASH_PP=\"Git-571-g01e82ba-dirty\" -DGITDATE_PP=\""Mon Aug 31 17:13:38 2015 -0400\"" -DBUILDDATE_PP=\""Sep 01, 2015  09:25:26\""  ../../FDS_Source/devc.f90
mpifort -c -m64 -O2 -ipo -traceback -fpp -DGITHASH_PP=\"Git-571-g01e82ba-dirty\" -DGITDATE_PP=\""Mon Aug 31 17:13:38 2015 -0400\"" -DBUILDDATE_PP=\""Sep 01, 2015  09:25:26\""  -openmp -openmp-link static -liomp5 ../../FDS_Source/type.f90
mpifort -c -m64 -O2 -ipo -traceback -fpp -DGITHASH_PP=\"Git-571-g01e82ba-dirty\" -DGITDATE_PP=\""Mon Aug 31 17:13:38 2015 -0400\"" -DBUILDDATE_PP=\""Sep 01, 2015  09:25:26\""  -openmp -openmp-link static -liomp5 ../../FDS_Source/mesh.f90
mpifort -c -m64 -O2 -ipo -traceback -fpp -DGITHASH_PP=\"Git-571-g01e82ba-dirty\" -DGITDATE_PP=\""Mon Aug 31 17:13:38 2015 -0400\"" -DBUILDDATE_PP=\""Sep 01, 2015  09:25:26\""  -openmp -openmp-link static -liomp5 ../../FDS_Source/func.f90
mpifort -c -m64 -O2 -ipo -traceback -fpp -DGITHASH_PP=\"Git-571-g01e82ba-dirty\" -DGITDATE_PP=\""Mon Aug 31 17:13:38 2015 -0400\"" -DBUILDDATE_PP=\""Sep 01, 2015  09:25:26\""  ../../FDS_Source/data.f90
mpifort -c -m64 -O2 -ipo -traceback -fpp -DGITHASH_PP=\"Git-571-g01e82ba-dirty\" -DGITDATE_PP=\""Mon Aug 31 17:13:38 2015 -0400\"" -DBUILDDATE_PP=\""Sep 01, 2015  09:25:26\""  ../../FDS_Source/smvv.f90
mpifort -c -m64 -O2 -ipo -traceback -fpp -DGITHASH_PP=\"Git-571-g01e82ba-dirty\" -DGITDATE_PP=\""Mon Aug 31 17:13:38 2015 -0400\"" -DBUILDDATE_PP=\""Sep 01, 2015  09:25:26\""  ../../FDS_Source/irad.f90
mpifort -c -m64 -O2 -ipo -traceback -fpp -DGITHASH_PP=\"Git-571-g01e82ba-dirty\" -DGITDATE_PP=\""Mon Aug 31 17:13:38 2015 -0400\"" -DBUILDDATE_PP=\""Sep 01, 2015  09:25:26\""  -openmp -openmp-link static -liomp5 ../../FDS_Source/turb.f90
mpifort -c -m64 -O2 -ipo -traceback -fpp -DGITHASH_PP=\"Git-571-g01e82ba-dirty\" -DGITDATE_PP=\""Mon Aug 31 17:13:38 2015 -0400\"" -DBUILDDATE_PP=\""Sep 01, 2015  09:25:26\""  ../../FDS_Source/soot.f90
mpifort -c -m64 -O2 -ipo -traceback -fpp -DGITHASH_PP=\"Git-571-g01e82ba-dirty\" -DGITDATE_PP=\""Mon Aug 31 17:13:38 2015 -0400\"" -DBUILDDATE_PP=\""Sep 01, 2015  09:25:26\""  ../../FDS_Source/ieva.f90
mpifort -c -m64 -O2 -ipo -traceback -fpp -DGITHASH_PP=\"Git-571-g01e82ba-dirty\" -DGITDATE_PP=\""Mon Aug 31 17:13:38 2015 -0400\"" -DBUILDDATE_PP=\""Sep 01, 2015  09:25:26\""  -openmp -openmp-link static -liomp5 ../../FDS_Source/pois.f90
mpifort -c -m64 -O2 -ipo -traceback -fpp -DGITHASH_PP=\"Git-571-g01e82ba-dirty\" -DGITDATE_PP=\""Mon Aug 31 17:13:38 2015 -0400\"" -DBUILDDATE_PP=\""Sep 01, 2015  09:25:26\""  ../../FDS_Source/scrc.f90
../../FDS_Source/scrc.f90(1283): error #6285: There is no matching specific subroutine for this generic subroutine call.   [MPI_ALLREDUCE]
      CALL MPI_ALLREDUCE(S%XS,S%XS_MIN, 1,MPI_DOUBLE_PRECISION,MPI_MIN,MPI_COMM_WORLD,IERR)
-----------^
../../FDS_Source/scrc.f90(1285): error #6285: There is no matching specific subroutine for this generic subroutine call.   [MPI_ALLREDUCE]
      CALL MPI_ALLREDUCE(S%XF,S%XF_MAX, 1,MPI_DOUBLE_PRECISION,MPI_MAX,MPI_COMM_WORLD,IERR)
-----------^

...skip

../../FDS_Source/scrc.f90(11617): error #6285: There is no matching specific subroutine for this generic subroutine call.   [MPI_IRECV]
            CALL MPI_IRECV(OS%RECV_INTEGER(1),1,MPI_INTEGER,SNODE, &
-----------------^
/tmp/ifortas5xcK.i90(14164): catastrophic error: Too many errors, exiting
compilation aborted for ../../FDS_Source/scrc.f90 (code 1)
make: *** [scrc.o] Error 1

Is that because there is no MPI flag for scrc.f90?

mpifort -c -m64 -O2 -ipo -traceback -fpp -DGITHASH_PP=\"Git-571-g01e82ba-dirty\" -DGITDATE_PP=\""Mon Aug 31 17:13:38 2015 -0400\"" -DBUILDDATE_PP=\""Sep 01, 2015  09:25:26\""  -openmp -openmp-link static -liomp5 ../../FDS_Source/pois.f90
mpifort -c -m64 -O2 -ipo -traceback -fpp -DGITHASH_PP=\"Git-571-g01e82ba-dirty\" -DGITDATE_PP=\""Mon Aug 31 17:13:38 2015 -0400\"" -DBUILDDATE_PP=\""Sep 01, 2015  09:25:26\""  ../../FDS_Source/scrc.f90

Could someone give me a hand? Thank you!

gforney commented 9 years ago

FDS was built with OpenMPI 1.8.4, not 1.10.0. It looks like FDS is not compatible with 1.10.0. Try installing 1.8.4 and building with it.
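
For reference, a rough sketch of such a build, assuming the 1.8.4 tarball has already been downloaded and that the existing install prefix /shared/openmpi_64 is reused (exact flags may need adjusting for your site):

$ tar xzf openmpi-1.8.4.tar.gz
$ cd openmpi-1.8.4
$ ./configure --prefix=/shared/openmpi_64 CC=icc CXX=icpc FC=ifort   # build the MPI library itself with the Intel compilers
$ make -j 4 && make install
$ export PATH=/shared/openmpi_64/bin:$PATH                           # make sure this mpifort/mpirun come first in PATH
$ mpifort --version                                                  # should now report ifort, not gfortran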

mcgratta commented 9 years ago

We have not yet compiled with OpenMPI 1.10.0 or Intel Fortran 16 under Linux. This might be a case of the compiler now being more strict about the MPI calls. I do not know whether the problem is with OpenMPI or Intel Fortran.

Glenn, can we install Fortran 16 on the cluster and see if we can compile?

mcgratta commented 9 years ago

Glenn just installed Intel Fortran 16 on our linux cluster:

mpifort -V
Intel(R) Fortran Intel(R) 64 Compiler for applications running on Intel(R) 64, Version 16.0.0.109 Build 20150815
Copyright (C) 1985-2015 Intel Corporation. All rights reserved.

What do you get when you type:

mpifort --showme

alexgby commented 9 years ago

On my cluster:

$ mpifort -V
Intel(R) Fortran Intel(R) 64 Compiler for applications running on Intel(R) 64, Version 16.0.0.109 Build 20150815
Copyright (C) 1985-2015 Intel Corporation.  All rights reserved.

After I recompiled OpenMPI 1.8.4:

$ mpirun -version
mpirun (Open MPI) 1.8.4

Report bugs to http://www.open-mpi.org/community/help/

I still cannot compile the latest FDS-SMV from a git clone. The error is the same:

Building FDS with the MPI distribution: /shared/openmpi_64
Building mpi_intel_linux_64
mpifort -c -m64 -O2 -ipo -traceback -fpp -DGITHASH_PP=\"Git-571-g01e82ba-dirty\" -DGITDATE_PP=\""Mon Aug 31 17:13:38 2015 -0400\"" -DBUILDDATE_PP=\""Sep 02, 2015  04:10:39\""  ../../FDS_Source/scrc.f90
../../FDS_Source/scrc.f90(1283): error #6285: There is no matching specific subroutine for this generic subroutine call.   [MPI_ALLREDUCE]
      CALL MPI_ALLREDUCE(S%XS,S%XS_MIN, 1,MPI_DOUBLE_PRECISION,MPI_MIN,MPI_COMM_WORLD,IERR)
-----------^
../../FDS_Source/scrc.f90(1285): error #6285: There is no matching specific subroutine for this generic subroutine call.   [MPI_ALLREDUCE]
      CALL MPI_ALLREDUCE(S%XF,S%XF_MAX, 1,MPI_DOUBLE_PRECISION,MPI_MAX,MPI_COMM_WORLD,IERR)
-----------^
../../FDS_Source/scrc.f90(1288): error #6285: There is no matching specific subroutine for this generic subroutine call.   [MPI_ALLREDUCE]
      CALL MPI_ALLREDUCE(S%YS,S%YS_MIN, 1,MPI_DOUBLE_PRECISION,MPI_MIN,MPI_COMM_WORLD,IERR)
-----------^
../../FDS_Source/scrc.f90(1290): error #6285: There is no matching specific subroutine for this generic subroutine call.   [MPI_ALLREDUCE]
      CALL MPI_ALLREDUCE(S%YF,S%YF_MAX, 1,MPI_DOUBLE_PRECISION,MPI_MAX,MPI_COMM_WORLD,IERR)
-----------^
../../FDS_Source/scrc.f90(1293): error #6285: There is no matching specific subroutine for this generic subroutine call.   [MPI_ALLREDUCE]
      CALL MPI_ALLREDUCE(S%ZS,S%ZS_MIN, 1,MPI_DOUBLE_PRECISION,MPI_MIN,MPI_COMM_WORLD,IERR)
-----------^
../../FDS_Source/scrc.f90(1295): error #6285: There is no matching specific subroutine for this generic subroutine call.   [MPI_ALLREDUCE]
      CALL MPI_ALLREDUCE(S%ZF,S%ZF_MAX, 1,MPI_DOUBLE_PRECISION,MPI_MAX,MPI_COMM_WORLD,IERR)
-----------^
../../FDS_Source/scrc.f90(3199): error #6285: There is no matching specific subroutine for this generic subroutine call.   [MPI_ALLREDUCE]
      CALL MPI_ALLREDUCE(NC_GROUP(MYID+1,NL),NC_GLOBAL(NL),1,MPI_INTEGER,MPI_SUM,MPI_COMM_WORLD,IERR)
-----------^
../../FDS_Source/scrc.f90(3231): error #6285: There is no matching specific subroutine for this generic subroutine call.   [MPI_SEND]
               CALL MPI_SEND(OS%I_MIN_R,1,MPI_INTEGER,PROCESS(NOM),1,MPI_COMM_WORLD,IERR)
--------------------^
../../FDS_Source/scrc.f90(3232): error #6285: There is no matching specific subroutine for this generic subroutine call.   [MPI_SEND]
               CALL MPI_SEND(OS%I_MAX_R,1,MPI_INTEGER,PROCESS(NOM),1,MPI_COMM_WORLD,IERR)
--------------------^
../../FDS_Source/scrc.f90(3233): error #6285: There is no matching specific subroutine for this generic subroutine call.   [MPI_SEND]
               CALL MPI_SEND(OS%J_MIN_R,1,MPI_INTEGER,PROCESS(NOM),1,MPI_COMM_WORLD,IERR)
--------------------^
../../FDS_Source/scrc.f90(3234): error #6285: There is no matching specific subroutine for this generic subroutine call.   [MPI_SEND]
               CALL MPI_SEND(OS%J_MAX_R,1,MPI_INTEGER,PROCESS(NOM),1,MPI_COMM_WORLD,IERR)
--------------------^
../../FDS_Source/scrc.f90(3235): error #6285: There is no matching specific subroutine for this generic subroutine call.   [MPI_SEND]
               CALL MPI_SEND(OS%K_MIN_R,1,MPI_INTEGER,PROCESS(NOM),1,MPI_COMM_WORLD,IERR)
--------------------^
../../FDS_Source/scrc.f90(3236): error #6285: There is no matching specific subroutine for this generic subroutine call.   [MPI_SEND]
               CALL MPI_SEND(OS%K_MAX_R,1,MPI_INTEGER,PROCESS(NOM),1,MPI_COMM_WORLD,IERR)
--------------------^
../../FDS_Source/scrc.f90(3237): error #6285: There is no matching specific subroutine for this generic subroutine call.   [MPI_SEND]
               CALL MPI_SEND(OS%NIC_R,  1,MPI_INTEGER,PROCESS(NOM),1,MPI_COMM_WORLD,IERR)
--------------------^
../../FDS_Source/scrc.f90(3259): error #6285: There is no matching specific subroutine for this generic subroutine call.   [MPI_RECV]
               CALL MPI_RECV(OSO%I_MIN_S,1,MPI_INTEGER,PROCESS(NM),1,MPI_COMM_WORLD,STATUS2_SCARC,IERR)
--------------------^
../../FDS_Source/scrc.f90(3260): error #6285: There is no matching specific subroutine for this generic subroutine call.   [MPI_RECV]
               CALL MPI_RECV(OSO%I_MAX_S,1,MPI_INTEGER,PROCESS(NM),1,MPI_COMM_WORLD,STATUS2_SCARC,IERR)
--------------------^
../../FDS_Source/scrc.f90(3261): error #6285: There is no matching specific subroutine for this generic subroutine call.   [MPI_RECV]
               CALL MPI_RECV(OSO%J_MIN_S,1,MPI_INTEGER,PROCESS(NM),1,MPI_COMM_WORLD,STATUS2_SCARC,IERR)
--------------------^
../../FDS_Source/scrc.f90(3262): error #6285: There is no matching specific subroutine for this generic subroutine call.   [MPI_RECV]
               CALL MPI_RECV(OSO%J_MAX_S,1,MPI_INTEGER,PROCESS(NM),1,MPI_COMM_WORLD,STATUS2_SCARC,IERR)
--------------------^
../../FDS_Source/scrc.f90(3263): error #6285: There is no matching specific subroutine for this generic subroutine call.   [MPI_RECV]
               CALL MPI_RECV(OSO%K_MIN_S,1,MPI_INTEGER,PROCESS(NM),1,MPI_COMM_WORLD,STATUS2_SCARC,IERR)
--------------------^
../../FDS_Source/scrc.f90(3264): error #6285: There is no matching specific subroutine for this generic subroutine call.   [MPI_RECV]
               CALL MPI_RECV(OSO%K_MAX_S,1,MPI_INTEGER,PROCESS(NM),1,MPI_COMM_WORLD,STATUS2_SCARC,IERR)
--------------------^
../../FDS_Source/scrc.f90(3265): error #6285: There is no matching specific subroutine for this generic subroutine call.   [MPI_RECV]
               CALL MPI_RECV(OSO%NIC_S,  1,MPI_INTEGER,PROCESS(NM),1,MPI_COMM_WORLD,STATUS2_SCARC,IERR)
--------------------^
../../FDS_Source/scrc.f90(6421): error #6285: There is no matching specific subroutine for this generic subroutine call.   [MPI_ALLREDUCE]
               CALL MPI_ALLREDUCE (IGRAPH, IGRAPH_GLOBAL, 1, MPI_INTEGER, MPI_SUM, MPI_COMM_WORLD, IERR)
--------------------^
../../FDS_Source/scrc.f90(9538): error #6285: There is no matching specific subroutine for this generic subroutine call.   [MPI_ALLREDUCE]
   CALL MPI_ALLREDUCE (SP_GROUP(MYID+1), SP_GLOBAL, 1, MPI_DOUBLE_PRECISION, MPI_SUM, MPI_COMM_WORLD, IERR)
--------^
../../FDS_Source/scrc.f90(9590): error #6285: There is no matching specific subroutine for this generic subroutine call.   [MPI_ALLREDUCE]
   CALL MPI_ALLREDUCE (SP_GROUP(MYID+1), SP_GLOBAL, 1, MPI_DOUBLE_PRECISION, MPI_SUM, MPI_COMM_WORLD, IERR)
--------^
../../FDS_Source/scrc.f90(11570): error #6285: There is no matching specific subroutine for this generic subroutine call.   [MPI_IRECV]
            CALL MPI_IRECV(OS%RECV_INTEGER0(1),1,MPI_INTEGER,SNODE, &
-----------------^
../../FDS_Source/scrc.f90(11581): error #6285: There is no matching specific subroutine for this generic subroutine call.   [MPI_IRECV]
            CALL MPI_IRECV(OS%RECV_INTEGER(1),SIZE(OS%RECV_INTEGER),MPI_INTEGER,SNODE, &
-----------------^
../../FDS_Source/scrc.f90(11590): error #6285: There is no matching specific subroutine for this generic subroutine call.   [MPI_IRECV]
            CALL MPI_IRECV(OS%RECV_INTEGER(1),5,MPI_INTEGER,SNODE, &
-----------------^
../../FDS_Source/scrc.f90(11599): error #6285: There is no matching specific subroutine for this generic subroutine call.   [MPI_IRECV]
            CALL MPI_IRECV(OS%RECV_REAL0(1),1,MPI_DOUBLE_PRECISION,SNODE, &
-----------------^
../../FDS_Source/scrc.f90(11608): error #6285: There is no matching specific subroutine for this generic subroutine call.   [MPI_IRECV]
            CALL MPI_IRECV(OS%RECV_INTEGER(1),9,MPI_INTEGER,SNODE, &
-----------------^
../../FDS_Source/scrc.f90(11617): error #6285: There is no matching specific subroutine for this generic subroutine call.   [MPI_IRECV]
            CALL MPI_IRECV(OS%RECV_INTEGER(1),1,MPI_INTEGER,SNODE, &
-----------------^
/tmp/ifortYavB79.i90(14164): catastrophic error: Too many errors, exiting
compilation aborted for ../../FDS_Source/scrc.f90 (code 1)
make: *** [scrc.o] Error 1

gforney commented 9 years ago

Looks like you are still pointing to the 1.10.0 distribution.
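
A quick way to confirm which MPI installation the wrappers resolve to (standard Open MPI wrapper options, shown here only as a sketch):

$ which mpifort mpirun        # both should live under /shared/openmpi_64/bin
$ mpifort --showme            # prints the underlying compiler plus the include/library paths that will be used
$ mpirun --version            # should report Open MPI 1.8.4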

alexgby commented 9 years ago

Dear @gforney and @mcgratta,

After I built OpenMPI 1.8.4 with Intel Parallel Studio XE 2015 Update 3, I can now compile the mpi_intel_linux_64 target on my clusters!

Really appreciate your help!!

lu-kas commented 9 years ago

What is the output of

$ mpiifort -show

?

alexgby commented 9 years ago

$ mpifort --version
ifort (IFORT) 15.0.3 20150407
Copyright (C) 1985-2015 Intel Corporation.  All rights reserved.
$ mpifort -show
ifort -I/shared/openmpi_64/include -I/shared/openmpi_64/lib -Wl,-rpath -Wl,/shared/openmpi_64/lib -Wl,--enable-new-dtags -L/shared/openmpi_64/lib -lmpi_usempif08 -lmpi_usempi_ignore_tkr -lmpi_mpifh -lmpi

lu-kas commented 9 years ago

But it works now, right?

alexgby commented 9 years ago

Yes!

alexgby commented 9 years ago

But I still can't find the right options for mpirun.

My cluster has two compute nodes, master and node001. Each has four CPUs:

$ cat /etc/hosts
127.0.0.1 localhost

# The following lines are desirable for IPv6 capable hosts
::1 ip6-localhost ip6-loopback
fe00::0 ip6-localnet
ff00::0 ip6-mcastprefix
ff02::1 ip6-allnodes
ff02::2 ip6-allrouters
ff02::3 ip6-allhosts
172.31.20.104 master
172.31.20.105 node001

$ cat /proc/cpuinfo 
processor   : 0
vendor_id   : GenuineIntel
cpu family  : 6
model       : 62
model name  : Intel(R) Xeon(R) CPU E5-2680 v2 @ 2.80GHz
stepping    : 4
microcode   : 0x415
cpu MHz     : 2793.340
cache size  : 25600 KB
physical id : 0
siblings    : 1
core id     : 0
cpu cores   : 1
apicid      : 0
initial apicid  : 0
fpu     : yes
fpu_exception   : yes
cpuid level : 13
wp      : yes
flags       : fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush mmx fxsr sse sse2 ht syscall nx rdtscp lm constant_tsc rep_good nopl xtopology eagerfpu pni pclmulqdq ssse3 cx16 pcid sse4_1 sse4_2 x2apic popcnt tsc_deadline_timer aes xsave avx f16c rdrand hypervisor lahf_lm xsaveopt fsgsbase smep erms
bogomips    : 5586.68
clflush size    : 64
cache_alignment : 64
address sizes   : 46 bits physical, 48 bits virtual
power management:

processor   : 1
vendor_id   : GenuineIntel
cpu family  : 6
model       : 62
model name  : Intel(R) Xeon(R) CPU E5-2680 v2 @ 2.80GHz
stepping    : 4
microcode   : 0x415
cpu MHz     : 2793.340
cache size  : 25600 KB
physical id : 0
siblings    : 1
core id     : 1
cpu cores   : 1
apicid      : 2
initial apicid  : 2
fpu     : yes
fpu_exception   : yes
cpuid level : 13
wp      : yes
flags       : fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush mmx fxsr sse sse2 ht syscall nx rdtscp lm constant_tsc rep_good nopl xtopology eagerfpu pni pclmulqdq ssse3 cx16 pcid sse4_1 sse4_2 x2apic popcnt tsc_deadline_timer aes xsave avx f16c rdrand hypervisor lahf_lm xsaveopt fsgsbase smep erms
bogomips    : 5586.68
clflush size    : 64
cache_alignment : 64
address sizes   : 46 bits physical, 48 bits virtual
power management:

processor   : 2
vendor_id   : GenuineIntel
cpu family  : 6
model       : 62
model name  : Intel(R) Xeon(R) CPU E5-2680 v2 @ 2.80GHz
stepping    : 4
microcode   : 0x415
cpu MHz     : 2793.340
cache size  : 25600 KB
physical id : 0
siblings    : 1
core id     : 0
cpu cores   : 0
apicid      : 1
initial apicid  : 1
fpu     : yes
fpu_exception   : yes
cpuid level : 13
wp      : yes
flags       : fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush mmx fxsr sse sse2 ht syscall nx rdtscp lm constant_tsc rep_good nopl xtopology eagerfpu pni pclmulqdq ssse3 cx16 pcid sse4_1 sse4_2 x2apic popcnt tsc_deadline_timer aes xsave avx f16c rdrand hypervisor lahf_lm xsaveopt fsgsbase smep erms
bogomips    : 5586.68
clflush size    : 64
cache_alignment : 64
address sizes   : 46 bits physical, 48 bits virtual
power management:

processor   : 3
vendor_id   : GenuineIntel
cpu family  : 6
model       : 62
model name  : Intel(R) Xeon(R) CPU E5-2680 v2 @ 2.80GHz
stepping    : 4
microcode   : 0x415
cpu MHz     : 2793.340
cache size  : 25600 KB
physical id : 0
siblings    : 1
core id     : 1
cpu cores   : 0
apicid      : 3
initial apicid  : 3
fpu     : yes
fpu_exception   : yes
cpuid level : 13
wp      : yes
flags       : fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush mmx fxsr sse sse2 ht syscall nx rdtscp lm constant_tsc rep_good nopl xtopology eagerfpu pni pclmulqdq ssse3 cx16 pcid sse4_1 sse4_2 x2apic popcnt tsc_deadline_timer aes xsave avx f16c rdrand hypervisor lahf_lm xsaveopt fsgsbase smep erms
bogomips    : 5586.68
clflush size    : 64
cache_alignment : 64
address sizes   : 46 bits physical, 48 bits virtual
power management:
$ /shared/openmpi_64/bin/mpirun -np 4 /home/sgeadmin/FDS-SMV/FDS_Compilation/mpi_intel_linux_64/fds_mpi_intel_linux_64  VTT_01.fds 
--------------------------------------------------------------------------
A request for multiple cpus-per-proc was given, but a directive
was also give to map to an object level that has less cpus than
requested ones:

  #cpus-per-proc:  1
  number of cpus:  0
  map-by:          BYSOCKET

Please specify a mapping level that has more cpus, or else let us
define a default mapping that will allow multiple cpus-per-proc.
--------------------------------------------------------------------------

Currently I don't have a queuing system on my cluster. What mpirun options should I use for this case?

Thanks!

mcgratta commented 9 years ago

Try creating a hostfile. Call it hostfile.txt and add to it

node001 slots=...
node002 slots=...

Then

mpirun -np 4 --hostfile hostfile.txt ... (the rest of your command)
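
For the two nodes described above (master and node001, four cores each according to the /proc/cpuinfo output), that hostfile would presumably read:

master slots=4
node001 slots=4

followed by something like mpirun -np 8 --hostfile hostfile.txt fds_mpi_intel_linux_64 your_case.fds, where your_case.fds stands in for the actual input file.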

alexgby commented 9 years ago

I've created the hostfile and set slots=4 for both nodes, but I still can't run it...

sgeadmin@master:~/parking/1$ cat hostfile.txt 
master slots=4
node001 slots=4
sgeadmin@master:~/parking/1$ mpirun --hostfile hostfile.txt fds_mpi_intel_linux_64 parking_2_meshx8.fds
--------------------------------------------------------------------------
mpirun noticed that process rank 2 with PID 28105 on node master exited on signal 11 (Segmentation fault).
--------------------------------------------------------------------------
sgeadmin@master:~/parking/1$ mpirun --hostfile hostfile.txt -np 8 fds_mpi_intel_linux_64 parking_2_meshx8.fds
--------------------------------------------------------------------------
A request for multiple cpus-per-proc was given, but a directive
was also give to map to an object level that has less cpus than
requested ones:

  #cpus-per-proc:  1
  number of cpus:  0
  map-by:          BYSOCKET

Please specify a mapping level that has more cpus, or else let us
define a default mapping that will allow multiple cpus-per-proc.
--------------------------------------------------------------------------
sgeadmin@master:~/parking/1$ mpirun --hostfile hostfile.txt -np 6 fds_mpi_intel_linux_64 parking_2_meshx8.fds
--------------------------------------------------------------------------
A request for multiple cpus-per-proc was given, but a directive
was also give to map to an object level that has less cpus than
requested ones:

  #cpus-per-proc:  1
  number of cpus:  0
  map-by:          BYSOCKET

Please specify a mapping level that has more cpus, or else let us
define a default mapping that will allow multiple cpus-per-proc.
--------------------------------------------------------------------------
sgeadmin@master:~/parking/1$ mpirun --hostfile hostfile.txt -np 4 fds_mpi_intel_linux_64 parking_2_meshx8.fds
--------------------------------------------------------------------------
A request for multiple cpus-per-proc was given, but a directive
was also give to map to an object level that has less cpus than
requested ones:

  #cpus-per-proc:  1
  number of cpus:  0
  map-by:          BYSOCKET

Please specify a mapping level that has more cpus, or else let us
define a default mapping that will allow multiple cpus-per-proc.
--------------------------------------------------------------------------
sgeadmin@master:~/parking/1$ mpirun --hostfile hostfile.txt -np 2 fds_mpi_intel_linux_64 parking_2_meshx8.fds
--------------------------------------------------------------------------
mpirun noticed that process rank 0 with PID 28123 on node master exited on signal 11 (Segmentation fault).
--------------------------------------------------------------------------
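
One workaround worth trying for the mapping error above (an editorial suggestion, not advice from the thread): the message shows Open MPI mapping by socket (map-by: BYSOCKET), and on virtualized nodes whose reported topology exposes no usable socket/core counts that mapping can fail, so explicitly mapping by slot, or disabling binding, may get past it:

$ mpirun --hostfile hostfile.txt -np 8 --map-by slot fds_mpi_intel_linux_64 parking_2_meshx8.fds
$ mpirun --hostfile hostfile.txt -np 8 --bind-to none fds_mpi_intel_linux_64 parking_2_meshx8.fds   # alternative if mapping by slot is not enough

Note that this addresses only the mapping message; the segmentation faults seen with -np 2 are a separate problem.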

mcgratta commented 9 years ago

What happens if you just type

fds_mpi_intel_linux_64 parking_2_meshx8.fds

alexgby commented 9 years ago

I got a segmentation fault:

$ fds_mpi_intel_linux_64 scale1.fds 
Segmentation fault (core dumped)

When I followed the wiki steps, I got these warnings:

04, 2015  03:53:57\""  -static-intel -openmp -openmp-link static -liomp5 -o fds_mpi_intel_linux_64 prec.o cons.o devc.o data.o type.o mesh.o func.o smvv.o irad.o turb.o soot.o ieva.o pois.o scrc.o radi.o evac.o gsmv.o geom.o part.o vege.o ctrl.o samr.o dump.o hvac.o mass.o read.o wall.o fire.o divg.o velo.o pres.o init.o main.o
ipo: warning #11021: unresolved opal_argv_free
        Referenced in /shared/openmpi_64/lib/libmpi_mpifh.so
        Referenced in /shared/openmpi_64/lib/libmpi.so
ipo: warning #11021: unresolved opal_list_item_t_class
        Referenced in /shared/openmpi_64/lib/libmpi_mpifh.so
        Referenced in /shared/openmpi_64/lib/libmpi.so
ipo: warning #11021: unresolved opal_class_initialize
        Referenced in /shared/openmpi_64/lib/libmpi_mpifh.so
        Referenced in /shared/openmpi_64/lib/libmpi.so
ipo: warning #11021: unresolved orte_standalone_operation
        Referenced in /shared/openmpi_64/lib/libmpi.so
ipo: warning #11021: unresolved mca_base_var_register_synonym
        Referenced in /shared/openmpi_64/lib/libmpi.so
ipo: warning #11021: unresolved opal_install_dirs
        Referenced in /shared/openmpi_64/lib/libmpi.so
ipo: warning #11021: unresolved orte_finalize
        Referenced in /shared/openmpi_64/lib/libmpi.so
ipo: warning #11021: unresolved mca_base_pvar_handle_reset
        Referenced in /shared/openmpi_64/lib/libmpi.so
ipo: warning #11021: unresolved mca_base_var_register
        Referenced in /shared/openmpi_64/lib/libmpi.so
ipo: warning #11021: unresolved opal_output_stream_t_class
        Referenced in /shared/openmpi_64/lib/libmpi.so
ipo: warning #11021: unresolved orte_rml
        Referenced in /shared/openmpi_64/lib/libmpi.so
ipo: warning #11021: unresolved opal_hwloc_topology
        Referenced in /shared/openmpi_64/lib/libmpi.so
ipo: warning #11021: unresolved opal_list_t_class
        Referenced in /shared/openmpi_64/lib/libmpi.so
ipo: warning #11021: unresolved opal_local_arch
        Referenced in /shared/openmpi_64/lib/libmpi.so
ipo: warning #11021: unresolved opal_info_register_framework_params
        Referenced in /shared/openmpi_64/lib/libmpi.so
ipo: warning #11021: unresolved orte_grpcomm
        Referenced in /shared/openmpi_64/lib/libmpi.so
ipo: warning #11021: unresolved opal_getpagesize
        Referenced in /shared/openmpi_64/lib/libmpi.so
ipo: warning #11021: unresolved mca_base_var_group_get
        Referenced in /shared/openmpi_64/lib/libmpi.so
ipo: warning #11021: unresolved opal_convertor_prepare_for_send
        Referenced in /shared/openmpi_64/lib/libmpi.so
ipo: warning #11021: unresolved mca_base_pvar_handle_stop
        Referenced in /shared/openmpi_64/lib/libmpi.so
ipo: warning #11021: unresolved mca_base_var_get_count
        Referenced in /shared/openmpi_64/lib/libmpi.so
ipo: warning #11021: unresolved mca_base_component_close
        Referenced in /shared/openmpi_64/lib/libmpi.so
ipo: warning #11021: unresolved opal_object_t_class
        Referenced in /shared/openmpi_64/lib/libmpi.so
ipo: warning #11021: unresolved mca_base_framework_components_close
        Referenced in /shared/openmpi_64/lib/libmpi.so
ipo: warning #11021: unresolved orte_name_wildcard
        Referenced in /shared/openmpi_64/lib/libmpi.so
ipo: warning #11021: unresolved opal_list_sort
        Referenced in /shared/openmpi_64/lib/libmpi.so
ipo: warning #11021: unresolved mca_base_pvar_get
        Referenced in /shared/openmpi_64/lib/libmpi.so
ipo: warning #11021: unresolved mca_base_pvar_handle_write_value
        Referenced in /shared/openmpi_64/lib/libmpi.so
ipo: warning #11021: unresolved mca_base_select
        Referenced in /shared/openmpi_64/lib/libmpi.so
ipo: warning #11021: unresolved opal_condition_t_class
        Referenced in /shared/openmpi_64/lib/libmpi.so
ipo: warning #11021: unresolved opal_mem_hooks_support_level
        Referenced in /shared/openmpi_64/lib/libmpi.so
ipo: warning #11021: unresolved opal_datatype_create_desc
        Referenced in /shared/openmpi_64/lib/libmpi.so
ipo: warning #11021: unresolved orte_process_info
        Referenced in /shared/openmpi_64/lib/libmpi.so
ipo: warning #11021: unresolved opal_atomic_lifo_t_class
        Referenced in /shared/openmpi_64/lib/libmpi.so
ipo: warning #11021: unresolved opal_pointer_array_test_and_set_item
        Referenced in /shared/openmpi_64/lib/libmpi.so
ipo: warning #11021: unresolved mca_base_var_group_get_count
        Referenced in /shared/openmpi_64/lib/libmpi.so
ipo: warning #11021: unresolved orte_rml_send_callback
        Referenced in /shared/openmpi_64/lib/libmpi.so
ipo: warning #11021: unresolved opal_datatype_contain_basic_datatypes
        Referenced in /shared/openmp
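
A sanity check worth doing at this point (an editorial suggestion, not something asked for in the thread) is to confirm that the executable resolves the intended Open MPI shared libraries at run time, since linking against one MPI build and loading another at run time is a classic cause of immediate segmentation faults:

$ ldd fds_mpi_intel_linux_64 | grep -i mpi                           # libmpi, libmpi_mpifh, etc. should point into /shared/openmpi_64/lib
$ export LD_LIBRARY_PATH=/shared/openmpi_64/lib:$LD_LIBRARY_PATH     # if they do not, put the intended lib dir first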

mcgratta commented 9 years ago

What do you mean by "wiki steps"?

mcgratta commented 9 years ago

Have you made any progress on this issue?

alexgby commented 9 years ago

The wiki steps are here: https://github.com/firemodels/fds-smv/wiki/FDS-Compilation

I can build OpenMPI with Intel Compiler 2015 Update 3, and FDS with that OpenMPI, but the resulting FDS executable gets a segmentation fault every time, even for the validation cases.

Currently, I can compile FDS with Intel MPI and successfully run it under SGE (Sun Grid Engine) in an MIT StarCluster environment.
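
As a point of comparison, a submission script for such an SGE/StarCluster setup typically looks roughly like the sketch below; the parallel environment name (orte), the slot count, and the script name run_fds.sh are assumptions and will differ between sites:

#!/bin/bash
#$ -cwd                    # run from the submission directory
#$ -N fds_run              # job name (arbitrary)
#$ -pe orte 8              # parallel environment and slot count; the PE name is site-specific
mpirun -np $NSLOTS fds_mpi_intel_linux_64 parking_2_meshx8.fds

This would be submitted with qsub run_fds.sh.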

I will try to find the reason for the segmentation fault in the OpenMPI build of the FDS executable.

Many thanks for your help!