TangLaoya / parafem

Automatically exported from code.google.com/p/parafem

Fatal error when running the example problem on windows 7 #7

Open GoogleCodeExporter opened 9 years ago

GoogleCodeExporter commented 9 years ago
What steps will reproduce the problem?
1. Run the example problem.

What is the expected output? What do you see instead?

A fatal error occurs when running the example problem, saying:
PE no:        1 nels_pp:     2560
  4. Allocate dynamic arrays used in main program
PE no:        1 neq_pp:    36720
No. of elements of PE    1 required by PE    1:    36720
 Total number of unique elements required
 i.e. length of pl_pp required:        36720
PE:    1 Number of remote PEs required:        0
PE:    1 Accesses - remote, local, remote/local:      0 36720    0.00 From      0 PEs
job aborted:
rank: node: exit code[: error message]
0: Shaiklp: 1: Fatal error in PMPI_Reduce: Invalid MPI_Op, error stack:
PMPI_Reduce(1198).......: MPI_Reduce(sbuf=000000000012EA90, 
rbuf=000000000012EAD0, count=1, dtype=0x4c000829, MPI_SUM, root=0, 
MPI_COMM_WORLD) failed
MPIR_SUM_check_dtype(97): MPI_Op MPI_SUM operation not defined for this 
datatype

What version of the product are you using? On what operating system?
ParaFEM_r40 on Windows 7, with Intel Fortran 11 integrated into Microsoft 
Visual Studio 2008.

Please provide any additional information below.
I noticed that this error points to the MPI_Reduce call in 
gather_scatter.f90; rem_loc and sum_rem_loc are not properly assigned.
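
For reference, a minimal self-contained sketch of the kind of call the error 
stack points at (this is not the ParaFEM source; only the names rem_loc and 
sum_rem_loc are taken from the report, everything else is illustrative):

    ! Minimal sketch of an integer MPI_SUM reduction. If the datatype
    ! handle passed in the fourth argument does not match the declared
    ! type of rem_loc, MPICH2 reports "MPI_SUM operation not defined
    ! for this datatype", as in the log above.
    PROGRAM reduce_sketch
      USE mpi
      IMPLICIT NONE
      INTEGER :: ier, npes, numpe
      INTEGER :: rem_loc, sum_rem_loc

      CALL MPI_INIT(ier)
      CALL MPI_COMM_SIZE(MPI_COMM_WORLD, npes, ier)
      CALL MPI_COMM_RANK(MPI_COMM_WORLD, numpe, ier)

      rem_loc     = numpe + 1      ! stand-in for the per-PE count
      sum_rem_loc = 0

      CALL MPI_REDUCE(rem_loc, sum_rem_loc, 1, MPI_INTEGER, MPI_SUM, &
                      0, MPI_COMM_WORLD, ier)

      IF (numpe == 0) PRINT *, 'sum_rem_loc =', sum_rem_loc
      CALL MPI_FINALIZE(ier)
    END PROGRAM reduce_sketch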

Original issue reported on code.google.com by skrav...@gmail.com on 14 Mar 2011 at 5:41

GoogleCodeExporter commented 9 years ago
The Linux version had no problem....

Original comment by skrav...@gmail.com on 14 Mar 2011 at 9:44

GoogleCodeExporter commented 9 years ago
Thank you for posting the problem. What version of MPI are you running under 
Windows 7?

Original comment by drmarge...@gmail.com on 18 Mar 2011 at 1:10

GoogleCodeExporter commented 9 years ago
MPICH2

Original comment by skrav...@gmail.com on 19 Mar 2011 at 5:12

GoogleCodeExporter commented 9 years ago
I'd like to move this discussion to the ParaFEM Forum. If you have not 
registered already, would you mind doing so? Many thanks, Lee.

http://www.parafem.org.uk/

Original comment by drmarge...@gmail.com on 21 Mar 2011 at 2:09

GoogleCodeExporter commented 9 years ago
It correctly sees the operation MPI_SUM, but it looks like the MPI datatype 
is wrong:

"dtype=0x4c000829"

That would seem to indicate that the datatype argument to MPI_REDUCE() is 
corrupt: 0x4c000829 = 1275070505 decimal.
In Intel MPI this is MPI_REAL8:
    PARAMETER (MPI_REAL8=1275070505)

This is a valid handle, so I would guess that the internals of the MPI 
library are getting mixed up between the Fortran and C bindings.
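
One hedged way to check is to print the datatype handles the executable was 
actually compiled against (a sketch only; it assumes the test program is 
built with the same compiler, mpi module and link line as the failing 
example, and that MPI_REAL8 is provided by that mpi module):

    ! Print the datatype handles this build was compiled against.
    ! If MPI_REAL8 here is not 1275070505 (0x4c000829) but the failing
    ! run shows that value, the build is likely mixing headers from one
    ! MPI implementation with the runtime library of another.
    PROGRAM dtype_check
      USE mpi
      IMPLICIT NONE
      INTEGER :: ier
      CALL MPI_INIT(ier)
      PRINT '(A,I12,A,Z9)', ' MPI_INTEGER =', MPI_INTEGER, '  hex ', MPI_INTEGER
      PRINT '(A,I12,A,Z9)', ' MPI_REAL8   =', MPI_REAL8,   '  hex ', MPI_REAL8
      CALL MPI_FINALIZE(ier)
    END PROGRAM dtype_check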

Original comment by daniel.k...@gmail.com on 15 Jun 2011 at 10:19

GoogleCodeExporter commented 9 years ago
Also note that the target PE of the reduction is the last PE. 99% of codes 
use MPI_Reduce to give PE 0 the answer, so this case may well not have been 
properly tested by the developers.
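
A minimal sketch of that pattern, purely illustrative and not the ParaFEM 
code: the root of the reduction is simply npes-1, so only the last PE 
receives the sum:

    ! Integer MPI_SUM reduction with the *last* PE as root.
    PROGRAM reduce_last_pe
      USE mpi
      IMPLICIT NONE
      INTEGER :: ier, npes, numpe, rem_loc, sum_rem_loc
      CALL MPI_INIT(ier)
      CALL MPI_COMM_SIZE(MPI_COMM_WORLD, npes, ier)
      CALL MPI_COMM_RANK(MPI_COMM_WORLD, numpe, ier)
      rem_loc = 1
      CALL MPI_REDUCE(rem_loc, sum_rem_loc, 1, MPI_INTEGER, MPI_SUM, &
                      npes-1, MPI_COMM_WORLD, ier)     ! root is the last PE
      IF (numpe == npes-1) PRINT *, 'total =', sum_rem_loc  ! should equal npes
      CALL MPI_FINALIZE(ier)
    END PROGRAM reduce_last_pe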

Original comment by daniel.k...@gmail.com on 15 Jun 2011 at 10:24