GridOPTICS / GridPACK

https://www.gridpack.org/

test failed #8

Closed: yhwu closed this issue 7 years ago

yhwu commented 7 years ago

Hi, I managed to build GridPACK on Ubuntu following your instructions and making some tweaks. However, some tests failed. I wonder if you could venture a guess and point me in a direction to fix them. The failed tests were:

80% tests passed, 17 tests failed out of 86

Total Test time (real) =  97.74 sec

The following tests FAILED:
         30 - real_linear_solver_serial (Failed)
         31 - real_linear_solver_parallel (Failed)
         32 - complex_linear_solver_serial (Failed)
         33 - complex_linear_solver_parallel (Failed)
         58 - powerflow_serial (OTHER_FAULT)
         59 - powerflow_parallel (Failed)
         60 - dynamic_simulation_serial (OTHER_FAULT)
         61 - dynamic_simulation_parallel (Failed)
         62 - dynamic_simulation_full_y_serial (OTHER_FAULT)
         63 - dynamic_simulation_full_y_parallel (Failed)
         68 - resistor_grid_serial (OTHER_FAULT)
         69 - resistor_grid_parallel (Failed)
         71 - pf_test_parallel (Failed)
         72 - ds_test_serial (OTHER_FAULT)
         73 - ds_test_parallel (Failed)
         74 - dsf_test_serial (OTHER_FAULT)
         75 - dsf_test_parallel (Failed)
Errors while running CTest
Makefile:105: recipe for target 'test' failed
make: *** [test] Error 8

In order to build GridPACK, I also had to use --enable-shared=yes when compiling GA. Other than that, I followed the rest of the Ubuntu instructions.

Thank you very much! Yinghua

wperkins commented 7 years ago

Yinghua,

Thanks for the feedback. I'd be interested in the "tweaks" you needed. The Ubuntu build instructions definitely need updating.

The linear solver and resistor grid test failures are disconcerting. The others are likely due to the PETSc build you used. The power flow and dynamic simulation tests fail because SuperLU_DIST is not included in the stock Ubuntu PETSc package. The dynamic simulation tests, in particular, need parallel LU decomposition and just won't solve with iterative methods.
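For reference, with a 3.7-era PETSc the options that request a parallel direct solve through SuperLU_DIST look roughly like the lines below; where they get set in a GridPACK input deck varies by application, so treat this as a sketch rather than a recipe:

# force a direct solve, with the factorization done by SuperLU_DIST
-ksp_type preonly
-pc_type lu
-pc_factor_mat_solver_package superlu_dist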

I built GridPACK out of the box (GitHub master) on Ubuntu 16.04 LTS, but used a custom-built PETSc. Here's how I built PETSc:

prefix="$HOME/gridpack"
PETSC_DIR="$prefix/petsc-3.7.5"
export PETSC_DIR
./configure \
    PETSC_ARCH=arch-ubuntu-real-opt \
    --with-prefix="$prefix" \
    --with-mpi=1 \
    --with-cc=mpicc \
    --with-fc=mpif90 \
    --with-cxx=mpicxx \
    --with-clanguage=c++ \
    --with-c++-support=1 \
    --with-cxx-dialect=C++11 \
    --CXX_CXXFLAGS=-std=gnu++11 \
    --with-c-support=0 \
    --with-fortran=1 \
    --with-pthread=0 \
    --with-scalar-type=real \
    --with-fortran-kernels=generic \
    --with-parmetis=1 \
    --download-parmetis=1 \
    --with-metis=1 \
    --download-metis=1 \
    --with-superlu_dist=1 \
    --download-superlu_dist=1 \
    --with-blas-lapack-dir=/usr \
    --with-suitesparse=1 \
    --download-suitesparse=1 \
    --with-mumps=0 \
    --with-scalapack=0 \
    --with-shared-libraries=0 \
    --with-x=0 \
    --with-mpirun=mpirun \
    --with-mpiexec=mpiexec \
    --with-debugging=0
make PETSC_DIR="$PETSC_DIR" PETSC_ARCH=arch-ubuntu-real-opt all
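
A quick way to confirm that the resulting PETSc actually picked up SuperLU_DIST (a sketch, assuming the PETSC_DIR and PETSC_ARCH values from the script above):

# petscconf.h should define PETSC_HAVE_SUPERLU_DIST if the download and build succeeded
grep PETSC_HAVE_SUPERLU_DIST "$PETSC_DIR/arch-ubuntu-real-opt/include/petscconf.h"

# PETSc's own smoke tests
make PETSC_DIR="$PETSC_DIR" PETSC_ARCH=arch-ubuntu-real-opt test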

The GridPACK configuration is in src/example_configuration.sh for host gridpackvm. I'll try to get these things included in the build instructions.
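
For reference, the cmake invocation in that script is roughly of the following shape; the paths here are placeholders and the exact variable set may differ, so treat src/example_configuration.sh as authoritative:

mkdir -p src/build && cd src/build
# If PETSc downloaded ParMETIS, it should end up under the PETSc arch directory,
# which is what PARMETIS_DIR points at here.
cmake -Wdev \
    -D PETSC_DIR:STRING="$HOME/gridpack/petsc-3.7.5" \
    -D PETSC_ARCH:STRING="arch-ubuntu-real-opt" \
    -D GA_DIR:STRING="$HOME/gridpack" \
    -D BOOST_ROOT:STRING="/usr" \
    -D PARMETIS_DIR:STRING="$HOME/gridpack/petsc-3.7.5/arch-ubuntu-real-opt" \
    -D MPI_CXX_COMPILER:STRING="mpicxx" \
    -D MPI_C_COMPILER:STRING="mpicc" \
    -D MPIEXEC:STRING="mpiexec" \
    -D CMAKE_INSTALL_PREFIX:PATH="$HOME/gridpack" \
    -D CMAKE_BUILD_TYPE:STRING=Release \
    ..
make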

I would recommend building PETSc yourself if you need parallel LU decomposition. Let us know if you try.

Thanks. Bill

yhwu commented 7 years ago

Thanks! All tests passed!

yhwu commented 7 years ago

I used the following to compile GA, reverting a commit to avoid the F77_INTEL.. error.

git clone https://github.com/GlobalArrays/ga.git
cd ga
git revert --no-edit cb162c97eae3817d38b5c3d01126d710c7cd3e98
./autogen.sh
prefix="/home/gridpack/gridpack"
./configure  --enable-cxx \
     --enable-i4 \
     --disable-f77 \
     --with-mpi \
     --prefix="$prefix" \
     --with-blas=no \
     --with-lapack=no \
     --enable-shared=yes \
     --enable-static=yes \
     MPICC=mpicc MPICXX=mpicxx MPIF77=mpif90 \
     MPIEXEC=mpiexec MPIRUN=mpirun
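
After configure, the build and install follow the usual autotools pattern (a sketch; the prefix is the one set above):

make
make install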