dftbplus / dftbplus

DFTB+ general package for performing fast atomistic simulations
http://www.dftbplus.org

All tests fail with the TODO message #1271

Closed yurivict closed 1 year ago

yurivict commented 1 year ago

Describe the bug

  7/274 Test   #7: dftb+_extpot/CH4_scc_net ..................***Failed    0.65 sec
extpot/CH4_scc_net:  TODO.

==============================================================================
TEST SUMMARY
------------------------------------------------------------------------------
Match:
    None

Not run:
    extpot/CH4_scc_net
------------------------------------------------------------------------------
Status: FAIL

        Start   8: dftb+_deltadftb/indigo
  8/274 Test   #8: dftb+_deltadftb/indigo ....................***Failed    0.75 sec
deltadftb/indigo:    TODO.

==============================================================================
TEST SUMMARY
------------------------------------------------------------------------------
Match:
    None

Not run:
    deltadftb/indigo
------------------------------------------------------------------------------
Status: FAIL

        Start   9: dftb+_helical/10-0Ctube
  9/274 Test   #9: dftb+_helical/10-0Ctube ...................***Failed    0.43 sec
helical/10-0Ctube:   TODO.

==============================================================================
TEST SUMMARY
------------------------------------------------------------------------------
Match:
    None

Not run:
    helical/10-0Ctube
------------------------------------------------------------------------------
Status: FAIL

        Start  10: dftb+_helical/10-0CtubeC_10
 10/274 Test  #10: dftb+_helical/10-0CtubeC_10 ...............***Failed    0.42 sec
helical/10-0CtubeC_10:   TODO.

==============================================================================
TEST SUMMARY
------------------------------------------------------------------------------
Match:
    None

Not run:
    helical/10-0CtubeC_10
------------------------------------------------------------------------------
Status: FAIL

        Start  11: dftb+_helical/10-0CtubeC_10_origin
 11/274 Test  #11: dftb+_helical/10-0CtubeC_10_origin ........***Failed    0.44 sec
helical/10-0CtubeC_10_origin:    TODO.

==============================================================================
TEST SUMMARY
------------------------------------------------------------------------------
Match:
    None

Not run:
    helical/10-0CtubeC_10_origin
------------------------------------------------------------------------------
Status: FAIL

        Start  12: dftb+_helical/10-0CtubeC_10_sampled

To Reproduce

Run the tests.

Version: 23.1, FreeBSD 13.2

bhourahine commented 1 year ago

@yurivict Could you please confirm which compiler was used (Clang or GCC) and whether there are any differences in the numerical libraries?

aradi commented 1 year ago

@yurivict Thanks a lot for porting and testing on FreeBSD; unfortunately, we do not have FreeBSD in our test workflow (yet?). In addition to @bhourahine's question, could you also have a look at one of the test outputs, e.g. test/app/dftb+/non-scc/GaAs_2/ within the build folder? The relevant files are output and stderror.log.

yurivict commented 1 year ago

The compiler is clang-15.

Libraries used: blas-3.11.0 lapack-3.11.0 openblas-0.3.20

yurivict commented 1 year ago

work/.build/test/app/dftb+/non-scc/GaAs_2/output is empty.

$ cat work/.build/test/app/dftb+/non-scc/GaAs_2/stderror.log
Abort(335105285) on node 0 (rank 0 in comm 0): Fatal error in PMPI_Comm_size: Invalid communicator, error stack:
PMPI_Comm_size(109): MPI_Comm_size(comm=0x0, size=0x825eda7b4) failed
PMPI_Comm_size(66).: Invalid communicator
        0.25 real         0.05 user         0.02 sys
yurivict commented 1 year ago

The MPICH MPI libraries are used: mpich-3.4.3.

yurivict commented 1 year ago

With WITH_MPI=OFF it links against OpenMPI for some reason, and the tests succeed.

So there are 2 problems:

  1. Tests fail with mpich
  2. It is unclear how to disable MPI entirely (with WITH_MPI=OFF it still links against OpenMPI)
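For the second problem, one possible workaround sketch: CMake offers a generic switch to suppress discovery of a package altogether. Assuming MPI is picked up via CMake's standard FindMPI module (an assumption; it may instead come in through a dependency's own link flags), disabling that module should keep any MPI library out of the link. Note that CMAKE_DISABLE_FIND_PACKAGE_MPI is a generic CMake mechanism, not a dftb+ option:

```shell
# Configuration sketch (assumption: MPI is discovered via CMake's FindMPI
# module; CMAKE_DISABLE_FIND_PACKAGE_MPI is standard CMake, not dftb+).
cmake -B _build -DWITH_MPI=OFF -DCMAKE_DISABLE_FIND_PACKAGE_MPI=TRUE .
```

Afterwards, inspecting the binary with something like `ldd _build/app/dftb+/dftb+ | grep -i mpi` (path adjusted to the actual build tree) would confirm whether any MPI implementation is still being linked.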
yurivict commented 1 year ago

I believe that this issue is caused by a mix-up of MPI implementations between dftbplus and some of its dependencies.

Thank you for your help!