openPMD / openPMD-api

:floppy_disk: C++ & Python API for Scientific I/O
https://openpmd-api.readthedocs.io
GNU Lesser General Public License v3.0

cmake always runs "Running utility command for openPMD.py.cliTools" #1000

Closed · eschnett closed this issue 3 years ago

eschnett commented 3 years ago

Whenever I run cmake to build openPMD, even when nothing has changed, cmake outputs this:

$ cmake --build .
[1/1] Running utility command for openPMD.py.cliTools

and then it hesitates for a few seconds.

Does this utility need to run every time? Maybe it should run only once, or when the cmake settings change?

ax3l commented 3 years ago

Curious, that's a new Python-based CLI tool introduced in #904 that we ship: https://github.com/openPMD/openPMD-api/blob/34b24d92f262590f3da2e5d587e5904640fd76d8/CMakeLists.txt#L1158-L1175

So if at least one of openPMD_BUILD_CLI_TOOLS or openPMD_USE_PYTHON is off, it should not show up.
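
If you do not need the CLI tools for your use case, turning either option off at configure time should make that step disappear, e.g.:

$ cmake -DopenPMD_BUILD_CLI_TOOLS=OFF ..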

What system are you on (OS & version + CMake version) where you see the hiccup?

ax3l commented 3 years ago

It looks like we could add a MAIN_DEPENDENCY <source-file> to the add_custom_command calls that do such copies (3x in the build logic): https://stackoverflow.com/a/8438510/2719194 https://cmake.org/cmake/help/latest/command/add_custom_command.html
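
Roughly, the idea would be to tie each copy to an OUTPUT/MAIN_DEPENDENCY pair and let the utility target only depend on the generated file, so the command is skipped when nothing changed. A minimal sketch (the paths and file names here are placeholders, not our actual build logic):

# sketch only: _cli_src/_cli_dst are hypothetical paths
set(_cli_src ${CMAKE_CURRENT_SOURCE_DIR}/src/cli/ls.py)
set(_cli_dst ${CMAKE_BINARY_DIR}/openpmd-ls)

add_custom_command(
    OUTPUT ${_cli_dst}
    COMMAND ${CMAKE_COMMAND} -E copy ${_cli_src} ${_cli_dst}
    MAIN_DEPENDENCY ${_cli_src}    # re-run the copy only when the input changes
    COMMENT "Copying openPMD CLI tool"
)

# the utility target carries no COMMAND of its own; it is considered
# up to date as soon as the copied file is
add_custom_target(openPMD.py.cliTools ALL DEPENDS ${_cli_dst})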

eschnett commented 3 years ago

This is on macOS 11.4. Some software (e.g. cmake, ninja) is installed via MacPorts. Python is Anaconda3 Python, version 3.8.5. (The Julia bits are what I added; I'm working on Julia bindings.)

This is my configure output:

$ cmake -DCMAKE_BUILD_TYPE=Debug -DCMAKE_INSTALL_PREFIX=$HOME/openPMD-api -DJlCxx_DIR=$HOME/src/libcxxwrap-julia/build -G Ninja ..
-- Could NOT find MPI_C (missing: MPI_C_LIB_NAMES MPI_C_HEADER_DIR MPI_C_WORKS)
-- Could NOT find MPI_CXX (missing: MPI_CXX_LIB_NAMES MPI_CXX_HEADER_DIR MPI_CXX_WORKS)
-- Could NOT find MPI (missing: MPI_C_FOUND MPI_CXX_FOUND)
    Reason given by package: MPI component 'Fortran' was requested, but language Fortran is not enabled.

-- Using the single-header code from /Users/eschnett/src/openPMD-api/share/openPMD/thirdParty/json/single_include/
-- nlohmann-json: Using INTERNAL version '3.9.1'
-- HDF5 C compiler wrapper is unable to compile a minimal HDF5 program.
CMake Warning at /opt/local/share/cmake-3.19/Modules/FindHDF5.cmake:743 (message):
  HDF5 found for language C is not parallel but previously found language is
  parallel.
Call Stack (most recent call first):
  CMakeLists.txt:168 (find_package)

CMake Warning at CMakeLists.txt:210 (message):
  Found only parallel version of HDF5 but no MPI.  Either set
  openPMD_USE_MPI=ON to force using MPI or set openPMD_USE_HDF5=OFF to
  disable HDF5 or provide a serial install of HDF5.

  If you manually installed a serial version of HDF5 in a non-default path,
  add its installation prefix to the environment variable CMAKE_PREFIX_PATH
  to find it:
  https://cmake.org/cmake/help/latest/envvar/CMAKE_PREFIX_PATH.html

-- Can NOT find 'adios_config' - set ADIOS_ROOT, ADIOS_DIR or INSTALL_PREFIX, or check your PATH
-- Could NOT find ADIOS (missing: ADIOS_LIBRARIES ADIOS_INCLUDE_DIRS) (Required is at least version "1.13.1")
-- Could NOT find ADIOS2 (missing: ADIOS2_DIR)
-- pybind11 v2.6.2
-- pybind11: Using INTERNAL version 2.6.2
-- Found Julia executable: /Users/eschnett/bin/julia
-- Julia_VERSION_STRING: 1.6.2
-- Julia_INCLUDE_DIRS:   /Users/eschnett/julia-1.6/include/julia
-- Julia_LIBRARY_DIR:    /Users/eschnett/julia-1.6/lib
-- Julia_LIBRARY:        /Users/eschnett/julia-1.6/lib/libjulia.1.6.dylib
-- JULIA_HOME:           /Users/eschnett/julia-1.6/bin
-- Julia_LLVM_VERSION:   v11.0.1
-- Julia_WORD_SIZE:      64
-- Found JlCxx version 0.8.3 at /Users/eschnett/src/libcxxwrap-julia/build/lib
-- MPark.Variant: Using INTERNAL version '1.4.0'
-- Catch2: Using INTERNAL version '2.13.4'
-- Python LTO/IPO: ON
-- Note: Skipping example and tool runs (missing openPMD-example-datasets)
-- Note: run
    . /Users/eschnett/src/openPMD-api/share/openPMD/download_samples.sh
to add example files to samples/git-sample/ directory!

openPMD build configuration:
  library Version: 0.14.0
  openPMD Standard: 1.1.0
  C++ Compiler: AppleClang 12.0.0.12000032
    /Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/bin/c++

  Installation prefix: /Users/eschnett/openPMD-api
        bin: bin
        lib: lib
    include: include
      cmake: lib/cmake/openPMD
     python: lib/python3.8/site-packages

  Additionally, install following third party libraries:
    MPark.Variant: ON

  Build Type: Debug
  Library: shared
  CLI Tools: ON
  Examples: ON
  Testing: ON
  Invasive Tests: OFF
  Internal VERIFY: ON
  Build Options:
    MPI: OFF
    HDF5: OFF
    ADIOS1: OFF
    ADIOS2: OFF
    JULIA: OFF
    PYTHON: ON

-- Configuring done
-- Generating done
-- Build files have been written to: /Users/eschnett/src/openPMD-api/build
$ cmake --build .
[1/1] Running utility command for openPMD.py.cliTools
eschnett@redshift:~/src/openPMD-api/build (10:52:51)
$ cmake --build .
[1/1] Running utility command for openPMD.py.cliTools

eschnett commented 3 years ago

Upon further inspection, it seems that this "utility command" finishes quite quickly. Afterwards, ninja examines the source file to determine dependencies (i.e. a file called .ninja_deps), and this step takes up to a minute on my system. I don't know why that is, but unless the "utility command" invalidates a cmake/ninja dependency cache, it isn't the slow part.
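
For what it's worth, ninja can report why it considers a step out of date, which might help pin down which part is being re-triggered (run from the build directory):

$ ninja -d explain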

ax3l commented 3 years ago

Thank you for the details. Does changing the custom targets as in #1016 solve the issue?

eschnett commented 3 years ago

This problem has now been resolved.