Open rezaplasma opened 1 year ago
hi everyone, I am new to WarpX. I have a question about output formats: I have read that WarpX can write diagnostics data in the openPMD format (e.g. HDF5), but when I run (script).py with Python the results are not written as ().h5. I would just like to know how I can create HDF5 files; do I need to add a feature to my script or not? Thank you
Hi @rezaplasma,
Thanks for the question!
We implement multiple diagnostics in WarpX, ultimately moving fully to openPMD. Our openPMD output is either in .h5 (HDF5) or .bp (ADIOS2): https://warpx.readthedocs.io/en/latest/dataanalysis/formats.html
when I run (script).py with Python the results are not written as ().h5, I would just like to know how I can create HDF5 files; do I need to add a feature to my script or not?
Yes, there is an option you need to add until we change the default. In the PICMI Python input, the options you need to add are:
ParticleDiagnostic(
    # ...
    warpx_format='openpmd',
    warpx_openpmd_backend='h5',
)
and similarly for FieldDiagnostic.
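For reference, here is a minimal, self-contained sketch of where these options go in a PICMI script; the 1D grid, solver, and step counts below are arbitrary placeholders rather than values from your input:

from pywarpx import picmi

# Minimal 1D setup, only to show where the diagnostics options go;
# grid size, extent and intervals are arbitrary placeholders.
grid = picmi.Cartesian1DGrid(
    number_of_cells=[128],
    lower_bound=[-50.e-6],
    upper_bound=[50.e-6],
    lower_boundary_conditions=['periodic'],
    upper_boundary_conditions=['periodic'],
)
solver = picmi.ElectromagneticSolver(grid=grid, cfl=0.9)
sim = picmi.Simulation(solver=solver, max_steps=100)

field_diag = picmi.FieldDiagnostic(
    name='diag1',
    grid=grid,
    period=100,
    warpx_format='openpmd',       # write openPMD instead of AMReX plotfiles
    warpx_openpmd_backend='h5',   # request the HDF5 backend (.h5 files)
)
sim.add_diagnostic(field_diag)

# A ParticleDiagnostic takes the same two warpx_* options once species exist:
# picmi.ParticleDiagnostic(name='diag1', period=100, species=[electrons],
#                          warpx_format='openpmd', warpx_openpmd_backend='h5')

sim.step(100)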
Does this work for you? Let us know if it does or if you have further questions :)
hi @ax3l
Thanks for your answer. I will try it and let you know.
Are warpx_format='openpmd' and warpx_openpmd_backend='h5' arguments of the ParticleDiagnostic and FieldDiagnostic objects?
hi @ax3l, I have set these arguments, but I have again run into this error:
--- INFO : Writing openPMD file diags/Python_LaserAcceleration_1d_plt000000
terminate called after throwing an instance of 'openPMD::error::WrongAPIUsage'
what(): Wrong API usage: openPMD-api built without support for backend 'HDF5'.
Do I need to add the warpx_file_prefix option?
What's the issue?
Thank you
Hi @rezaplasma,
Thank you for the update!
How did you install WarpX? Do you mind sharing the exact details (setup of dependencies, CMake commands and output)? It looks like HDF5 was not provided during compilation, and thus cannot be used.
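As a side note, if you have the openPMD-api Python bindings installed, openpmd_api.variants lists which backends that particular build supports. Be aware this inspects the standalone openpmd_api package, which is generally a separate build from the openPMD-api compiled into WarpX, so treat it only as a rough sanity check:

import openpmd_api as io

# Print the backend/feature support of this openpmd_api installation;
# expect keys such as 'hdf5', 'adios2' and 'mpi' mapping to True/False.
# Note: this is the standalone Python package, not the copy built into WarpX.
for feature, available in io.variants.items():
    print(f"{feature}: {available}")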
Best, Axel
Hi @ax3l
Thanks for replying. This is the error that I got:
--- INFO : Writing openPMD file ./Python_LaserAcceleration_1d_plt000000
terminate called after throwing an instance of 'openPMD::error::WrongAPIUsage'
what(): Wrong API usage: openPMD-api built without support for backend 'HDF5'.
SIGABRT
/usr/bin/addr2line: '/home/user/warp_x/warpx': No such file
/usr/bin/addr2line: '/home/user/warp_x/warpx': No such file
/usr/bin/addr2line: '/home/user/warp_x/warpx': No such file
/usr/bin/addr2line: '/home/user/warp_x/warpx': No such file
.
.
.
I have installed WarpX using the PyPI package on 2 local computers, but the error was the same.
First I used the apt instructions to install the WarpX dependencies, then used these commands respectively:
1) python3 -m pip install -U pip setuptools wheel
2) python3 -m pip install -U cmake
3) python3 -m pip wheel -v git+... WarpX.git
and finally
4) python3 -m pip install *.whl
@ax3l Is it possible to guide me? Solving this issue has taken more than 2 weeks.
Hi @rezaplasma,
Thanks for the details.
So, as mentioned above, you lack some optional WarpX dependencies, the right variant of HDF5 in particular.
Since you followed the Debian/Ubuntu apt logic, you have everything in place for an MPI-based install, in particular parallel HDF5.
Try this to switch MPI on (default is OFF currently for the Python bindings), which will then find HDF5:
WARPX_MPI=ON python3 -m pip wheel -v git+https://github.com/ECP-WarpX/WarpX.git
python3 -m pip install --force-reinstall *.whl
During compilation, you can already confirm whether HDF5 was found by checking the table at the end of the CMake output. It should read something like this:
...
openPMD build configuration:
library Version: 0.15.1
openPMD Standard: 1.1.0
C++ Compiler: GNU 11.3.0
/usr/bin/c++
Installation: OFF
Build Type: Release
Library: static
CLI Tools: OFF
Examples: OFF
Testing: OFF
Invasive Tests: OFF
Internal VERIFY: ON
Build Options:
MPI: ON
HDF5: ON <--------------------
ADIOS1: OFF
ADIOS2: ON
PYTHON: OFF
CUDA_EXAMPLES: OFF
...
WarpX build configuration:
Version: 23.06 ()
C++ Compiler: GNU 11.3.0
/usr/bin/c++
Installation prefix: /usr/local
bin: bin
lib: lib
include: include
cmake: lib/cmake/WarpX
Build type: Release
Build options:
APP: ON
ASCENT: OFF
COMPUTE: OMP
DIMS: 3
Embedded Boundary: OFF
GPU clock timers: OFF
IPO/LTO: OFF
LIB: ON
MPI: ON
PSATD: OFF
PRECISION: DOUBLE
PARTICLE PRECISION: DOUBLE
OPENPMD: ON
QED: ON
QED table generation: OFF
SENSEI: OFF
-- Configuring done
-- Generating done
-- Build files have been written to: ...
hi @ax3l
Thanks for replying. Dr. ax3l, I have tried the command mentioned above, but HDF5 is not switched on, as follows:
-- Downloading openPMD-api ...
-- openPMD-api repository: https://github.com/openPMD/openPMD-api.git (0.15.1)
-- Found MPI: TRUE (found version "3.1") found components: CXX
-- Using the single-header code from /tmp/pip-req-build-3suzx7_k/build/temp.linux-x86_64-cpython-39/1/_deps/fetchedopenpmd-src/share/openPMD/thirdParty/json/single_include/
-- nlohmann-json: Using INTERNAL version '3.9.1'
-- toml11: Using INTERNAL version '3.7.1'
-- HDF5 C compiler wrapper is unable to compile a minimal HDF5 program.
-- Could NOT find HDF5 (missing: HDF5_LIBRARIES HDF5_INCLUDE_DIRS C) (found version "")
-- Could NOT find ADIOS2 (missing: ADIOS2_DIR)
openPMD build configuration:
library Version: 0.15.1
openPMD Standard: 1.1.0
C++ Compiler: GNU 11.3.0
/usr/bin/c++
Installation: OFF
Build Type: Release
Library: static
CLI Tools: OFF
Examples: OFF
Testing: OFF
Invasive Tests: OFF
Internal VERIFY: ON
Build Options:
MPI: ON
HDF5: OFF <------------
ADIOS1: OFF
ADIOS2: OFF
PYTHON: OFF
CUDA_EXAMPLES: OFF
-- Found Git: /usr/bin/git (found version "2.34.1")
WarpX build configuration:
Version: 23.06 ()
C++ Compiler: GNU 11.3.0
/usr/bin/c++
Installation prefix: /usr/local
bin: bin
lib: lib
include: include
cmake: lib/cmake/WarpX
Build type: Release
Build options:
APP: OFF
ASCENT: OFF
COMPUTE: OMP
DIMS: 1
Embedded Boundary: OFF
GPU clock timers: OFF
IPO/LTO: OFF
LIB: ON (static)
MPI: ON
PSATD: OFF
PRECISION: DOUBLE
PARTICLE PRECISION: DOUBLE
OPENPMD: ON
QED: ON
QED table generation: OFF
SENSEI: OFF
and at the end of the run, I got the following error:
/home/user/anaconda3/bin/mpicc: line 301: x86_64-conda_cos6-linux-gnu-cc: command not found
failure.
removing: _configtest.c _configtest.o
error: Cannot compile MPI programs. Check your configuration!!!
error: subprocess-exited-with-error
× Building wheel for mpi4py (pyproject.toml) did not run successfully.
│ exit code: 1
╰─> See above for output.
note: This error originates from a subprocess, and is likely not a problem with pip.
full command: /home/user/anaconda3/bin/python3 /home/user/anaconda3/lib/python3.9/site-packages/pip/_vendor/pyproject_hooks/_in_process/_in_process.py build_wheel /tmp/tmps4nifozw
cwd: /tmp/pip-wheel-kpqww6cq/mpi4py_cfd2164865fa4992bdab2a5a92559979
Building wheel for mpi4py (pyproject.toml) ... error
ERROR: Failed building wheel for mpi4py
Successfully built pywarpx
Failed to build mpi4py
ERROR: Failed to build one or more wheels
@ax3l
I installed WarpX on 2 types of systems (a local machine and a cluster) with the PyPI package, but the HDF5 option is not switched on for either.
I can't really figure out what goes wrong!
Hi @rezaplasma,
That is unusual indeed, but we can provide parallel HDF5 one way or another. Which operating system is this, Ubuntu 22.04? Let me double check this in a minimal docker container to see if any package might be missing.
If this turns out to be a problem with apt, then I will show you how to quickly compile HDF5 yourself for your system.
For the cluster, what operating system is that and do you have a software module that you can load for HDF5?
I tried this in a minimal Ubuntu and can find HDF5, assuming that is your local system:
docker run -it ubuntu:22.04
https://warpx.readthedocs.io/en/latest/install/dependencies.html#apt-debian-ubuntu-linux
apt update
apt install build-essential ccache cmake g++ git libfftw3-mpi-dev libfftw3-dev libhdf5-openmpi-dev libopenmpi-dev pkg-config python3 python3-matplotlib python3-numpy python3-pandas python3-pip python3-scipy python3-venv
git clone https://github.com/ECP-WarpX/WarpX.git $HOME/src/warpx
cd $HOME/src/warpx
cmake -S . -B build -DWarpX_DIMS="1;2;3" -DWarpX_LIB=ON -DWarpX_PSATD=ON -DWarpX_LIB=ON
...
-- openPMD-api repository: https://github.com/openPMD/openPMD-api.git (0.15.1)
-- Found MPI: TRUE (found version "3.1") found components: CXX
-- Using the single-header code from /root/src/warpx/build/_deps/fetchedopenpmd-src/share/openPMD/thirdParty/json/single_include/
-- nlohmann-json: Using INTERNAL version '3.9.1'
-- toml11: Using INTERNAL version '3.7.1'
-- Could NOT find ADIOS2 (missing: ADIOS2_DIR)
openPMD build configuration:
library Version: 0.15.1
openPMD Standard: 1.1.0
C++ Compiler: GNU 11.3.0
/usr/bin/c++
Installation: OFF
Build Type: Release
Library: static
CLI Tools: OFF
Examples: OFF
Testing: OFF
Invasive Tests: OFF
Internal VERIFY: ON
Build Options:
MPI: ON
HDF5: ON
ADIOS1: OFF
ADIOS2: OFF
PYTHON: OFF
CUDA_EXAMPLES: OFF
...
and same for
WARPX_MPI=ON python3 -m pip wheel -v git+https://github.com/ECP-WarpX/WarpX.git
So something is different on your system, in particular there is an issue with the installed HDF5:
-- HDF5 C compiler wrapper is unable to compile a minimal HDF5 program.
-- Could NOT find HDF5 (missing: HDF5_LIBRARIES HDF5_INCLUDE_DIRS C) (found version "")
Can you please provide:
lsb_release -a
cmake --version
which h5pcc
which h5ls
h5ls --version
I would also need the complete CMake configuration output and the linked files it presents when you run:
git clone https://github.com/ECP-WarpX/WarpX.git $HOME/src/warpx
cd $HOME/src/warpx
cmake -S . -B build -DWarpX_DIMS="1;2;3" -DWarpX_LIB=ON -DWarpX_PSATD=ON -DWarpX_LIB=ON -DopenPMD_USE_HDF5=ON
Hi @ax3l
Thank you for your attention. I will share the details.
Dr. ax3l, I have installed WarpX on 3 distributions of Linux (22.04, 23.04 and 18.04), but the trouble was the same; the details mentioned above are from Ubuntu 22.04.
It should be mentioned that this didn't happen when installing with the conda package; with conda I could get HDF5 output.
I have also worked with the FBpic code, and this issue did not happen there.
Could you please point me to the exact tutorial that you used for installing WarpX with the PyPI package?
Hi @rezaplasma,
Thanks. Yes, I will need the details mentioned above to help you further :) Start with the local PC.
Is it possible you have conda activated by default? You might want to explore activating conda manually, as we recommend here: conda config --set auto_activate_base false
That avoids interference.
Update: After setting this, you need to close the terminal and reopen it for the change to become active.
On your local computer, there is no need to rely on apt; you can also use conda as linked above. On HPC, on the other hand, we want to rely on modules and the system-provided MPI.
hi @ax3l
Thanks, I tried conda config --set auto_activate_base false, but it did not change anything.
Here are the details you asked for (local system):
(base) user@user-G41MT-D3:~$ lsb_release -a
No LSB modules are available.
Distributor ID: Ubuntu
Description: Ubuntu 22.04.2 LTS
Release: 22.04
Codename: jammy
(base) user@user-G41MT-D3:~$ cmake --version
cmake version 3.26.4
CMake suite maintained and supported by Kitware (kitware.com/cmake).
(base) user@user-G41MT-D3:~$ which h5pcc
/usr/bin/h5pcc
(base) user@user-G41MT-D3:~$ h5pcc --version
gcc (Ubuntu 11.3.0-1ubuntu1~22.04.1) 11.3.0
Copyright (C) 2021 Free Software Foundation, Inc.
This is free software; see the source for copying conditions. There is NO
warranty; not even for MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE
(base) user@user-G41MT-D3:~$ h5lc --version
Command 'h5lc' not found, did you mean:
command 'h5cc' from deb hdf5-helpers (1.10.7+repack-4ubuntu2)
command 'h5ls' from deb hdf5-tools (1.10.7+repack-4ubuntu2)
command 'h5fc' from deb hdf5-helpers (1.10.7+repack-4ubuntu2)
Try: sudo apt install <deb name>
I have also installed WarpX with CMake before. Although HDF5 did not get switched on, the warpx target was still built by cmake -S . -B build -DWarpX_DIMS="1;2" and ended up in the src/warpx/build/bin path.
But now, after I deleted the src folder and reconfigured, I get this error:
(base) user@user-G41MT-D3:~/src/warpx$ cmake -S . -B build -DWarpX_DIMS="1" -DWarpX_LIB=ON -DWarpX_PSATD=ON -DWarpX_LIB=ON -DopenPMD_USE_HDF5=ON
-- The C compiler identification is GNU 11.3.0
-- The CXX compiler identification is GNU 11.3.0
-- Detecting C compiler ABI info
-- Detecting C compiler ABI info - done
-- Check for working C compiler: /usr/bin/cc - skipped
-- Detecting C compile features
-- Detecting C compile features - done
-- Detecting CXX compiler ABI info
-- Detecting CXX compiler ABI info - done
-- Check for working CXX compiler: /usr/bin/c++ - skipped
-- Detecting CXX compile features
-- Detecting CXX compile features - done
CMake Deprecation Warning at CMakeLists.txt:23 (cmake_policy):
The OLD behavior for policy CMP0104 will be removed from a future version
of CMake.
The cmake-policies(7) manual explains that the OLD behaviors of all
policies are deprecated and that a policy should be set to OLD only under
specific short-term circumstances. Projects should be ported to the NEW
behavior and not rely on setting a policy to OLD.
-- Found CCache: /usr/bin/ccache
-- Downloading AMReX ...
-- AMReX repository: https://github.com/AMReX-Codes/amrex.git (d9bae8ce9e69a962154a9340a0fb8ae9895c1fde)
CMake Deprecation Warning at build/_deps/fetchedamrex-src/CMakeLists.txt:26 (cmake_policy):
The OLD behavior for policy CMP0104 will be removed from a future version
of CMake.
The cmake-policies(7) manual explains that the OLD behaviors of all
policies are deprecated and that a policy should be set to OLD only under
specific short-term circumstances. Projects should be ported to the NEW
behavior and not rely on setting a policy to OLD.
-- CMake version: 3.26.4
-- AMReX installation directory: /usr/local
-- Build type set by user to 'Release'.
-- Building AMReX with AMReX_SPACEDIM = 1
-- Configuring AMReX with the following options enabled:
-- AMReX_PRECISION = DOUBLE
-- AMReX_MPI
-- AMReX_MPI_THREAD_MULTIPLE
-- AMReX_OMP
-- AMReX_LINEAR_SOLVERS
-- AMReX_PARTICLES
-- AMReX_PARTICLES_PRECISION = DOUBLE
-- AMReX_PIC
-- AMReX_TINY_PROFILE
-- Performing Test CMAKE_HAVE_LIBC_PTHREAD
-- Performing Test CMAKE_HAVE_LIBC_PTHREAD - Success
-- Found Threads: TRUE
-- Found MPI_C: /home/user/anaconda3/lib/libmpi.so (found version "3.1")
-- Found MPI_CXX: /home/user/anaconda3/lib/libmpicxx.so (found version "3.1")
-- Found MPI: TRUE (found version "3.1") found components: C CXX
-- Found OpenMP_CXX: -fopenmp (found version "4.5")
-- Found OpenMP: TRUE (found version "4.5") found components: CXX
-- AMReX configuration summary:
-- Build type = Release
-- Install directory = /usr/local
-- C++ compiler = /usr/bin/c++
-- C++ defines = -DAMREX_SPACEDIM=1
-- C++ flags = -O3 -DNDEBUG -fopenmp -Werror=return-type
-- C++ include paths = -I/home/user/src/warpx/build/_deps/fetchedamrex-src/Src/Base -I/home/user/src/warpx/build/_deps/fetchedamrex-src/Src/Base/Parser -I/home/user/src/warpx/build/_deps/fetchedamrex-src/Src/Boundary -I/home/user/src/warpx/build/_deps/fetchedamrex-src/Src/AmrCore -I/home/user/src/warpx/build/_deps/fetchedamrex-src/Src/LinearSolvers/MLMG -I/home/user/src/warpx/build/_deps/fetchedamrex-src/Src/Particle -I/home/user/anaconda3/include
-- Link line = /home/user/anaconda3/lib/libmpi.so /home/user/anaconda3/lib/libmpicxx.so /usr/lib/gcc/x86_64-linux-gnu/11/libgomp.so /usr/lib/x86_64-linux-gnu/libpthread.a
-- AMReX: Using version '23.06' (23.06-6-gd9bae8ce9e69)
-- Downloading PICSAR ...
-- PICSAR repository: https://github.com/ECP-WarpX/picsar.git (1903ecfff51a31a321d39790af90d8520c10537e)
-- Downloading openPMD-api ...
-- openPMD-api repository: https://github.com/openPMD/openPMD-api.git (0.15.1)
-- Found MPI: TRUE (found version "3.1") found components: CXX
-- Using the single-header code from /home/user/src/warpx/build/_deps/fetchedopenpmd-src/share/openPMD/thirdParty/json/single_include/
-- nlohmann-json: Using INTERNAL version '3.9.1'
-- toml11: Using INTERNAL version '3.7.1'
-- HDF5 C compiler wrapper is unable to compile a minimal HDF5 program.
CMake Error at /home/user/anaconda3/lib/python3.9/site-packages/cmake/data/share/cmake-3.26/Modules/FindPackageHandleStandardArgs.cmake:230 (message):
Could NOT find HDF5 (missing: HDF5_LIBRARIES HDF5_INCLUDE_DIRS C) (found
version "")
Call Stack (most recent call first):
/home/user/anaconda3/lib/python3.9/site-packages/cmake/data/share/cmake-3.26/Modules/FindPackageHandleStandardArgs.cmake:600 (_FPHSA_FAILURE_MESSAGE)
/home/user/anaconda3/lib/python3.9/site-packages/cmake/data/share/cmake-3.26/Modules/FindHDF5.cmake:1001 (find_package_handle_standard_args)
build/_deps/fetchedopenpmd-src/CMakeLists.txt:328 (find_package)
-- Configuring incomplete, errors occurred!
This issue is really frustrating, and I don't know what I have to do.
hi @ax3l
I could eventually solve this issue.
If I'm not mistaken, the conda package caused this trouble. I tried to install on another system without this package, and HDF5 is found with MPI = ON by using the command mentioned above, i.e.:
WARPX_MPI=ON python3 -m pip wheel -v git+https://github.com/ECP-WarpX/WarpX.git
If possible, please check whether this is correct and let me know more.
Anyway, I would be grateful if you could answer some of my questions, Dr. Axel:
1) Does MPI have to be switched ON to find HDF5?
2) What do I have to do to enable other options (e.g. ADIOS) now that WarpX is installed?
3) Will this command (WARPX_MPI=ON) fail when we install WarpX dependencies without MPI?
Thank you
thanks, I tried conda config --set auto_activate_base false, but it did not change anything
I forgot to mention: After setting this, you need to close the terminal and reopen it for the change to become active.
$ h5lc --version
Should read $ h5ls --version
:)
I could eventually solve this issue. [...] the conda package makes this trouble
Glad to hear! Yes, I think the default activation of Conda's base environment (you can see it active at the left of every new line of your command-line prompt) caused this issue by providing some additional, but incompatible, packages.
Deactivating it and reopening the terminal indeed solves it, excellent.
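For completeness, here is a small sketch to confirm the HDF5 series can be read back; the file pattern is an assumption, so adjust it to whatever path and prefix your diagnostic actually writes:

import openpmd_api as io

# Open the openPMD series written by WarpX; the path/pattern below is assumed
# and should be replaced by the actual output location of your diagnostic.
series = io.Series("diags/diag1/openpmd_%T.h5", io.Access.read_only)
for step, it in series.iterations.items():
    print(f"iteration {step}: meshes = {list(it.meshes)}")
del series  # ensure the file handles are closed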
does MPI have to be switched ON to find hdf5?
HDF5 can be installed in two ways: with or without MPI support. If you installed HDF5 with MPI support, then you also need to turn MPI ON for WarpX to use it. Vice versa, if you installed HDF5 without MPI support, then WarpX can only find it if you compile WarpX also with MPI OFF.
I improved the details on this in our documentation now: https://warpx.readthedocs.io/en/latest/install/dependencies.html#apt-debian-ubuntu-linux
What do I have to do to enable other options (e.g. ADIOS) now that WarpX is installed?
ADIOS2 is optional and can be used instead of HDF5. You will mostly need it if you run on HPC systems with many MPI ranks, so you can skip it on desktop if HDF5 is running well there.
On HPC, you can ask system support for a module, I suggest ADIOS2 v2.8.3 at this point in time, or compile it yourself with these steps:
# c-blosc (I/O compression)
git clone -b v1.21.1 https://github.com/Blosc/c-blosc.git src/c-blosc
rm -rf src/c-blosc-build
cmake -S src/c-blosc -B src/c-blosc-build -DBUILD_TESTS=OFF -DBUILD_BENCHMARKS=OFF -DDEACTIVATE_AVX2=OFF -DCMAKE_INSTALL_PREFIX=$HOME/sw/c-blosc-1.21.1
cmake --build src/c-blosc-build --target install --parallel 12
# ADIOS2
git clone -b v2.8.3 https://github.com/ornladios/ADIOS2.git src/adios2
rm -rf src/adios2-build
cmake -S src/adios2 -B src/adios2-build -DADIOS2_USE_Blosc=ON -DADIOS2_USE_Fortran=OFF -DADIOS2_USE_Python=OFF -DADIOS2_USE_ZeroMQ=OFF -DCMAKE_INSTALL_PREFIX=$HOME/sw/adios2-2.8.3
cmake --build src/adios2-build --target install -j 12
For these installs to be found by WarpX, you need these environment hints:
export CMAKE_PREFIX_PATH=$HOME/sw/c-blosc-1.21.1:$CMAKE_PREFIX_PATH
export CMAKE_PREFIX_PATH=$HOME/sw/adios2-2.8.3:$CMAKE_PREFIX_PATH
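Once ADIOS2 is found at build time, my understanding is that selecting it from PICMI is just a matter of changing the backend string in the diagnostics; the 'bp' value below follows the openPMD file-ending convention for ADIOS2 and is an assumption on my side, as is the placeholder 1D grid:

from pywarpx import picmi

# Same two diagnostics options as above, but requesting the ADIOS2 backend.
grid = picmi.Cartesian1DGrid(
    number_of_cells=[128],
    lower_bound=[-50.e-6],
    upper_bound=[50.e-6],
    lower_boundary_conditions=['periodic'],
    upper_boundary_conditions=['periodic'],
)
field_diag = picmi.FieldDiagnostic(
    name='diag1',
    grid=grid,
    period=100,
    warpx_format='openpmd',
    warpx_openpmd_backend='bp',   # ADIOS2 (.bp files) instead of 'h5'
)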
Will this command (WARPX_MPI=ON) fail when we install warpx dependencies without MPI?
See the first answer. If you mismatch the HDF5 variant, it will simply not be picked up. If you want the build to fail when HDF5 is accidentally mismatched, then add the -DopenPMD_USE_HDF5=ON option.
hi @ax3l
Sorry for the late reply, and thank you for your comprehensive explanation.
Sorry, I think I could not convey my meaning well. What I mean, in general, is: after installing the WarpX package, how can I compile a desired option (or additional options) that was not found during installation?
I mean the options listed at this link: https://warpx.readthedocs.io/en/latest/install/cmake.html
Hi @rezaplasma,
With the instructions above, you should be able to activate all the options you need at once in a single build, creating four executables for 1D/2D/RZ/3D simulations and a single Python module, which also supports all four geometries.
Which other options in that link would you like to activate that I am overlooking right now? :)
hi @ax3l
I need the ADIOS module (on the cluster), and your instructions are perfect.
But I'm a little puzzled: WarpX was installed with the PyPI package, but the tutorials mentioned above use CMake.
How can I also provide these options for the Python module?