Closed — ltalirz closed this issue 3 years ago
Hi Leopold, I think it would be great! Thanks for trying.
For HDF5, for sure you need to add dependencies on netcdf and netcdf-fortran. If yambo is compiled with --enable-hdf5-par-io
(which I'd suggest), then the HDF5 library must have support for parallel I/O.
The linking of the libraries needs to be specified via configure. (We should work on automatic detection, but so far it does not work well, unfortunately.) A few tips:
If HDF5 is installed in a path $PATH, which contains the lib and include folders, then you can use --with-hdf5-path=$PATH
Example
--with-hdf5-path="/usr/local/hdf5/" --with-netcdf-path="/usr/local/netcdf/"
If $libdir and $includedir are independent, then you can use --with-hdf5-libdir=$libdir and --with-hdf5-includedir=$includedir
The same holds for netcdf (netcdf-fortran is detected automatically if it shares the same folder as netcdf).
Example
--with-hdf5-libdir="/usr/lib/x86_64-linux-gnu/" --with-hdf5-includedir="/usr/include" \
--with-netcdf-libdir="/usr/lib/x86_64-linux-gnu/" --with-netcdf-includedir="/usr/include"
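Putting the tips above together, a combined configure call might look like the sketch below. The paths are illustrative placeholders, not real install locations, and the flags are assembled into a variable only so they can be inspected before the actual run:

```shell
# Sketch of a combined yambo configure call; both directory paths below
# are illustrative placeholders, not actual install locations.
HDF5_DIR="/usr/local/hdf5"      # assumed HDF5 prefix with lib/ and include/
NETCDF_DIR="/usr/local/netcdf"  # assumed netcdf prefix (netcdff alongside)
CONFIGURE_FLAGS="--enable-hdf5-par-io"
CONFIGURE_FLAGS="$CONFIGURE_FLAGS --with-hdf5-path=$HDF5_DIR"
CONFIGURE_FLAGS="$CONFIGURE_FLAGS --with-netcdf-path=$NETCDF_DIR"
echo "$CONFIGURE_FLAGS"
# From the yambo source tree, the real invocation would then be:
# ./configure $CONFIGURE_FLAGS
```

Remember that with --enable-hdf5-par-io the HDF5 pointed to by $HDF5_DIR must itself have been built with parallel I/O support.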
Similarly for other libraries. I'd suggest linking an external fftw in place of the internal one.
It would also be nice to have yambo compiled with --enable-slepc-linalg. This will introduce extra dependencies on slepc and petsc, which again can be linked with the above options.
Last, in case linking the external libraries does not work, it is possible to compile yambo with the option --with-extlibs-path=$MY_PATH_FOR_YAMBO_LIBS. The folder $MY_PATH_FOR_YAMBO_LIBS should be an empty directory, where the installation process will put all the internally compiled libraries (such as netcdf and hdf5). If you keep the same option every time yambo is recompiled, the libraries will be correctly linked.
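The fallback just described can be sketched as follows (the directory name is just an example I made up; any empty, reusable folder works):

```shell
# Fallback sketch: point yambo at an (initially empty) folder where it
# will install its internally compiled libraries. The directory name
# below is illustrative, not prescribed by yambo.
MY_PATH_FOR_YAMBO_LIBS="$HOME/yambo_ext_libs"
mkdir -p "$MY_PATH_FOR_YAMBO_LIBS"
echo "$MY_PATH_FOR_YAMBO_LIBS"
# From the yambo source tree, reusing the same path on every rebuild:
# ./configure --with-extlibs-path="$MY_PATH_FOR_YAMBO_LIBS"
```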
I do not know much about conda. I know @nicspalla is working on a docker version.
Thanks @sangallidavide!
I've added the options you suggested and the build already seems to proceed quite a bit further. It's now stuck at
>>>[Making qe_pseudo]<<<
<command-line>: warning: ISO C99 requires whitespace after the macro name
Makefile:102: *** missing separator (did you mean TAB instead of 8 spaces?). Stop.
make[2]: Entering directory '$SRC_DIR/lib/qe_pseudo'
make[2]: Leaving directory '$SRC_DIR/lib/qe_pseudo'
make[1]: *** [config/mk/actions/compile_yambo.mk:2: yambo] Error 2
make[1]: Leaving directory '$SRC_DIR'
yambo build failed
Nice. Can you also add --enable-open-mp?
I do not see the line where the configure of yambo is called. It seems the external NETCDF and HDF5 are detected fine; I understand they are in the $PREFIX path(?). If you didn't already, try the same with fftw, slepc, and petsc:
--with-fft-path="$PREFIX" \
--with-slepc-path="$PREFIX" \
--with-petsc-path="$PREFIX"
For the error with lib/qe_pseudo, I suspect something is not fine with the conda preprocessors (FPP and/or CPP).
See these two lines in the yambo report:
# [ CPP ] $BUILD_PREFIX/bin/x86_64-conda-linux-gnu-cpp -DNDEBUG -D_FORTIFY_SOURCE=2 -O2 -isystem $PREFIX/include
# [ FPP ] $BUILD_PREFIX/bin/x86_64-conda-linux-gnu-gfortran -E -P -cpp
Can you try something like:
CPP="gcc -E -P"
FPP="gfortran -E -P -cpp"
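As a quick, standalone sanity check that a plain `gcc -E -P` behaves the way yambo's preprocessing step needs (macros expanded, the directive lines themselves removed), one can run it on a throwaway file; `pp_check.c` and its contents below are invented on the spot:

```shell
# Minimal preprocessor sanity check: after "gcc -E -P", the macro should
# be expanded and the #define line should be gone from the output.
# pp_check.c is a throwaway file created just for this check.
cat > pp_check.c <<'EOF'
#define GREETING hello
GREETING
EOF
OUT="$(gcc -E -P pp_check.c)"
echo "$OUT"
rm -f pp_check.c
```

If the expanded word does not come out cleanly, the same check against the conda-provided `x86_64-conda-linux-gnu-cpp` may help pin down which flag is misbehaving.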
Thanks @sangallidavide, that resulted in further progress. Now hanging at the bse build:
>>>[Making bse]<<<
<command-line>: warning: ISO C99 requires whitespace after the macro name
make[2]: Entering directory '$SRC_DIR/src/bse'
cd $SRC_DIR/src/bse; $SRC_DIR/sbin/moduledep.sh K_blocks.o K_driver.o K_IP.o K_Transitions_setup.o K_WF_phases.o K.o K_compress.o K_correlation_collisions.o K_exchange_collisions.o K_correlation_kernel.o K_exchange_kernel.o K_solvers.o K_Haydock.o K_Haydock_response.o K_screened_interaction.o K_inversion_do_it_full.o EPS_via_perturbative_inversion.o K_inversion_driver.o K_diagonal.o K_inversion_Lo.o K_inversion_restart.o K_inversion_engine.o K_diago_driver.o K_diago_non_hermitian_residuals.o K_diago_hermitian_residuals.o K_diago_perturbative.o K_diago_response_functions.o K_eps_interpolate.o K_output_file.o K_multiply_by_V.o K_dot_product.o K_components_folded_in_serial_arrays.o K_stored_in_a_big_matrix.o K_observables.o K_diago_kerr_residual.o K_diago_magnons_residual.o PL_diago_residual.o PL_via_perturbative_inversion.o K_stored_in_a_slepc_matrix.o K_shell_matrix.o K_multiply_by_V_slepc.o K_multiply_by_V_transpose_slepc.o > $SRC_DIR/src/bse/make.dep
K_stored_in_a_slepc_matrix.f90:700:93:
700 | call MatSetValue( slepc_mat, H_pos(1), H_pos(2), Mij , INSERT_VALUES, ierr )
|
Error: Type mismatch in argument 'va' at (1); passed COMPLEX(8) to REAL(8)
> I do not see the line where the configure of yambo is called.
That's in the build.sh file in the PR https://github.com/conda-forge/staged-recipes/pull/14727
Ok, this is because the slepc and petsc libraries can be built either for real or for complex algebra operations. Likely the ones linked here
# [ E ] PETSC : -L$PREFIX/lib -lpetsc -ldl
# -I$PREFIX/include
# [ E ] SLEPC : -L$PREFIX/lib -lslepc
# -I$PREFIX/include
are the ones for real algebra, while yambo needs the ones for complex algebra (the yambo configure should detect this, but we have not coded that yet ...).
In my experience, it is usually possible to load either the standard slepc (i.e. 3.13.4 in your case) or the complex version, which could be called 3.13.4_complex or something similar. Same for petsc.
One also needs to check the resulting name of the library. The yambo configure expects libslepc.a, but it may be libslepc_complex.a.
In case the name is libslepc_complex.a, a solution is to use (in place of --with-slepc-path and --with-petsc-path):
--with-slepc-libs="-L$PREFIX/lib -lslepc_complex" --with-slepc-includedir="$PREFIX/include"
--with-petsc-libs="-L$PREFIX/lib -lpetsc_complex" --with-petsc-includedir="$PREFIX/include"
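As far as I know, a PETSc installation records its scalar type in `include/petscconf.h` via the `PETSC_USE_COMPLEX` macro, so grepping that header of the install you link should reveal which variant you have (treat the macro name as an assumption to verify against your PETSc version). The snippet below simulates this with a stub tree, since no real PETSc installation is assumed here:

```shell
# Checking whether a PETSc build uses complex scalars: a complex build
# defines PETSC_USE_COMPLEX in include/petscconf.h of the install tree.
# A stub tree is created here so the snippet is self-contained; with a
# real installation, grep $PETSC_DIR/include/petscconf.h instead.
mkdir -p petsc_demo/include
echo '#define PETSC_USE_COMPLEX 1' > petsc_demo/include/petscconf.h
if grep -q 'PETSC_USE_COMPLEX' petsc_demo/include/petscconf.h; then
  RESULT="complex build"
else
  RESULT="real build"
fi
echo "$RESULT"
```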
Last, if none of the two works, we can switch back to the internally compiled ones ...
Thanks - it seems that the conda-forge packages of petsc and slepc don't provide the variant with complex numbers yet.
I'll have a look into adding those, but in the meanwhile I'll disable it for the yambo package (can be re-enabled later).
By the way, would you like me to add you or someone else as a co-maintainer of the conda-forge feedstock repository? Just let me know.
> I'll have a look into adding those, but in the meanwhile I'll disable it for the yambo package (can be re-enabled later).
Ok. You can always use the internally compiled slepc from yambo (just remove the --with-slepc/petsc-* options but keep --enable-slepc-linalg on).
> By the way, would you like me to add you or someone else as a co-maintainer of the conda-forge feedstock repository?
@nicspalla: can you do that?
I close this for now.
After some further fixes, the build now passes https://github.com/conda-forge/staged-recipes/pull/14727
However, the conda-forge admins are likely to ask for running at least a few tests of the executable (and even if they don't it would make sense to add this later in order to ensure that the executable actually works). Is there a command that can be used to run a couple of small tests?
We have a test-suite, but it is a separate repo, which needs to be downloaded and configured. Also, so far the repo is private (I do not see issues in moving it to public). Let me double-check with the other developers.
> By the way, would you like me to add you or someone else as a co-maintainer of the conda-forge feedstock repository? Just let me know.
Hi Leopold, I'm really interested in this, so you can add me as a co-maintainer. I suppose that I need to have an account on conda-forge...
> Hi Leopold, I'm really interested in this, so you can add me as a co-maintainer.
Cheers!
> I suppose that I need to have an account on conda-forge...
Not necessary, a GitHub handle is enough. I'll take care of it.
> We have a test-suite. But it is a separate repo, which needs to be downloaded and configured. Also so far the repo is private (I do not see issues in moving it to public). Let me double check with other developers.
Ok, the yambo-tests repo is now public. It is a perl tool; not sure if it can be useful for conda.
It works as follows. Preparation:
- clone the repo
- ./configure --with-yambo-path="$PATH_TO_YAMBO" (path to the yambo compiled source)
- ./driver.pl -d all (It will download many tar.gz files with prepared databases, i.e. netcdf files. 3 GB in total...)
Run tests:
- ./driver.pl -tests all
It runs many tests and takes, I think, about 30 min on a standard desktop. One can also select fewer or more tests.
More documentation on the test-suite can be found on the wiki pages: http://www.yambo-code.org/wiki/index.php?title=Test-suite-simple and http://www.yambo-code.org/wiki/index.php?title=Test-suite
Thanks guys! I'll wait for the first review from the conda-forge maintainers. If they request the test suite run, I'll add it there; otherwise I'll add it after the PR has been merged.
Hey @sangallidavide - the guys from conda-forge finally merged our pull request (usually it's a matter of days, so I guess they were unusually busy).
There's now a new way to install yambo on linux: conda install -c conda-forge yambo.
I'll be looking into adding a macOS build as well when I find time.
As for the tests, where can I find this yambo-tests repository? The documentation says https://github.com/yambo-code/yambo-tests.git but that link doesn't work.
Hi guys, I was wondering whether there is interest in creating a conda package for yambo.
While a conda yambo won't get you optimal performance for your machine, the pre-built binaries make it much easier and faster for prospective users to give it a try (most other MaX codes already have one).
I've started compiling a conda recipe here, but I'm running into issues at the configure stage - yambo doesn't seem to like the C preprocessor.
Also, it tries to compile hdf5 although I've already added it as a dependency.
Could you perhaps help me tweak the compilation?
In the PR https://github.com/conda-forge/staged-recipes/pull/14727 there is the meta.yaml that describes the dependencies, and the build.sh that contains the commands to build. E.g. I'm wondering whether I need to add, on top of hdf5, the netcdf and netcdf-fortran dependencies, and whether I need to explicitly link against them using the configure options or whether the configure should autodiscover them.