FEniCS / dolfinx

Next generation FEniCS problem solving environment
https://fenicsproject.org
GNU Lesser General Public License v3.0

Issue with building and installing the C++ core #2410

Closed: kbronik2017 closed this issue 1 year ago

kbronik2017 commented 1 year ago

When I try:

mkdir build
cd build
cmake ..
make install

I got the following error:

[ 28%] Building CXX object dolfinx/CMakeFiles/dolfinx.dir/fem/FiniteElement.cpp.o
/home/kevinb/FENICSX/dolfinx/cpp/dolfinx/fem/FiniteElement.cpp: In constructor 'dolfinx::fem::FiniteElement::FiniteElement(const ufcx_finite_element&)':
/home/kevinb/FENICSX/dolfinx/cpp/dolfinx/fem/FiniteElement.cpp:239:32: error: 'sobolev' in namespace 'basix' does not name a type
/home/kevinb/FENICSX/dolfinx/cpp/dolfinx/fem/FiniteElement.cpp:239:39: error: expected '>' before '::' token
/home/kevinb/FENICSX/dolfinx/cpp/dolfinx/fem/FiniteElement.cpp:239:39: error: expected '(' before '::' token
/home/kevinb/FENICSX/dolfinx/cpp/dolfinx/fem/FiniteElement.cpp:239:41: error: '::space' has not been declared; did you mean 'isspace'?
/home/kevinb/FENICSX/dolfinx/cpp/dolfinx/fem/FiniteElement.cpp:239:52: error: 'ufcx_basix_custom_finite_element' {aka 'struct ufcx_basix_custom_finite_element'} has no member named 'sobolev_space'
  239 |             static_cast<basix::sobolev::space>(ce->sobolev_space),
      |                                                    ^~~~~~~~~~~~~
make[2]: *** [dolfinx/CMakeFiles/dolfinx.dir/build.make:272: dolfinx/CMakeFiles/dolfinx.dir/fem/FiniteElement.cpp.o] Error 1
make[1]: *** [CMakeFiles/Makefile2:304: dolfinx/CMakeFiles/dolfinx.dir/all] Error 2
make: *** [Makefile:136: all] Error 2

Is this known to the team, or am I doing something wrong?

Thanks

Kevin

jorgensd commented 1 year ago

It seems like your basix installation is out of date, as support for Sobolev spaces was added to basix a month ago: https://github.com/FEniCS/basix/pull/589
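A quick way to confirm which basix version the environment actually sees (a minimal check, assuming the package exposes __version__, which recent releases do):

# Print the basix version visible to the active Python environment
python3 -c "import basix; print(basix.__version__)"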

kbronik2017 commented 1 year ago

Thank you. I have used the following:

pip install fenics-basix (https://pypi.org/project/fenics-basix/, fenics-basix 0.5.0)

but still the same errors!

IgorBaratta commented 1 year ago

To use the basix development version, please try:

pip install git+https://github.com/FEniCS/basix.git

Also, which C++ compiler and version are you using?
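For reference, these commands report the compiler and CMake versions (just a convenience sketch; run them inside the active environment):

# Report the C++ compiler and CMake versions used for the build
c++ --version
cmake --version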

kbronik2017 commented 1 year ago

I think the CMake output could be helpful:

-- The C compiler identification is GNU 10.4.0
-- The CXX compiler identification is GNU 10.4.0
-- Detecting C compiler ABI info
-- Detecting C compiler ABI info - done
-- Check for working C compiler: /home/kevinb/anaconda3/envs/fenicsx-env/bin/x86_64-conda-linux-gnu-cc - skipped
-- Detecting C compile features
-- Detecting C compile features - done
-- Detecting CXX compiler ABI info
-- Detecting CXX compiler ABI info - done
-- Check for working CXX compiler: /home/kevinb/anaconda3/envs/fenicsx-env/bin/x86_64-conda-linux-gnu-c++ - skipped
-- Detecting CXX compile features
-- Detecting CXX compile features - done
-- Found MPI_C: /home/kevinb/anaconda3/envs/fenicsx-env/lib/libmpi.so (found suitable version "4.0", minimum required is "3")
-- Found MPI_CXX: /home/kevinb/anaconda3/envs/fenicsx-env/lib/libmpicxx.so (found suitable version "4.0", minimum required is "3")
-- Found MPI: TRUE (found suitable version "4.0", minimum required is "3")
-- Performing Test HAVE_PIPE
-- Performing Test HAVE_PIPE - Success
-- Performing Test HAVE_PEDANTIC
-- Performing Test HAVE_PEDANTIC - Success
-- Performing Test HAVE_DEBUG
-- Performing Test HAVE_DEBUG - Success
-- Performing Test HAVE_O2_OPTIMISATION
-- Performing Test HAVE_O2_OPTIMISATION - Success
-- Found Boost 1.74.0 at /home/kevinb/anaconda3/envs/fenicsx-env/lib/cmake/Boost-1.74.0
--   Requested configuration: QUIET REQUIRED COMPONENTS timer
-- Found boost_headers 1.74.0 at /home/kevinb/anaconda3/envs/fenicsx-env/lib/cmake/boost_headers-1.74.0
-- Found boost_timer 1.74.0 at /home/kevinb/anaconda3/envs/fenicsx-env/lib/cmake/boost_timer-1.74.0
--   [x] libboost_timer.so.1.74.0
--   Adding boost_timer dependencies: chrono;headers
-- Found boost_chrono 1.74.0 at /home/kevinb/anaconda3/envs/fenicsx-env/lib/cmake/boost_chrono-1.74.0
--   [x] libboost_chrono.so.1.74.0
--   Adding boost_chrono dependencies: headers
-- Found Boost: /home/kevinb/anaconda3/envs/fenicsx-env/lib/cmake/Boost-1.74.0/BoostConfig.cmake (found suitable version "1.74.0", minimum required is "1.70") found components: timer
-- Found Python3: /home/kevinb/anaconda3/envs/fenicsx-env/bin/python3.10 (found version "3.10.6") found components: Interpreter
-- Adding /home/kevinb/anaconda3/envs/fenicsx-env/lib/python3.10/site-packages/basix to Basix search hints
-- Found PkgConfig: /usr/bin/pkg-config (found version "0.29.2")
-- Checking for one of the modules 'PETSc>=3.15;petsc>=3.15'
-- Looking for PETSC_USE_COMPLEX
-- Looking for PETSC_USE_COMPLEX - not found
-- HDF5 C compiler wrapper is unable to compile a minimal HDF5 program.
-- Found HDF5: /home/kevinb/anaconda3/envs/fenicsx-env/lib/libhdf5.so (found version "1.12.1") found components: C
-- HDF5_DIR: HDF5_DIR-NOTFOUND
-- HDF5_DEFINITIONS:
-- HDF5_INCLUDE_DIRS: /home/kevinb/anaconda3/envs/fenicsx-env/include
-- HDF5_LIBRARIES: /home/kevinb/anaconda3/envs/fenicsx-env/lib/libhdf5.so
-- HDF5_HL_LIBRARIES:
-- HDF5_C_DEFINITIONS:
-- HDF5_C_INCLUDE_DIR: /home/kevinb/anaconda3/envs/fenicsx-env/include
-- HDF5_C_INCLUDE_DIRS: /home/kevinb/anaconda3/envs/fenicsx-env/include
-- HDF5_C_LIBRARY:
-- HDF5_C_LIBRARIES: /home/kevinb/anaconda3/envs/fenicsx-env/lib/libhdf5.so
-- HDF5_C_HL_LIBRARY:
-- HDF5_C_HL_LIBRARIES:
-- Defined targets (if any):
-- ... hdf5::hdf5
-- Asking Python module FFCx for location of UFC... (Python executable: /home/kevinb/anaconda3/envs/fenicsx-env/bin/python3.10)
-- Found UFCx: /home/kevinb/anaconda3/envs/fenicsx-env/lib/python3.10/site-packages/ffcx/codegeneration (found suitable version "0.5.0", minimum required is "0.5")
-- Found MPI_C: /home/kevinb/anaconda3/envs/fenicsx-env/lib/libmpi.so (found version "4.0")
-- Found MPI_CXX: /home/kevinb/anaconda3/envs/fenicsx-env/lib/libmpicxx.so (found version "4.0")
-- Found MPI: TRUE (found version "4.0")
-- Found ADIOS2: /home/kevinb/anaconda3/envs/fenicsx-env/lib/cmake/adios2/adios2-config.cmake (found suitable version "2.8.3", minimum required is "2.8.1") found components: C CXX MPI
-- Checking for one of the modules 'slepc>=3.15'
-- Checking for package 'SCOTCH-PT'
-- Found SCOTCH (version 6.0.9)
-- Performing test SCOTCH_TEST_RUNS
-- Performing test SCOTCH_TEST_RUNS - Success
-- Found SCOTCH: /home/kevinb/anaconda3/envs/fenicsx-env/lib/libptscotch.so;/home/kevinb/anaconda3/envs/fenicsx-env/lib/libscotch.so;/home/kevinb/anaconda3/envs/fenicsx-env/lib/libptscotcherr.so
-- Performing Test PARMETIS_TEST_RUNS
-- Performing Test PARMETIS_TEST_RUNS - Success
-- Found ParMETIS: /home/kevinb/anaconda3/envs/fenicsx-env/lib/libparmetis.so;/home/kevinb/anaconda3/envs/fenicsx-env/lib/libmetis.so (Required is at least version "4.0.2")
-- The following features have been enabled:

-- The following OPTIONAL packages have been found:

-- The following RECOMMENDED packages have been found:

-- The following REQUIRED packages have been found:

-- The following features have been disabled:

-- Copying demo and test data to build directory.


-- Configuring done
-- Generating done
-- Build files have been written to: /home/kevinb/FENICSX/build

kbronik2017 commented 1 year ago

Also, pip install git+https://github.com/FEniCS/basix.git didn't help; still the same issues!

(fenicsx-env) kevinb@kevin-XPS:~/FENICSX/build$ pip install git+https://github.com/FEniCS/basix.git
Collecting git+https://github.com/FEniCS/basix.git
  Cloning https://github.com/FEniCS/basix.git to /tmp/pip-req-build-5qjguglx
  Running command git clone --filter=blob:none --quiet https://github.com/FEniCS/basix.git /tmp/pip-req-build-5qjguglx
  Resolved https://github.com/FEniCS/basix.git to commit 2d14f6230b8d5b25f81efb1c701326189e3b81ee
  Installing build dependencies ... done
  Getting requirements to build wheel ... done
  Preparing metadata (pyproject.toml) ... done
Requirement already satisfied: numpy>=1.21 in /home/kevinb/anaconda3/envs/fenicsx-env/lib/python3.10/site-packages (from fenics-basix==0.5.2.dev0) (1.23.4)
Building wheels for collected packages: fenics-basix
  Building wheel for fenics-basix (pyproject.toml) ... done
  Created wheel for fenics-basix: filename=fenics_basix-0.5.2.dev0-cp310-cp310-linux_x86_64.whl size=658865 sha256=3d20da0d8daf596c0068574363636a67813492c918a189ee3bc41660adcf2408
  Stored in directory: /tmp/pip-ephem-wheel-cache-djge_7gd/wheels/19/86/ea/efd0cff36924ab4050174630ece8a2cbed2b0f897592e0c5ea
Successfully built fenics-basix
Installing collected packages: fenics-basix
  Attempting uninstall: fenics-basix
    Found existing installation: fenics-basix 0.5.0
    Uninstalling fenics-basix-0.5.0:
      Successfully uninstalled fenics-basix-0.5.0
Successfully installed fenics-basix-0.5.2.dev0
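One thing worth ruling out after upgrading basix: results from the failed configure stay cached in the build tree. A clean reconfigure would look roughly like this (a sketch, using the paths from the logs above):

# Remove the stale build tree so CMake re-discovers basix from scratch
cd /home/kevinb/FENICSX
rm -rf build && mkdir build && cd build
cmake ..
make install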

jorgensd commented 1 year ago

@mscroggs Any suggestions?

weshouman commented 1 year ago

The following are some files that could help in debugging this issue:

jorgensd commented 1 year ago

To me it seems like the aur/archlinux version of Basix is out of date: https://aur.archlinux.org/packages/fenics-basix-git refers to v0.5.0.post0, while you are trying to install the main branch of dolfinx in your Dockerfile.

I currently do not have access to a computer, so I cannot test what happens if you check out the v0.5.0 release tag of dolfinx after cloning it, before installing the C++ layer.
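For concreteness, what is being suggested is roughly (a sketch):

# Clone dolfinx and pin it to the v0.5.0 release tag before building the C++ layer
git clone https://github.com/FEniCS/dolfinx
cd dolfinx
git checkout v0.5.0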

weshouman commented 1 year ago

Checking out v0.5.0 breaks the build much earlier:

[  3%] Building CXX object dolfinx/CMakeFiles/dolfinx.dir/common/IndexMap.cpp.o
In file included from /home/builder/dolfinx/cpp/dolfinx/common/IndexMap.h:11,
                 from /home/builder/dolfinx/cpp/dolfinx/common/IndexMap.cpp:7:
/home/builder/dolfinx/cpp/dolfinx/common/MPI.h: In function ‘std::pair<std::vector<int>, std::vector<T> > dolfinx::MPI::distribute_to_postoffice(MPI_Comm, const std::span<const T>&, std::array<long int, 2>, int64_t)’:
/home/builder/dolfinx/cpp/dolfinx/common/MPI.h:332:8: error: ‘sort’ is not a member of ‘std’; did you mean ‘sqrt’?
  332 |   std::sort(dest_to_index.begin(), dest_to_index.end());
      |        ^~~~
      |        sqrt
/home/builder/dolfinx/cpp/dolfinx/common/MPI.h:350:18: error: ‘find_if’ is not a member of ‘std’; did you mean ‘find’?
  350 |           = std::find_if(it, dest_to_index.end(),
      |                  ^~~~~~~
      |                  find
/home/builder/dolfinx/cpp/dolfinx/common/MPI.h:393:14: error: ‘copy_n’ is not a member of ‘std’; did you mean ‘copy’?
  393 |         std::copy_n(std::next(x.begin(), i * shape[1]), shape[1],
      |              ^~~~~~
      |              copy
/home/builder/dolfinx/cpp/dolfinx/common/MPI.h:437:8: error: ‘transform’ is not a member of ‘std’
  437 |   std::transform(recv_buffer_index.cbegin(), recv_buffer_index.cend(),
      |        ^~~~~~~~~
/home/builder/dolfinx/cpp/dolfinx/common/MPI.h: In function ‘std::vector<T> dolfinx::MPI::distribute_from_postoffice(MPI_Comm, const std::span<const long int>&, const std::span<const T>&, std::array<long int, 2>, int64_t)’:
/home/builder/dolfinx/cpp/dolfinx/common/MPI.h:476:8: error: ‘sort’ is not a member of ‘std’; did you mean ‘sqrt’?
  476 |   std::sort(src_to_index.begin(), src_to_index.end());
      |        ^~~~
      |        sqrt
/home/builder/dolfinx/cpp/dolfinx/common/MPI.h:487:23: error: ‘find_if’ is not a member of ‘std’; did you mean ‘find’?
  487 |       auto it1 = std::find_if(it, src_to_index.end(),
      |                       ^~~~~~~
      |                       find
/home/builder/dolfinx/cpp/dolfinx/common/MPI.h:530:8: error: ‘transform’ is not a member of ‘std’
  530 |   std::transform(src_to_index.cbegin(), src_to_index.cend(),
      |        ^~~~~~~~~
/home/builder/dolfinx/cpp/dolfinx/common/MPI.h:571:14: error: ‘copy_n’ is not a member of ‘std’; did you mean ‘copy’?
  571 |         std::copy_n(std::next(x.begin(), shape[1] * local_index), shape[1],
      |              ^~~~~~
      |              copy
/home/builder/dolfinx/cpp/dolfinx/common/MPI.h:580:14: error: ‘copy_n’ is not a member of ‘std’; did you mean ‘copy’?
  580 |         std::copy_n(std::next(post_x.begin(), shape[1] * pos), shape[1],
      |              ^~~~~~
      |              copy
/home/builder/dolfinx/cpp/dolfinx/common/MPI.h:618:12: error: ‘copy_n’ is not a member of ‘std’; did you mean ‘copy’?
  618 |       std::copy_n(std::next(x.begin(), shape[1] * local_index), shape[1],
      |            ^~~~~~
      |            copy
/home/builder/dolfinx/cpp/dolfinx/common/MPI.h:629:14: error: ‘copy_n’ is not a member of ‘std’; did you mean ‘copy’?
  629 |         std::copy_n(std::next(post_x.begin(), shape[1] * pos), shape[1],
      |              ^~~~~~
      |              copy
/home/builder/dolfinx/cpp/dolfinx/common/MPI.h:637:14: error: ‘copy_n’ is not a member of ‘std’; did you mean ‘copy’?
  637 |         std::copy_n(std::next(recv_buffer_data.begin(), shape[1] * pos),
      |              ^~~~~~
      |              copy
make[2]: *** [dolfinx/CMakeFiles/dolfinx.dir/build.make:90: dolfinx/CMakeFiles/dolfinx.dir/common/IndexMap.cpp.o] Error 1
make[1]: *** [CMakeFiles/Makefile2:304: dolfinx/CMakeFiles/dolfinx.dir/all] Error 2
make: *** [Makefile:136: all] Error 2

jorgensd commented 1 year ago

What about v0.5.1?

weshouman commented 1 year ago

v0.5.1 produces the error:

[  3%] Building CXX object dolfinx/CMakeFiles/dolfinx.dir/common/IndexMap.cpp.o
In file included from /home/builder/dolfinx/cpp/dolfinx/common/IndexMap.h:11,
                 from /home/builder/dolfinx/cpp/dolfinx/common/IndexMap.cpp:7:
/home/builder/dolfinx/cpp/dolfinx/common/MPI.h: In function ‘std::pair<std::vector<int>, std::vector<T> > dolfinx::MPI::distribute_to_postoffice(MPI_Comm, const std::span<const T>&, std::array<long int, 2>, int64_t)’:
/home/builder/dolfinx/cpp/dolfinx/common/MPI.h:332:8: error: ‘sort’ is not a member of ‘std’; did you mean ‘sqrt’?
  332 |   std::sort(dest_to_index.begin(), dest_to_index.end());
      |        ^~~~
      |        sqrt
/home/builder/dolfinx/cpp/dolfinx/common/MPI.h:350:18: error: ‘find_if’ is not a member of ‘std’; did you mean ‘find’?
  350 |           = std::find_if(it, dest_to_index.end(),
      |                  ^~~~~~~
      |                  find
/home/builder/dolfinx/cpp/dolfinx/common/MPI.h:393:14: error: ‘copy_n’ is not a member of ‘std’; did you mean ‘copy’?
  393 |         std::copy_n(std::next(x.begin(), i * shape[1]), shape[1],
      |              ^~~~~~
      |              copy
/home/builder/dolfinx/cpp/dolfinx/common/MPI.h:437:8: error: ‘transform’ is not a member of ‘std’
  437 |   std::transform(recv_buffer_index.cbegin(), recv_buffer_index.cend(),
      |        ^~~~~~~~~
/home/builder/dolfinx/cpp/dolfinx/common/MPI.h: In function ‘std::vector<T> dolfinx::MPI::distribute_from_postoffice(MPI_Comm, const std::span<const long int>&, const std::span<const T>&, std::array<long int, 2>, int64_t)’:
/home/builder/dolfinx/cpp/dolfinx/common/MPI.h:476:8: error: ‘sort’ is not a member of ‘std’; did you mean ‘sqrt’?
  476 |   std::sort(src_to_index.begin(), src_to_index.end());
      |        ^~~~
      |        sqrt
/home/builder/dolfinx/cpp/dolfinx/common/MPI.h:487:23: error: ‘find_if’ is not a member of ‘std’; did you mean ‘find’?
  487 |       auto it1 = std::find_if(it, src_to_index.end(),
      |                       ^~~~~~~
      |                       find
/home/builder/dolfinx/cpp/dolfinx/common/MPI.h:530:8: error: ‘transform’ is not a member of ‘std’
  530 |   std::transform(src_to_index.cbegin(), src_to_index.cend(),
      |        ^~~~~~~~~
/home/builder/dolfinx/cpp/dolfinx/common/MPI.h:571:14: error: ‘copy_n’ is not a member of ‘std’; did you mean ‘copy’?
  571 |         std::copy_n(std::next(x.begin(), shape[1] * local_index), shape[1],
      |              ^~~~~~
      |              copy
/home/builder/dolfinx/cpp/dolfinx/common/MPI.h:580:14: error: ‘copy_n’ is not a member of ‘std’; did you mean ‘copy’?
  580 |         std::copy_n(std::next(post_x.begin(), shape[1] * pos), shape[1],
      |              ^~~~~~
      |              copy
/home/builder/dolfinx/cpp/dolfinx/common/MPI.h:618:12: error: ‘copy_n’ is not a member of ‘std’; did you mean ‘copy’?
  618 |       std::copy_n(std::next(x.begin(), shape[1] * local_index), shape[1],
      |            ^~~~~~
      |            copy
/home/builder/dolfinx/cpp/dolfinx/common/MPI.h:629:14: error: ‘copy_n’ is not a member of ‘std’; did you mean ‘copy’?
  629 |         std::copy_n(std::next(post_x.begin(), shape[1] * pos), shape[1],
      |              ^~~~~~
      |              copy
/home/builder/dolfinx/cpp/dolfinx/common/MPI.h:637:14: error: ‘copy_n’ is not a member of ‘std’; did you mean ‘copy’?
  637 |         std::copy_n(std::next(recv_buffer_data.begin(), shape[1] * pos),
      |              ^~~~~~
      |              copy
make[2]: *** [dolfinx/CMakeFiles/dolfinx.dir/build.make:90: dolfinx/CMakeFiles/dolfinx.dir/common/IndexMap.cpp.o] Error 1

Trying pip install git+https://github.com/FEniCS/basix.git on HEAD, the build produced the following error:

[ 28%] Building CXX object dolfinx/CMakeFiles/dolfinx.dir/fem/FiniteElement.cpp.o
/home/builder/dolfinx/cpp/dolfinx/fem/FiniteElement.cpp: In constructor ‘dolfinx::fem::FiniteElement::FiniteElement(const ufcx_finite_element&)’:
/home/builder/dolfinx/cpp/dolfinx/fem/FiniteElement.cpp:239:52: error: ‘ufcx_basix_custom_finite_element’ {aka ‘struct ufcx_basix_custom_finite_element’} has no member named ‘sobolev_space’
  239 |             static_cast<basix::sobolev::space>(ce->sobolev_space),
      |                                                    ^~~~~~~~~~~~~
make[2]: *** [dolfinx/CMakeFiles/dolfinx.dir/build.make:272: dolfinx/CMakeFiles/dolfinx.dir/fem/FiniteElement.cpp.o] Error 1
make[1]: *** [CMakeFiles/Makefile2:304: dolfinx/CMakeFiles/dolfinx.dir/all] Error 2
make: *** [Makefile:136: all] Error 2

Note: based on the installed packages, basix 0.5.1.1 is installed, not fenics-basix-git 0.5.0.post0-1.
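Since the CMake log earlier in the thread shows ufcx.h being located via FFCx's codegeneration directory, it seems worth confirming that basix, ufl and ffcx all come from matching versions; a minimal check:

# List the FEniCSx Python components actually installed in the environment
pip list | grep -Ei "basix|ufl|ffcx"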

jorgensd commented 1 year ago

As a sidenote, you shouldn't need to build petsc4py separately; the petsc package (https://aur.archlinux.org/packages/petsc) should provide it, as PETSc has included petsc4py in its repository for the last 2 years: https://gitlab.com/petsc/petsc/-/tree/main/src/binding/petsc4py
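A quick way to verify that (a sketch; assumes the petsc4py module exposes __version__, which recent releases do):

# Confirm petsc4py is importable and report its version
python3 -c "import petsc4py; print(petsc4py.__version__)"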

Although it seems like https://github.com/FEniCS/dolfinx/pull/2323/files was not propagated correctly into the v0.5.0 or v0.5.1 releases.
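The 'sort'/'find_if'/'copy_n' errors above are what a missing <algorithm> include looks like. A quick check for whether that fix is present in a checkout (a sketch run from the dolfinx source root; the PR may touch other files too):

# Does MPI.h already carry the include added in PR #2323?
grep -n "#include <algorithm>" cpp/dolfinx/common/MPI.h || echo "patch missing"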

@jhale Should we have a patch release with this?

As a side note, you could try using the main branches, as this Dockerfile works nicely for me:

# syntax=docker/dockerfile:1
# Following https://docs.fenicsproject.org/dolfinx/main/python/installation.html
FROM archlinux:base-devel

# Update (-Syu) only for the first run
RUN pacman -Syu --noconfirm
RUN pacman -S git --noconfirm

RUN useradd -ms /bin/bash -G wheel builder
# makepkg requires a sudoer but not the root user
RUN echo "builder ALL=(ALL) NOPASSWD: ALL" >> /etc/sudoers

WORKDIR /opt
RUN git clone https://aur.archlinux.org/yay-git.git
RUN chown -R builder:builder ./yay-git

USER builder
WORKDIR /opt/yay-git
RUN makepkg -si --noconfirm

WORKDIR /home/builder

USER builder
RUN yay --answerdiff=None --noremovemake --pgpfetch --answerclean=None --noconfirm --asdeps -S boost
RUN yay --answerdiff=None --noremovemake --pgpfetch --answerclean=None --noconfirm --asdeps -S cmake
RUN yay --answerdiff=None --noremovemake --pgpfetch --answerclean=None --noconfirm --asdeps -S pkg-config
RUN yay --answerdiff=None --noremovemake --pgpfetch --answerclean=None --noconfirm --asdeps -S pugixml
RUN yay --answerdiff=None --noremovemake --pgpfetch --answerclean=None --noconfirm --asdeps -S openmpi
RUN yay --answerdiff=None --noremovemake --pgpfetch --answerclean=None --noconfirm --asdeps -S hdf5-openmpi
RUN yay --answerdiff=None --noremovemake --pgpfetch --answerclean=None --noconfirm --asdeps -S petsc
RUN yay --answerdiff=None --noremovemake --pgpfetch --answerclean=None --noconfirm --asdeps -S parmetis
RUN yay --answerdiff=None --noremovemake --pgpfetch --answerclean=None --noconfirm --asdeps -S adios2
RUN yay --answerdiff=None --noremovemake --pgpfetch --answerclean=None --noconfirm --asdeps -S slepc

RUN yay --answerdiff=None --noremovemake --pgpfetch --answerclean=None --noconfirm --asdeps -S python
RUN yay --answerdiff=None --noremovemake --pgpfetch --answerclean=None --noconfirm --asdeps -S python-pip

RUN yay --answerdiff=None --noremovemake --pgpfetch --answerclean=None --noconfirm --asdeps -S xtensor

RUN pip install pybind11
RUN pip install numpy
RUN pip install mpi4py
RUN pip install numba
RUN pip install pyvista

# Install dolfinx and its FEniCSx dependencies from git
RUN python3 -m pip install -v git+https://github.com/FENICS/basix@main
RUN python3 -m pip install -v git+https://github.com/FENICS/ufl@main
RUN python3 -m pip install -v git+https://github.com/FENICS/ffcx@main
RUN git clone --branch=main --single-branch https://github.com/FENICS/dolfinx

RUN mkdir dolfinx/cpp/build 

WORKDIR /home/builder/dolfinx/cpp/build

RUN cmake ../ && \
    make
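
A minimal way to build and enter the image, assuming the Dockerfile above is saved in the current directory (the tag name is just an example):

# Build the image and open a shell inside it
docker build -t dolfinx-arch .
docker run -it dolfinx-arch /bin/bash
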
jhale commented 1 year ago

I have made a new tag of DOLFINx, v0.5.2, with the <algorithm> header file patch applied. I will make the formal GitHub release in a few days.

https://github.com/FEniCS/dolfinx/pull/2323/files
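Once the tag is available, an existing clone can pick it up with (a sketch):

# Fetch the new tag and switch the working tree to it
git fetch --tags
git checkout v0.5.2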

garth-wells commented 1 year ago

v0.5.2 has been released.