The short answer is you should be able to work around it if you install `fenics-dolfin` directly instead of `fenics`.
Looks like we're running into https://github.com/conda/conda-build/issues/3308 (or a related issue) where the hash contents don't propagate to dependencies within multiple-output recipes, so the different MPI providers are not producing different build strings for the `fenics` metapackage. This results in only 4 builds of `fenics` being uploaded instead of the 8 it builds.
Okay, thanks a lot for your help and for rebuilding the packages. With this, I should be able to install the openmpi version too (or just use the `fenics-dolfin` package).
I have also resolved my issue, and for the sake of completeness: writing .xdmf files in parallel only works when the target directory is created before the files are saved to it. In serial (and perhaps also when using a single compute node), the appropriate directory is created automatically, but when using multiple nodes, the directory has to be created first. A sketch of the fix is below.
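For reference, a minimal sketch of that workaround, assuming legacy FEniCS (`dolfin`) with `mpi4py` available; the mesh and function setup here are just illustrative:

```python
import os

from dolfin import Function, FunctionSpace, UnitSquareMesh, XDMFFile
from mpi4py import MPI

comm = MPI.COMM_WORLD
outdir = "results"  # hypothetical output directory

# Create the output directory on rank 0 only, then synchronize all
# ranks so that none of them writes into a directory that does not
# exist yet.
if comm.rank == 0:
    os.makedirs(outdir, exist_ok=True)
comm.barrier()

# Illustrative problem setup.
mesh = UnitSquareMesh(8, 8)
V = FunctionSpace(mesh, "Lagrange", 1)
u = Function(V)

# Parallel XDMF write; this also creates the companion .h5 file.
with XDMFFile(comm, os.path.join(outdir, "u.xdmf")) as xdmf:
    xdmf.write(u)
```

Run with e.g. `mpirun -n 4 python script.py`. Without the rank-0 `makedirs` plus the barrier, ranks can try to open the file in a not-yet-existing directory, which would match the "no .h5 files are created" symptom.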
Yeah, the `fenics` package doesn't have anything in it - it used to be one actual conda package with everything in it, but it's been split up since then, leaving the empty metapackage around for backward compatibility. We haven't done that with fenicsx on conda-forge - there is only `fenics-dolfinx`, no `fenicsx` metapackage.
This should be fixed by #171, at least.
Comment:
I've got a question regarding the MPI build of this package. I noticed some issues writing .xdmf files when running on a cluster with MPI. It looked like there had been a known problem and a new build was generated (see #169). However, I still have trouble using fenics on the cluster. The current build (34) apparently supports only mpich (trying to install this build with openmpi fails - at least when I want to use python 3.11), and I still have issues writing XDMF files from the cluster with that build (no .h5 files are created).
I installed the package with openmpi and the previous build number (33), but that build does not seem to include the fix, so I still run into issues there.
Is there a build of the package with number 34 available for py3.11 for both openmpi and mpich? Am I missing something?
Thanks a lot in advance for your help!