Closed jsiirola closed 1 year ago
Does it perhaps happen because the external build is installed because of gfortran conflict reported in #94?
@jhrmnn that seems likely. Shouldn't the external build be rejected, though?
How do we make sure the external build is not considered unless explicitly requested? I thought that was what track_features was for. Ideally, the solver shouldn't be allowed to prefer the external build as a solution to conflicts.
I ran into this too, btw; my library depends on the newer gfortran, and Conda picked the external build instead. Interestingly, Mamba did try to install the regular 4.1.2. So perhaps track_features does what it's supposed to do, but the Conda installer has a bug?
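For background on the mechanism under discussion: track_features works by attaching a penalty to a build, and the solver minimizes the total number of tracked features in the solution, so a tracked build should only win when every untracked build is excluded by some other conflict. A toy Python sketch of just that scoring idea (not conda's actual solver; the build strings and feature name here are made up):

```python
# Toy model of the track_features mechanism (NOT conda's real solver):
# each tracked feature adds a penalty, so among builds that satisfy the
# request, the solver prefers the one with the fewest tracked features.
candidates = [
    {"build": "4.1.2-regular", "track_features": []},
    {"build": "4.1.2-external", "track_features": ["mpi_external"]},  # hypothetical feature name
]

def penalty(pkg):
    # Lower is better: a tracked build should only be chosen when the
    # regular build is ruled out by some other conflict.
    return len(pkg["track_features"])

best = min(candidates, key=penalty)
print(best["build"])  # → 4.1.2-regular
```

The bug reports in this thread amount to the claim that this penalty is being outweighed (or ignored) when a dependency conflict makes the regular build unsatisfiable.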
Maybe it was the special case of only having openmpi, not other packages?
Not sure what you mean by that. Who's having only openmpi?
The repro example above creates an env with only the openmpi package installed.
I think the repro happens because openmpi is installed first from the defaults channel, which pulls in other dependencies that depend on the new gfortran. When openmpi is then replaced from the conda-forge channel, those dependencies still require the new gfortran, which blocks the regular 4.1.3 build and makes the solver prefer the external build instead.
Actually the same still happens (even after fixing the gfortran issue) when first creating a Python env from defaults, conda create -p tmp python, followed by installing openmpi with conda install -p tmp -c conda-forge openmpi. When creating everything in one go from conda-forge, conda create -p tmp -c conda-forge python openmpi, it pulls the regular build correctly.
We don't support mixing defaults packages and conda-forge packages.
I opened a ticket about the same thing here: https://github.com/open-mpi/ompi/issues/10300
We have changed our installation so that it installs from the conda-forge channel, but this still happens. Please feel free to look at the referenced issue above for more details.
@mrmundt, please open a different issue in this repo with the required information.
@isuruf - It's the same issue (@jsiirola and I are both Pyomo developers). I'll add the relevant info here:
We have a suite of MPI tests for our package Pyomo which utilize a Linux + Conda environment. Two days ago, a new version of openmpi was uploaded to Anaconda, and we are now experiencing what appears to be a symlink breakage.
Previously passing test: https://github.com/Pyomo/pyomo/runs/6032066405?check_suite_focus=true
Currently failing test (because the mpirun command cannot be found): https://github.com/Pyomo/pyomo/runs/6098156452?check_suite_focus=true
Newest version available on Anaconda: linux-64/openmpi-4.1.3-hbea3300_101.tar.bz2
Through Anaconda: conda install openmpi
The system on which we are running is the ubuntu-latest GitHub Actions runner. Full details can be found here: https://github.com/actions/virtual-environments/blob/main/images/linux/Ubuntu2004-Readme.md
We have previously run conda install openmpi and been able to run mpirun with no issues in our test suite (linked above). This is the expected behavior: that installing openmpi out of the box will enable mpirun to be found. Since the most recent update, however, we now get the error mpirun: command not found.
UPDATE: We have been digging more into this on our end, and it seems to be a particularly strange corner case. When we run conda install openmpi, we get version 4.0.2 from the default conda channel. However, we later install cyipopt, which flags openmpi as a dependency and installs 4.1.3 from the conda-forge channel, and somehow in that update the bin directory gets blown away. See below:
% conda create -n mpi
% conda activate mpi
% conda install openmpi
% ls /home/miniconda3/envs/mpi
bin/ conda-meta/ etc/ include/ lib/ share/
% conda install -c conda-forge openmpi
% ls /home/miniconda3/envs/mpi/
conda-meta/ include/ lib/ share/ # NO MORE BIN
This behavior of over-writing openmpi was not apparent before (and there have been no updates to cyipopt), so we are not sure why it is happening now.
Previous behaviour: https://github.com/Pyomo/pyomo/runs/6032066405?check_suite_focus=true#step:14:562 Current behaviour: https://github.com/Pyomo/pyomo/runs/6098156452?check_suite_focus=true#step:14:561
You are still mixing defaults and conda-forge which as I've said before is not a supported use-case for conda-forge.
In our current test suite, we do not use the defaults channel:
conda install -q -y -c conda-forge \
${PYTHON_CORE_PKGS} ${PYTHON_PACKAGES} ${CONDA_DEPENDENCIES}
if test -z "${{matrix.slim}}"; then
conda install -q -y -c ibmdecisionoptimization 'cplex>=12.10' \
|| echo "WARNING: CPLEX Community Edition is not available"
conda install -q -y -c gurobi gurobi \
|| echo "WARNING: Gurobi is not available"
conda install -q -y -c fico-xpress xpress \
|| echo "WARNING: Xpress Community Edition is not available"
for PKG in cyipopt pymumps; do
conda install -q -y -c conda-forge $PKG \
|| echo "WARNING: $PKG is not available"
done
fi
We do, however, as you can see above, call on other channels for specific solvers which do not exist in the conda-forge channel.
Do we need to explicitly disallow the defaults channel?
Yes. See the README on this repo at https://github.com/conda-forge/openmpi-feedstock#installing-openmpi-mpi
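For reference, the fix in that README amounts to making conda-forge the only channel, with strict channel priority. In a ~/.condarc that looks roughly like this (a sketch; see the README for the authoritative instructions):

```yaml
# ~/.condarc — use conda-forge exclusively, with strict channel priority
channels:
  - conda-forge
channel_priority: strict
```

Equivalently, from the command line: conda config --add channels conda-forge followed by conda config --set channel_priority strict.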
You could also start with miniforge since you are using conda-incubator/miniconda setup.
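For anyone on conda-incubator/setup-miniconda, the action can install Miniforge directly rather than Miniconda; a sketch of the workflow step (input and environment names assumed from the action's README, check it for the current syntax):

```yaml
- uses: conda-incubator/setup-miniconda@v2
  with:
    miniforge-version: latest        # Miniforge defaults to the conda-forge channel
    activate-environment: test-env   # hypothetical environment name
```

This sidesteps the channel-mixing problem entirely, since defaults is never configured in the first place.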
Thanks, @isuruf. That fixed it. We appreciate the help!
@isuruf Would it be possible to somehow "patch" the conda package manager installed from the conda-forge channel to warn users about misconfigured channel sources? This is not the first time I've seen folks struggling with this detail.
Solution to issue cannot be found in the documentation.
Issue
Upgrading openmpi from 4.0.2 (from pkgs/main) to 4.1.3 (from conda-forge) breaks the conda environment by deleting the bin and etc directories from the environment, so that mpirun can no longer be found. A minimal test script returns:
Installed packages
Environment info