conda-forge / openmpi-feedstock

A conda-smithy repository for openmpi.
BSD 3-Clause "New" or "Revised" License

Rerender for v10 gcc compilers #92

Closed carterbox closed 2 years ago

carterbox commented 2 years ago

The purpose of this PR is to get the openmpi-mpifort package built against the v10 fortran compiler.

There are a few possible approaches (a quick diagnostic sketch follows the list):

  1. Update the CUDA compiler from 10.2 to 11.1, so the host compiler moves from gcc=7 to gcc=10.
  2. Enable a non-CUDA build, which automatically tracks the current compiler pinnings.
  3. Relax the compiler constraints so a package built with gcc=7 can be used with gcc=10.
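As a diagnostic, one can check which Fortran compiler stack the current package pulls in with a dry-run install (a sketch; the channel and grep pattern are illustrative):

    conda create -n check-mpifort -c conda-forge openmpi-mpifort --dry-run 2>&1 \
        | grep -i gfortran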

Checklist

Closes #91

conda-forge-linter commented 2 years ago

Hi! This is the friendly automated conda-forge-linting service.

I just wanted to let you know that I linted all conda-recipes in your PR (recipe) and found it was in an excellent condition.

carterbox commented 2 years ago

@conda-forge-admin, please rerender

conda-forge-linter commented 2 years ago

Hi! This is the friendly automated conda-forge-linting service.

I wanted to let you know that I linted all conda-recipes in your PR (recipe) and found some lint.

Here's what I've got...

For recipe:

carterbox commented 2 years ago

@conda-forge-admin please rerender

github-actions[bot] commented 2 years ago

Hi! This is the friendly automated conda-forge-webservice.

I tried to rerender for you but ran into some issues. Please check the output logs of the latest rerendering GitHub Actions workflow run for errors. You can also ping conda-forge/core for further assistance or try re-rendering locally.

This message was generated by GitHub actions workflow run https://github.com/conda-forge/openmpi-feedstock/actions/runs/1947810438.
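For reference, a minimal local rerender looks roughly like this, assuming conda-smithy is installed from conda-forge:

    conda install -n base -c conda-forge conda-smithy   # one-time setup
    cd openmpi-feedstock
    conda smithy rerender -c auto   # regenerate the CI configs and commit them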

conda-forge-linter commented 2 years ago

Hi! This is the friendly automated conda-forge-linting service.

I just wanted to let you know that I linted all conda-recipes in your PR (recipe) and found it was in an excellent condition.

leofang commented 2 years ago

@conda-forge-admin, please rerender

github-actions[bot] commented 2 years ago

Hi! This is the friendly automated conda-forge-webservice.

I tried to rerender for you, but it looks like there was nothing to do.

This message was generated by GitHub actions workflow run https://github.com/conda-forge/openmpi-feedstock/actions/runs/1948551161.

carterbox commented 2 years ago

The following NEW packages will be INSTALLED:

    clang:          13.0.1-h694c41f_0         conda-forge
    clang-13:       13.0.1-default_he082bbe_0 conda-forge
    libclang-cpp13: 13.0.1-default_he082bbe_0 conda-forge
    libcxx:         12.0.1-habf9029_1         conda-forge
    libgfortran:    5.0.0-9_3_0_h6c81a4c_23   conda-forge
    libgfortran5:   9.3.0-h6c81a4c_23         conda-forge
    libllvm13:      13.0.1-h64f94b2_2         conda-forge
    libzlib:        1.2.11-h9173be1_1013      conda-forge
    llvm-openmp:    13.0.1-hcb1a161_1         conda-forge
    mpi:            1.0-openmpi               conda-forge
    openmpi:        4.1.2-h4ff0e22_1          local      
    openmpi-mpicc:  4.1.2-h5eb16cf_1          local      
    zlib:           1.2.11-h9173be1_1013      conda-forge

Preparing transaction: ...working... done
Verifying transaction: ...working... done
Executing transaction: ...working... done
export PREFIX=/Users/runner/miniforge3/conda-bld/openmpi-mpi_1646695176261/_test_env_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_pl
export SRC_DIR=/Users/runner/miniforge3/conda-bld/openmpi-mpi_1646695176261/test_tmp
+ export OMPI_MCA_plm=isolated
+ OMPI_MCA_plm=isolated
+ export OMPI_MCA_btl_vader_single_copy_mechanism=none
+ OMPI_MCA_btl_vader_single_copy_mechanism=none
+ export OMPI_MCA_rmaps_base_oversubscribe=yes
+ OMPI_MCA_rmaps_base_oversubscribe=yes
+ MPIEXEC=/Users/runner/miniforge3/conda-bld/openmpi-mpi_1646695176261/test_tmp/mpiexec.sh
+ pushd tests
+ [[ openmpi-mpicc == \o\p\e\n\m\p\i ]]
+ [[ openmpi-mpicc == \o\p\e\n\m\p\i\-\m\p\i\c\c ]]
+ command -v mpicc
+ mpicc -show
~/miniforge3/conda-bld/openmpi-mpi_1646695176261/test_tmp/tests ~/miniforge3/conda-bld/openmpi-mpi_1646695176261/test_tmp
$PREFIX/bin/mpicc
x86_64-apple-darwin13.4.0-clang -I$PREFIX/include -I$PREFIX/include -L$PREFIX/lib -L$PREFIX/lib -Wl,-rpath,$PREFIX/lib -lmpi
+ mpicc helloworld.c -o helloworld_c
--------------------------------------------------------------------------
The Open MPI wrapper compiler was unable to find the specified compiler
x86_64-apple-darwin13.4.0-clang in your PATH.

Note that this compiler was either specified at configure time or in
one of several possible environment variables.

osx-64 does not like having the flexible compiler spec. I reverted the flexible compiler spec for OSX because the problem this PR is trying to fix only occurs on builds that have a CUDA/non-CUDA variant.
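One way to reproduce or work around the wrapper failure locally is to point the wrapper at a compiler that actually exists on PATH. OMPI_CC is Open MPI's documented override variable for the C wrapper; the clang lookup below is illustrative:

    export OMPI_CC=$(command -v clang)   # override the compiler baked into the wrapper
    mpicc -show                          # confirm the wrapper now reports the override
    mpicc helloworld.c -o helloworld_c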

carterbox commented 2 years ago

No rerender needed.

leofang commented 2 years ago

@carterbox If it helps, maybe we wanna try bumping the CUDA version (which is zipped with the compiler versions)? We chose 10.2 as the lowest common denominator supported in CF, but there's no real reason to stick to it.
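For context, the zipped pinnings come from the conda-forge-pinning package. Something like the following shows the zip (a sketch, assuming conda-forge-pinning is installed in the active environment, which places conda_build_config.yaml at the prefix root):

    grep -A 6 "zip_keys" "$CONDA_PREFIX/conda_build_config.yaml" | head -n 20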

carterbox commented 2 years ago

@leofang, we would need to bump the CUDA version to 11.1. I think that would solve the problem, because the compiler versions of the non-CUDA and CUDA builds would then match.

However, I really think having separate non-CUDA and CUDA builds is the simplest for maintenance, because you don't have to think about whether the compilers match AND the CUDA compiler can stay at the lowest common denominator.

Lemme know what you want.
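To check whether the compilers of two variants would match, one option is to render the recipe against each variant config (conda render is part of conda-build; the .ci_support file name below is illustrative):

    conda render recipe/ \
        -m .ci_support/linux_64_cuda_compiler_version10.2.yaml \
        | grep -E "compiler"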

carterbox commented 2 years ago

My original attempt (stopping with https://github.com/conda-forge/openmpi-feedstock/pull/92/commits/04fd665b88f0a5f75e0c64f1297a8763348d7728) doesn't do anything useful because, as I tried to explain, there is currently only one Linux build, and it uses gcc=7 instead of gcc=10. Rerendering doesn't change anything because the gcc version is locked to the CUDA version.

I will use the approach from https://github.com/conda-forge/openmpi-feedstock/issues/91#issuecomment-1062615457, where we simply don't use the openmpi-mpicc/fortran metapackages in recipes.
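For recipes that drop the metapackage, the compile line that openmpi-mpicc would have provided can be spelled out manually, based on the mpicc -show output above (a sketch; exact flags may differ by platform):

    "$CC" helloworld.c -I"$PREFIX/include" -L"$PREFIX/lib" \
        -Wl,-rpath,"$PREFIX/lib" -lmpi -o helloworld_c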