carns closed this 1 year ago
Looks good to me as far as my familiarity with spack goes. I'm not sure on the MPI issue though. So far I've only run things on Polaris and there I often run into issues with needing to load the CUDA modules. I'm not sure if that's also the case for theta-gpu, but https://github.com/spack/spack/issues/12520 seemed to have the same issue.
Interesting; thanks for the pointer to the spack issue. I don't think the explicit CUDA modules are needed on Polaris, but from the discussion on the issue you linked, it looks like this might be solvable by specifying the prefix in addition to the module for the external OpenMPI package. I'm going to go ahead and merge the changes in this PR (I believe they are orthogonal) and try out that workaround.
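For reference, the workaround would look roughly like this in a spack `packages.yaml`. This is only a sketch; the OpenMPI version, module name, and prefix path below are placeholders and would need to match the system's actual installation:

```yaml
packages:
  openmpi:
    externals:
    - spec: openmpi@4.1.4   # hypothetical version; match the system install
      modules: [openmpi/4.1.4]
      # specifying prefix alongside modules is the workaround discussed
      # in spack issue #12520
      prefix: /opt/openmpi/4.1.4
    buildable: false
```

Setting `buildable: false` forces spack to use the external installation rather than building its own OpenMPI.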
@jhendersonHDF and @vchoi-hdfgroup does this look OK to you? I'm trying to get it running and got closer after these changes, but I still hit errors that look like this before it finishes installing everything:
If you are not seeing it, this is possibly a problem triggered by a recent spack update.