Closed ax3l closed 5 years ago
Thanks for the report!
As a workaround, you can configure with --enable-mpi-ext=affinity,cuda ...
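A sketch of the suggested workaround. Only the --enable-mpi-ext flag comes from the comment above; the configure invocation is commented out and the sanity check is purely illustrative. Since pcollreq is the extension whose Fortran subroutines trigger the failure, restricting the extension list keeps it out of the build:

```shell
# Illustrative only: confirm pcollreq is absent from the extension list
# before configuring (the configure line itself is not run here).
ext_list="affinity,cuda"
# ./configure --enable-mpi-ext=${ext_list} ...
case ",${ext_list}," in
    *,pcollreq,*) echo "pcollreq would still be built" ;;
    *)            echo "pcollreq excluded" ;;
esac
```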
At first glance, the root cause is that this Fortran line is too long, and I suspect this is because you invoke /full_path/configure
(and I guess Spack will not change that in the near future, which is perfectly acceptable to me).
Anyway, I will explore some options on how to avoid this issue.
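To illustrate the failure mode, here is a minimal sketch; the stage path and the generated Fortran line below are hypothetical, not taken from the actual build. Free-form Fortran sources are limited to 132 columns by default in gfortran (unless a flag such as -ffree-line-length-none is passed), so a generated line that embeds a long absolute configure path can easily overrun the limit:

```shell
# Sketch only: both the stage path and the Fortran line are made up.
path="/tmp/spack-stage/spack-stage-openmpi-4.0.0/openmpi-4.0.0"
line="      character(len=*), parameter :: cmdline = '${path}/configure --enable-mpi-ext=affinity,cuda'"
echo "generated line is ${#line} characters"
if [ "${#line}" -gt 132 ]; then
    echo "exceeds the 132-column free-form default"
fi
```

The deeper the absolute path used to invoke configure, the longer such generated lines become, which is why long Spack stage paths surface the problem.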
@hppritcha @bwbarrett FYI, we might want to raise the severity of this issue and release 4.0.1
earlier than expected (I am still unclear on how we should deal with the oshmem
situation that was reported on the mailing list).
@ggouaillardet It's pretty common for us to release an x.y.1 version pretty soon after x.y.0, because inevitably people discover things in .0 that we missed in testing. 🙁
@ggouaillardet I merged #6109 into master. I don't think that this was technically a regression (i.e., long paths would have caused the same problem for quite a while), but the new extensions certainly did shine a light on the issue. So we should definitely get this fix into v4.0.1.
@jsquyres this is not a regression (the issue has always been there), and we only face it now because pcollreq
is the first MPI extension that has Fortran subroutines. I made #6121 for the v4.0.x
branch.
I just updated the title of this issue to reflect that it's not the compiler that is the issue -- it's the very-long-path that is the issue.
I think this is now resolved. Closing.
Background information
What version of Open MPI are you using?
4.0.0
Describe how Open MPI was installed
Trying to install via spack (3557b6e14903c02bcc0a19b34c8b4c64f3d05ca3) from source with GCC/GFortran 4.9.2:
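For context, the reported build corresponds to a Spack spec along these lines; the spec string is an assumption reconstructed from the versions in the report, and spack itself is not invoked here:

```shell
# Hypothetical Spack spec: Open MPI 4.0.0 compiled with GCC/GFortran 4.9.2.
spec="openmpi@4.0.0 %gcc@4.9.2"
echo "spack install ${spec}"
```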
Please describe the system on which you are running
Dependencies:
Details of the problem
Build, configured with:
fails in
make -j 4
with:
Full build output: spack-build.out.gz