darshan-hpc / darshan

Darshan I/O characterization tool

consider dropping explicit CC specification in darshan-util Spack package #1001

Closed carns closed 6 days ago

carns commented 1 month ago

The line in question is here:

https://github.com/spack/spack/blob/develop/var/spack/repos/builtin/packages/darshan-util/package.py#L85

Is this still needed with the current autotools build system (and the fact that the util and runtime portions of the tree are built with different spack packages now)?

There is some discussion of why this is now dangerous in Spack at https://github.com/spack/spack/issues/45507. The build system ends up losing any cflags= spec options if we bypass Spack's compiler wrappers.

This is sort of related to #995 in that it was making it difficult to do a Spack build of darshan-util with debugging symbols to double-check what's happening with zlib-ng.
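To illustrate the failure mode, here is a toy simulation (not Spack's actual code; all names are illustrative) of why an explicit `CC=<raw compiler>` drops `cflags=` from the spec: Spack injects those flags inside its compiler wrapper, so a build that bypasses the wrapper never sees them.

```python
def compile_command(cc, spec_cflags, use_wrapper):
    """Return the compile line the build system would actually run."""
    if use_wrapper:
        # Spack's cc wrapper splices the spec's cflags in before delegating
        # to the real compiler.
        return ["spack-cc-wrapper", *spec_cflags, "-c", "foo.c"]
    # An explicit CC=/path/to/gcc calls the raw compiler directly.
    return [cc, "-c", "foo.c"]

# With the wrapper, cflags="-g -O0" from the spec reach the compile line:
print(compile_command("gcc", ["-g", "-O0"], use_wrapper=True))
# ['spack-cc-wrapper', '-g', '-O0', '-c', 'foo.c']
# With CC forced to the raw compiler, they are silently dropped:
print(compile_command("gcc", ["-g", "-O0"], use_wrapper=False))
# ['gcc', '-c', 'foo.c']
```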

shanedsnyder commented 1 month ago

I'm not sure that it's necessary here, but it does match this code block from darshan-runtime: https://github.com/spack/spack/blob/develop/var/spack/repos/builtin/packages/darshan-runtime/package.py#L132

I'm not sure it's needed there either; we'd need to test this all out. I would assume we have similar issues on the runtime side (for both the plain C compiler and, similarly, the MPI compiler) in that case?

shanedsnyder commented 1 month ago

I was just trying out various combinations of the Darshan packages (darshan-runtime and darshan-util) and setting cflags as you did in the original Spack issue to get a better grasp on this. Ultimately, making this change in both packages seems to be okay -- it definitely doesn't appear to hurt anything in my testing.

For darshan-util, I observe the same behavior as you do: removing the line you mention doesn't seem to break anything and allows CFLAGS to be properly set.

For darshan-runtime, I actually don't observe the behavior at all when using the traditional +mpi mode (where we are using an MPI compiler). It seems you can properly set CFLAGS now for darshan-runtime+mpi. I'm not sure what's different between MPI and generic C compilers, but ultimately this line doesn't appear to bypass Spack's compiler wrappers in the same way we see with darshan-util: https://github.com/spack/spack/blob/97e691cdbf20904281f1d32a82434c05537fac17/var/spack/repos/builtin/packages/darshan-runtime/package.py#L133. I see the same behavior when removing the line where we explicitly set CC to the MPI compiler, so no problem there either.

If I try to build darshan-runtime~mpi (without MPI, using a generic C compiler), however, I do observe the same behavior we see in darshan-util. So this seems to support there being some difference in behavior between self.spec["mpi"].mpicc and self.compiler.cc.

I'll open up a PR on Spack for the Darshan packages to make this change and request your review so you can sanity-check it, but I'm on board with the changes.

scheibelp commented 1 month ago

> So seems to support there being some difference in behavior between self.spec["mpi"].mpicc and self.compiler.cc.

It might be worth differentiating the two cases:

Generally, we expect that self.spec["mpi"].mpicc will use the Spack compiler wrappers, while self.compiler.cc (being a direct reference to the underlying compiler) will not.
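To make that distinction concrete, here is a hedged sketch of what the relevant recipe fragment could look like under that model (method and attribute names follow Spack's AutotoolsPackage conventions; this is not the actual darshan-runtime recipe):

```python
def configure_args(self):
    args = []
    if "+mpi" in self.spec:
        # self.spec["mpi"].mpicc resolves to an mpicc that was itself built
        # against Spack's cc wrapper, so cflags= from the spec are still
        # injected further down the chain.
        args.append("CC=%s" % self.spec["mpi"].mpicc)
    # By contrast, appending "CC=%s" % self.compiler.cc would hand the build
    # the raw underlying compiler and bypass the wrappers entirely; omitting
    # it lets Spack's default CC (the wrapper) apply.
    return args
```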

shanedsnyder commented 1 month ago

Thanks for the explanation @scheibelp, that seems to make sense based on the initial observations I had.

I will try to get a PR up soon with changes to Darshan's Spack packages to make sure we aren't sidestepping the Spack compiler wrappers going forward.

shanedsnyder commented 6 days ago

This has been addressed in our Spack package, so closing this issue.