Open dingp opened 2 years ago
Changing the "target release" to v3.0.0 in the Projects for this issue.

This is an enhancement which is not critically needed at the first rollout of the spack release. `py-moo` is also only required for build. That will bring down the number of packages (especially the py-extension packages) in the loaded running environment.
I agree that we should go over the `package.py`s for the individual DUNE-DAQ packages (`appfwk`, etc.) and make sure the dependencies are as lightweight as possible. However, I don't believe this is something that makes sense for the umbrella packages (`dunedaq`, etc.). E.g., what would it mean to say that `cmake` is a build-only dependency of `devtools`, since `devtools` is based on `BundlePackage` and by definition isn't something that has a concept of "building" or "linking"?

Having said that, obviously we'd want to distinguish the set of packages for `dbt-setup-workarea` vs. `dbt-setup-release`. Perhaps we could have a build variant - e.g., `releaseonly` - where we'd specify that `devtools` is not a dependency of `externals` in the case of `externals +releaseonly`, etc.
I didn't realize that `BundlePackage` does not have dependency types (which makes sense). In that case, defining different build variants for `BundlePackage` achieves the same goal.
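A minimal sketch of what that could look like, assuming a hypothetical `releaseonly` variant on the `externals` umbrella package (package names are from this thread, but the recipe itself is illustrative, not the actual `package.py`):

```python
# Illustrative Spack recipe sketch, not the real externals/package.py.
from spack.package import *

class Externals(BundlePackage):
    """Umbrella package for DUNE DAQ external dependencies (sketch)."""

    version("1.0")

    variant("releaseonly", default=False,
            description="Leave out packages only needed in a work area")

    # devtools is only pulled in when we're NOT in a release-only install,
    # so `spack install externals +releaseonly` skips it.
    depends_on("devtools", when="~releaseonly")
```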
Should daq-cmake be loaded in when someone runs `dbt-setup-release`? Right now the only reason this is necessary is that many packages which use Python to generate configurations use the following line to tell `moo` where to find schema files:

```python
moo.io.default_load_path = get_moo_model_path()
```

where `get_moo_model_path()` is the one and only function in the `dunedaq` Python module in daq-cmake, a one-liner which does this:

```python
return [os.path.join(p, 'schema') for p in os.environ.get("DUNEDAQ_SHARE_PATH", "").split(':')]
```
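For illustration, here's that one-liner wrapped as a standalone function, with a hypothetical `DUNEDAQ_SHARE_PATH` value showing what it returns:

```python
import os

# Re-implementation of the get_moo_model_path() one-liner quoted above,
# runnable outside daq-cmake for illustration:
def get_moo_model_path():
    return [os.path.join(p, "schema")
            for p in os.environ.get("DUNEDAQ_SHARE_PATH", "").split(":")]

# Hypothetical share path listing two package installs:
os.environ["DUNEDAQ_SHARE_PATH"] = "/opt/dunedaq/appfwk:/opt/dunedaq/daqconf"
print(get_moo_model_path())
# -> ['/opt/dunedaq/appfwk/schema', '/opt/dunedaq/daqconf/schema']
```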
It seems there are currently a few options:

1. Load in daq-cmake in `dbt-setup-release`, adding logic to its `package.py` so cmake and py-pybind11 don't also get loaded in
2. Drop daq-cmake as something people need for a release environment, sacrifice `get_moo_model_path`, and have developers just write the list comprehension themselves
3. Don't load in daq-cmake, but have `dbt-setup-release` add daq-cmake's `python/` directory to the paths Python searches for modules
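Option (3) could be as small as a couple of lines in `dbt-setup-release`; a hedged sketch, with the daq-cmake install location as a placeholder:

```shell
# Sketch of option (3): make the dunedaq Python module importable without
# loading the daq-cmake Spack package. The path below is hypothetical.
DAQ_CMAKE_PYTHON="/path/to/daq-cmake/python"
export PYTHONPATH="${DAQ_CMAKE_PYTHON}${PYTHONPATH:+:${PYTHONPATH}}"
```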
Or have a `daq-cmake` variant containing the `python` directory but no `cmake`/`pybind11` dependency?
That's (1) in my list, right? I've been working on giving `dunedaq` a `dev` variant, where if you install it with `~dev` it leaves out the dependency on `daq-cmake`. What we could do is leave `daq-cmake` in but propagate the `dev` variant to it so it can leave out cmake and pybind11, e.g.:

```python
depends_on("cmake", when="+dev")
```
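Spelled out a bit more, the propagation could look like this in the `dunedaq` recipe - a sketch of the idea, not the actual `package.py`:

```python
# Illustrative fragment of a dunedaq umbrella recipe (Spack DSL):
variant("dev", default=True, description="Include developer dependencies")

# Propagate the dev setting down to daq-cmake, so ~dev installs
# don't drag in cmake/pybind11:
depends_on("daq-cmake+dev", when="+dev")
depends_on("daq-cmake~dev", when="~dev")
```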
What about moving `get_moo_model_path()` somewhere else and completely dropping `daq-cmake`?
So the question then becomes, where should "somewhere else" be, when we're in a release (and not a development) environment?
Also on the topic of a release environment: doesn't it seem like the demo described in https://dune-daq-sw.readthedocs.io/en/latest/packages/daqconf/InstructionsForCasualUsers/ should work in a release environment? Right now it doesn't, because the scripts expect the `DBT_WORKAREA_ENV` environment variable to be set.
To describe where things stand on the `johnfreeman/129_update_dependencies` branch at this point:

- The `dunedaq` umbrella package has a `dev` variant. When installed with `+dev` we basically recover its "traditional" contents. When built with `~dev`, dependencies for developers get left out. What that means is described below.
- A `~dev` build of `dunedaq` means that `devtools` (via `externals`) gets left out as a dependency.
- `daq-cmake` has been downgraded to a build-only dependency of DUNE DAQ packages. It's also had `cmake` and `py-moo` dependencies added.
- `daq-cmake` - for now - remains a dependency of `dunedaq`, but the `~dev` variant of `dunedaq` propagates to `daq-cmake` s.t. it doesn't drag in its dependencies (`cmake`, the `py-moo` dependency, etc.).
- The documentation page shown by `spack info <packagename>` is now discovered automatically by the `make-release-repo.py` script, so packages which don't have official documentation pages at least have their GitHub pages listed.

Changing the target release to v3.1.0 since this will require longer testing time.
Update since my post back in May:

- I've changed `daq-cmake`'s dependency on `cmake` to be `build`-type only. This means that `cmake` doesn't get dragged in when a user loads a `~dev` (production) variant of `dunedaq`.
- `daq-cmake` still needs a `run`-type dependency on `py-moo` and `py-pybind11` in order for packages to be able to call all of `daq-cmake`'s functions without separately specifying these dependencies. The downside is that this means the `py-moo` and `py-pybind11` Spack packages get loaded in even in the production environment.
- For packages which use `daq-cmake`'s `get_moo_model_path` python function, I've upgraded their dependency on `daq-cmake` s.t. they not only have a `build`-type dependency but a `run`-type dependency as well.
- In the `~dev` variant, `dunedaq` no longer explicitly depends on `daq-cmake`, even though as mentioned above some individual packages will use it at runtime.
- Parts of `gcc` are actually needed in the runtime environment, as is `openssh`, which he's added to the `systems` umbrella package since I last worked on this Issue. As such, we'll need to change things up a bit from our current model of "the `systems` umbrella package is loaded via the `devtools` package, but `devtools` is left out in the `~dev` variant of `dunedaq`".

Also something potentially useful when thinking about how to wrap this up: I was incorrect in claiming on April 28th that a `BundlePackage`-based umbrella package can't have a dependency `type`. Indeed it can, in the sense that, e.g., the `build` type should apply during `spack install <umbrella package>` and the `run` type should apply during `spack load`. So, another tool in our toolkit.
We've arrived at a fairly straightforward place.

- `run`-type dependencies on `py-moo` have been dropped, since `moo` is supplied in the Python virtual environment. Thus `py-moo` is no longer in `dunedaq` like it had been earlier on this feature branch.
- `externals+dev` depends on `devtools`; `externals~dev` depends on `systems`. Previously `externals~dev` didn't even `depend_on` `systems`, but Pengfei pointed out that `openssh` and parts of `gcc`, both dependencies of `systems`, are needed in the production environment.
- The packages left out of `dunedaq~dev` rather than `dunedaq+dev` are: `cmake`, `gdb` and `ninja` - i.e., the `devtools` dependencies.

See comment in https://github.com/DUNE-DAQ/daq-deliverables/issues/107 from a couple minutes ago for an update.
Defining the proper `run` or `build` type of a dependency helps to minimize the number of packages loaded by spack. It also helps to get a precise set of packages needed for the running environment. This will help to produce a slimmer docker image for running, which will be very useful for Kubernetes.

To achieve this, we will need to go through all the `depends_on` calls in the release repo templates' `package.py` files, and add dependency types to the `depends_on` calls. The `depends_on` entries in umbrella packages' `package.py` files are generated from release YAML files. The YAML file will need to have an additional attribute (e.g. `dependent_type`) for each external dependency and DAQ package. Even if a dependency meets all the types of `run`, `build`, `link`, it helps to put `type=('build', 'link', 'run')` explicitly in the `depends_on` call.

Some examples:
In `daq-release/spack-repos/externals/packages/devtools/package.py`, the existing untyped `depends_on` calls will be replaced by calls with explicit dependency types.
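As a hedged illustration (the dependency names below are examples, not the actual entries), the change would look something like:

```python
# Before: type is left implicit, defaulting to ('build', 'link'),
# which is wrong for tools that developers run:
depends_on("cmake")
depends_on("gdb")

# After: explicit run-type dependencies:
depends_on("cmake", type="run")
depends_on("gdb", type="run")
```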
In `daq-release/spack-repos/release-repo-template/packages/externals/package.py`, the `depends_on` entries will likewise be replaced by typed versions.
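Again as a hedged illustration, not the actual snippet:

```python
# Before: implicit default type:
depends_on("devtools")

# After: explicit about every context in which it's needed:
depends_on("devtools", type=("build", "link", "run"))
```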
For the release YAML file, each dependency entry will be changed to carry the new `dependent_type` attribute.
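As a hedged illustration of the shape of that change (package name, version, and types are placeholders):

```yaml
# Before: a dependency entry with no type information
- name: cmake
  version: "3.26.3"

# After: an added dependent_type attribute, written as a YAML list
# (YAML has no tuple type):
- name: cmake
  version: "3.26.3"
  dependent_type: ["build", "link", "run"]
```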
Additionally, the `make-release-repo.py` script needs an update to parse the `dependent_type` attribute - in particular, to convert the Python list into a tuple, since YAML supports only lists and not tuples - and add it to the generated `depends_on` entries.
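A minimal sketch of that list-to-tuple conversion, assuming a hypothetical helper inside `make-release-repo.py` (the function name and entry format are illustrative):

```python
# Hypothetical helper: turn one parsed YAML dependency entry into a
# depends_on() line for the generated package.py. YAML parsing gives us a
# list for dependent_type; the depends_on() convention uses a tuple.
def render_depends_on(entry):
    dep_type = tuple(entry.get("dependent_type", ["build", "link", "run"]))
    return f'depends_on("{entry["name"]}", type={dep_type})'

print(render_depends_on({"name": "cmake", "dependent_type": ["build", "run"]}))
# -> depends_on("cmake", type=('build', 'run'))
```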