Open mkavulich opened 6 months ago
Just checking in on this if there's any more action or information needed from our end for this installation request. We would like to have a test install ready on Hera (or any other UFS tier-1 platform aside from WCOSS) in the next few weeks so we can demonstrate at least a test capability by the end of our period of performance (end of June), is this a realistic timeline?
FWIW it looks like there are not currently recipes for these three packages in Spack. Thankfully, they all have setup.py scripts, so creating recipes should be pretty straightforward. Since this is for SRW, can someone from EPIC take the lead on this? @ulmononian @natalie-perlin @RatkoVasic-NOAA ?
Sure, since I never did that, I'm asking Rick Grubin to show me how to do it.
I believe metcalpy is there - I just stumbled over it when working on an NRL plotting package.
Which spack repo / release or branch? Not showing up in the definitive and JCSDA repos. Apologies if I'm being thick and missing it.
I mixed it up with metpy - my bad
@AlexanderRichert-NOAA @RatkoVasic-NOAA Thanks for your work so far. Let me know if there's any help or info our team can provide to help the process along.
@mkavulich must the components called out in respective requirements.txt files be pinned to the specific versions stated? e.g. for metcalcpy: numpy==1.24.2. It seems 'yes' in that, for this simple example, numpy versions differ across versions of metcalcpy. These were generated from pip freeze, perhaps? Thanks.
@rickgrubin-noaa Our group doesn't maintain the MET code so I'm not sure, but I am pretty sure these are minimum versions, not hard requirements. Is the problem that this numpy version is in conflict with other python packages in spack-stack?
Thanks; after consultation with spack-stack folks, we'll work toward relaxing the numpy version hard requirement, as other packages run into the same collision.
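For illustration, "relaxing" such a pin typically means changing a pip-freeze-style exact pin to a minimum-version constraint in requirements.txt. A hypothetical diff (not the actual METcalcpy change):

```diff
- numpy==1.24.2   # exact pin, as emitted by pip freeze
+ numpy>=1.24.2   # minimum version, letting the installer/stack choose
```

An exact `==` pin forces the resolver to a single version; a `>=` lower bound lets the surrounding stack satisfy the requirement with whatever compatible version it already provides.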
@RatkoVasic-NOAA, @rickgrubin-noaa, @mkavulich, @climbfuji, just checking to see if there is an update on inclusion of METplotpy, METcalcpy, and METdataio into spack-stack. Thank you!
The blocker is gone (update of versions in packages.yaml); my understanding is that @rickgrubin-noaa is still working on this. Once we've got the updates in spack-stack develop, they will be slated for roll-out in spack-stack-1.8.0 (around early September).
Hi @JeffBeck-NOAA -- was away for ~3wks. Will be merging the latest changes to spack-stack into my branch, updating the recipes for required components that do not have existing recipes, and rebuilding test envs; updates to follow.
Thanks, @rickgrubin-noaa!
We should target numpy 1.25.x if at all possible. That's the latest version that works for all our Python packages - 1.26 breaks many of them.
On hera, as user myself (not the EPIC role account), and with latest commits that enforce numpy 1.25.x and other requirements per configs/common/packages.yaml:

metcalcpy and metplotpy need scikit-image per package requirements:
- metcalcpy: scikit-image@0.18.1 or above
- metplotpy: scikit-image@0.19.3

As configured, concretization settles on py-setuptools@63.4.3 and scikit-image@0.18.3 for a spack env that upstream chains to the Intel-based stack v1.7.0. Trying to use newer versions of py-setuptools results in concretization errors combined with py-cython.

When attempting an install, scikit-image@0.18.3 not only doesn't meet requirements for metplotpy, it also fails to install at all:
961 INFO: C compiler: /scratch1/NCEPDEV/nems/Richard.Grubin/git/spack-stack/spack/lib/spack/env/intel/icc -Wsign-compare -Wunreachable-code -DNDEBUG -g -O3 -Wall -fPIC -fPIC
962
963 INFO: compile options: '-I/scratch1/NCEPDEV/nems/Richard.Grubin/envs/met.hera/install/intel/2021.5.0/python-venv-1.0-ty237wr/in
clude -I/scratch1/NCEPDEV/nems/Richard.Grubin/envs/met.hera/install/intel/2021.5.0/python-3.10.13-mldz2f2/include/python3.10 -c'
964 extra options: '-xKNM -Werror'
965 WARN: CCompilerOpt.dist_test[636] : CCompilerOpt._dist_test_spawn[770] : Command (/scratch1/NCEPDEV/nems/Richard.Grubin/git/spa
ck-stack/spack/lib/spack/env/intel/icc -Wsign-compare -Wunreachable-code -DNDEBUG -g -O3 -Wall -fPIC -fPIC -I/scratch1/NCEPDEV/ne
ms/Richard.Grubin/envs/met.hera/install/intel/2021.5.0/python-venv-1.0-ty237wr/include -I/scratch1/NCEPDEV/nems/Richard.Grubin/en
vs/met.hera/install/intel/2021.5.0/python-3.10.13-mldz2f2/include/python3.10 -c /scratch1/NCEPDEV/nems/Richard.Grubin/envs/met.he
ra/install/intel/2021.5.0/py-numpy-1.25.2-2rmiq6b/lib/python3.10/site-packages/numpy/distutils/checks/cpu_avx512_knm.c -o /tmp/tm
pk8t6sb52/scratch1/NCEPDEV/nems/Richard.Grubin/envs/met.hera/install/intel/2021.5.0/py-numpy-1.25.2-2rmiq6b/lib/python3.10/site-p
ackages/numpy/distutils/checks/cpu_avx512_knm.o -MMD -MF /tmp/tmpk8t6sb52/scratch1/NCEPDEV/nems/Richard.Grubin/envs/met.hera/inst
all/intel/2021.5.0/py-numpy-1.25.2-2rmiq6b/lib/python3.10/site-packages/numpy/distutils/checks/cpu_avx512_knm.o.d -xKNM -Werror)
failed with exit status 4 output ->
966 icc: command line warning #10121: overriding '-march=haswell' with '-xKNM'
>> 967 ": internal error: IERROR_MODULE_ID_1102
968
969 compilation aborted for /scratch1/NCEPDEV/nems/Richard.Grubin/envs/met.hera/install/intel/2021.5.0/py-numpy-1.25.2-2rmiq6b/lib/
python3.10/site-packages/numpy/distutils/checks/cpu_avx512_knm.c (code 4)
970
971 WARN: CCompilerOpt.feature_test[1575] : testing failed
972 INFO: CCompilerOpt.__init__[1815] : skip features (SSE SSE2 SSE3) since its part of baseline
973 INFO: CCompilerOpt.__init__[1819] : initialize targets groups
...
1375 ^
1376
1377 skimage/feature/_cascade.cpp(112214): warning #1292: unknown attribute "fallthrough"
1378 CYTHON_FALLTHROUGH;
1379 ^
1380
>> 1381 ": internal error: ** The compiler has encountered an unexpected problem.
1382 ** Segmentation violation signal raised. **
1383 Access violation or stack overflow. Please contact Intel Support for assistance.
Need assistance / discussion as to whether or not this work requires using the role account, and the accepted method(s) for explicitly requiring certain package versions / range of versions for requirements.
Marking as blocked; versions for various python packages, in particular py-setuptools (upon which a great deal depends):
==> Concretized py-setuptools%intel
- kld7qgg py-setuptools@63.4.3%intel@2021.5.0 build_system=generic arch=linux-rocky8-haswell
[e] mxvoe7u ^glibc@2.28%intel@2021.5.0 build_system=autotools arch=linux-rocky8-haswell
- jcmbfvi ^py-pip@23.1.2%intel@2021.5.0 build_system=generic arch=linux-rocky8-haswell
- blg2gbg ^python@3.10.13%intel@2021.5.0+bz2+crypt+ctypes+dbm~debug+libxml2+lzma~nis~optimizations+pic+pyexpat+pythoncmd+readline+shared+sqlite3+ssl~tkinter+uuid+zlib build_system=generic patches=0d98e93,7d40923,ebdca64,f2fd060 arch=linux-rocky8-haswell
[e] lpyczhf ^bzip2@1.0.6%intel@2021.5.0~debug~pic+shared build_system=generic arch=linux-rocky8-haswell
- isjrtpq ^expat@2.6.2%intel@2021.5.0+libbsd build_system=autotools arch=linux-rocky8-haswell
- wecsnwm ^libbsd@0.12.1%intel@2021.5.0 build_system=autotools arch=linux-rocky8-haswell
- eupt4k5 ^libmd@1.0.4%intel@2021.5.0 build_system=autotools arch=linux-rocky8-haswell
- r3dqdul ^gdbm@1.23%intel@2021.5.0 build_system=autotools arch=linux-rocky8-haswell
[e] xqgzbbl ^gettext@0.19.8.1%intel@2021.5.0+bzip2+curses+git~libunistring+libxml2+pic+shared+tar+xz build_system=autotools patches=9acdb4e arch=linux-rocky8-haswell
[e] fesaeeg ^gmake@3.82%intel@2021.5.0~guile build_system=generic patches=ca60bd9 arch=linux-rocky8-haswell
- 3vzglds ^libffi@3.4.6%intel@2021.5.0 build_system=autotools arch=linux-rocky8-haswell
- 3toxiab ^libxcrypt@4.4.35%intel@2021.5.0~obsolete_api build_system=autotools patches=4885da3 arch=linux-rocky8-haswell
[e] hzm46vt ^perl@5.16.3%intel@2021.5.0~cpanm+opcode+open+shared+threads build_system=generic patches=0eac10e,3bbd7d6 arch=linux-rocky8-haswell
[e] akhzxhm ^ncurses@5.9.20130511%intel@2021.5.0~symlinks+termlib abi=5 build_system=autotools patches=daee321,f84b270 arch=linux-rocky8-haswell
- 3opswd6 ^openssl@3.3.0%intel@2021.5.0~docs+shared build_system=generic certs=mozilla arch=linux-rocky8-haswell
- sgoeeip ^ca-certificates-mozilla@2023-05-30%intel@2021.5.0 build_system=generic arch=linux-rocky8-haswell
[e] ijo4xqm ^pkg-config@0.27.1%intel@2021.5.0+internal_glib build_system=autotools patches=49ffcd6 arch=linux-rocky8-haswell
- slunwfa ^readline@8.2%intel@2021.5.0 build_system=autotools patches=bbf97f1 arch=linux-rocky8-haswell
- draficb ^sqlite@3.43.2%intel@2021.5.0+column_metadata+dynamic_extensions+fts~functions+rtree build_system=autotools arch=linux-rocky8-haswell
- xnwdjcw ^util-linux-uuid@2.38.1%intel@2021.5.0 build_system=autotools arch=linux-rocky8-haswell
[e] xin7ooq ^xz@5.2.2%intel@2021.5.0~pic build_system=autotools libs=shared,static arch=linux-rocky8-haswell
- ks4wym4 ^zlib-ng@2.1.6%intel@2021.5.0+compat+new_strategies+opt+pic~shared build_system=autotools arch=linux-rocky8-haswell
- r6lyonh ^python-venv@1.0%intel@2021.5.0 build_system=generic arch=linux-rocky8-haswell
concretize to versions that won't support minimum package version requirements for:
- metcalcpy: scikit-image@0.18.1 or above
- metplotpy: scikit-image@0.19.3
With the above concretization, scikit-image will concretize to v0.18.3 -- other python packages must be upgraded in order to satisfy (at least) metplotpy, thus this is a bigger issue requiring a broader discussion.
Steps to demonstrate package dependencies are concretized to versions that are insufficient to meet requirements for metcalcpy / metdataio / metplotpy (recipes for py-eofs / py-imutils / py-opencv-python are needed in addition to py-metcalcpy / py-metdataio / py-metplotpy; these are straightforward as they'll install from pypi):
grubin@hera-hfe06% spack stack create env --site hera --template empty --name met.hera --dir $SCRATCH/envs --overwrite
Configuring basic directory information ...
... script directory: /scratch1/NCEPDEV/nems/Richard.Grubin/git/spack-stack/spack-ext/lib/jcsda-emc/spack-stack/stack
... base directory: /scratch1/NCEPDEV/nems/Richard.Grubin/git/spack-stack/spack-ext/lib/jcsda-emc/spack-stack
... spack directory: /scratch1/NCEPDEV/nems/Richard.Grubin/git/spack-stack/spack
==> Environment /scratch1/NCEPDEV/nems/Richard.Grubin/envs/met.hera exists. Overwriting...
Creating environment from command-line args
Copying site includes from /scratch1/NCEPDEV/nems/Richard.Grubin/git/spack-stack/configs/sites/tier1/hera ...
... to /scratch1/NCEPDEV/nems/Richard.Grubin/envs/met.hera/site
Copying common includes from /scratch1/NCEPDEV/nems/Richard.Grubin/git/spack-stack/configs/common/modules_lmod.yaml ...
... to /scratch1/NCEPDEV/nems/Richard.Grubin/envs/met.hera/common/modules.yaml
Successfully wrote environment at /scratch1/NCEPDEV/nems/Richard.Grubin/envs/met.hera/spack.yaml
Checked user umask and found no issues (0022)
==> Created environment /scratch1/NCEPDEV/nems/Richard.Grubin/envs/met.hera
grubin@hera-hfe06% spack env activate /scratch1/NCEPDEV/nems/Richard.Grubin/envs/met.hera
# edit spack.yaml to add the following specs:
- py-metcalcpy%intel
- py-metdataio%intel
- py-metplotpy%intel
grubin@hera-hfe06% spack concretize 2>&1 | tee $SCRATCH/envs/met.hera/log.concretize
With the above concretization, scikit-image (which, for example, also fails to compile, see above) will concretize to v0.18.3 (minimum required version is v0.19.3) -- other python packages, including at least metpy / netcdf4 / pint, must also be upgraded in order to satisfy (at least) metplotpy.
I suspect that part of the problem is linked to:
py-setuptools:
  require: '@63.4.3'
which seems to necessarily downgrade some package versions in order to be satisfied.

File log.concretize is attached.
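For reference, spack's `require` syntax accepts version ranges, so one hypothetical way to relax the pin above would be a trailing-colon range rather than an exact version (the actual bound chosen would need validating against the rest of the stack):

```yaml
# configs/common/packages.yaml (hypothetical relaxation, not the committed change)
py-setuptools:
  require:
  - '@63.4.3:'   # trailing colon: 63.4.3 or newer, rather than exactly 63.4.3
```

An open-ended range gives the concretizer room to pick a version that also satisfies other packages' constraints, instead of forcing downgrades elsewhere.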
I see that this was removed from the list of spack-stack 1.8 features. Can we get an update on the status of this issue? If I am following correctly, the holdup is that there are other packages already in spack-stack that would need to be upgraded in order to accommodate the requirements of metplotpy?
As described in the comment from August 6, there is at least this issue for attempting to integrate into spack-stack v1.7.0:

scikit-image (for example) will concretize to v0.18.3 (minimum required version is v0.19.3) -- other python packages, including at least metpy / netcdf4 / pint, must also be upgraded in order to satisfy (at least) metplotpy.
Further, the pinning of py-setuptools
py-setuptools:
  require: '@63.4.3'
appears to necessarily downgrade some other package versions -- which then do not satisfy other package requirements -- in order for said pinned version's requirements to be met.
Per today's biweekly spack-stack meeting, this issue will be deferred until v1.8.0 is released, at which time a newer version of python (v3.11.7) will be the default. At that time, package version pinning will be relaxed in an attempt to satisfy requirements for METcalcpy / METplotpy / METdataio while still building a viable and valid stack.
@mkavulich This doesn't necessarily mean that you have to wait until the next release. We can provide a test stack on one platform after we figure out the dependencies, and if that works we can make addon environments to the existing 1.8.0 stack available on selected platforms (likely not all).
With spack-stack @ release/1.8.0:

Created a unified-dev env and unpinned versions for
- py-setuptools
- py-numpy

This yielded a concretized environment that successfully built metcalcpy / metdataio / metplotpy with adequately versioned dependent packages. Concretization chose:
- py-setuptools@69.2.0
- py-numpy@1.25.2

However, it also creates a duplicate concretization, choosing py-setuptools@63.4.3 for py-numpy@1.25.2 and py-matplotlib@3.7.4 (and it seems we must stay at py-matplotlib@3.7.4 per this issue), and duplicate concretizations for many other packages (as identified by show_duplicate_packages.py).

The entire env successfully installs.

It's not clear to me how to rectify this duplicate concretization -- suggestions / help appreciated, thanks.
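For anyone following along, the duplicate check that show_duplicate_packages.py performs amounts to grouping concretized specs by package name and flagging names seen at more than one version. A simplified, self-contained sketch (hypothetical helper and data, not the actual script):

```python
from collections import defaultdict

def find_duplicate_concretizations(specs):
    """Return package names concretized at more than one version,
    mapped to the sorted list of versions seen."""
    seen = defaultdict(set)
    for name, version in specs:
        seen[name].add(version)
    return {n: sorted(v) for n, v in seen.items() if len(v) > 1}

# Hypothetical excerpt of a concretized env, mirroring the situation above:
env = [
    ("py-setuptools", "69.2.0"),
    ("py-setuptools", "63.4.3"),   # build-time duplicate
    ("py-numpy", "1.25.2"),
    ("py-matplotlib", "3.7.4"),
]
print(find_duplicate_concretizations(env))  # {'py-setuptools': ['63.4.3', '69.2.0']}
```

In a real environment the input would come from something like `spack find --format "{name} {version}"` rather than a hard-coded list.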
Spack allows duplicates for dependencies with type=build, even for unify:true. This happens a lot with py-setuptools and other packages. One solution might be to just ignore it, then blacklist the py-setuptools module (though this is still not an ideal solution as it can still lead to problems when reconcretizing an environment part way through installation). Are you able to pin a single version of py-setuptools? I'm a bit confused as to why it's allowing py-numpy@1.25.2 to be concretized based on the version requirement...
I misspoke above (fixed the comment; I apologize for the confusion).
With spack-stack @ release/1.8.0, for a simple environment for only metcalcpy / metdataio / metplotpy:

Pinning py-setuptools@69.2.0 and not py-numpy eliminates duplicates, but concretizes to py-numpy@1.26.4, which, it seems, is bad juju for other python packages.

Pinning py-numpy@:1.24.2 (note the leading colon) and py-setuptools@69.2.0 yields
==> Error: concretization failed for the following reasons:
1. cannot satisfy a requirement for package 'py-setuptools'.

Pinning py-numpy@1.24.2 (note the absence of a colon) and py-setuptools@69.2.0 yields
==> Error: concretization failed for the following reasons:
1. cannot satisfy a requirement for package 'py-setuptools'.

Pinning py-numpy@1.24.2: (note the trailing colon) and py-setuptools@69.2.0 concretizes to py-numpy@1.26.4.

Pinning py-numpy@1.25.2 and py-setuptools@69.2.0 yields [take note!]
==> Error: concretization failed for the following reasons:
1. cannot satisfy a requirement for package 'py-setuptools'.

Further, with this simplified metpy env to install only metcalcpy / metdataio / metplotpy, unpinning both py-setuptools and py-numpy results in a concretization of:
- py-setuptools@69.2.0
- py-numpy@1.25.2

This is the same as the unified-dev concretization, and the concretized environment successfully built metcalcpy / metdataio / metplotpy with adequately versioned dependent packages.

However, as above, it also creates a duplicate concretization, choosing py-setuptools@63.4.3 for py-numpy@1.25.2 and py-matplotlib@3.7.4 (and it seems we must stay at py-matplotlib@3.7.4 per this issue), and duplicate concretizations for many other packages (as identified by show_duplicate_packages.py).
This confuses me; allowing the concretizer to choose package versions is OK, but then pinning packages to those versions is not OK. I'll compare concretization logs of the two scenarios and try to ascertain what's different.
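For readers puzzling over the leading/trailing colons in the experiments above: spack's colon denotes an open-ended version range. A simplified illustration in plain Python (real spack also handles mixed-length and non-numeric versions, so this is only a sketch of the semantics):

```python
def satisfies(version, spec):
    """Simplified spack version-range check:
    '@X'   -> exactly X
    '@:X'  -> any version <= X  (leading colon)
    '@X:'  -> any version >= X  (trailing colon)
    '@A:B' -> A <= version <= B
    """
    def key(v):
        # Compare dotted versions component-wise as integers.
        return tuple(int(part) for part in v.split("."))
    body = spec.lstrip("@")
    if ":" not in body:
        return key(version) == key(body)
    lo, hi = body.split(":")
    if lo and key(version) < key(lo):
        return False
    if hi and key(version) > key(hi):
        return False
    return True

print(satisfies("1.26.4", "@:1.24.2"))  # False -- upper bound excludes it
print(satisfies("1.26.4", "@1.24.2:"))  # True  -- open-ended upper bound
print(satisfies("1.24.2", "@1.24.2"))   # True  -- exact match
```

So py-numpy@1.24.2: permits 1.26.4 (hence the concretizer chose it), while py-numpy@:1.24.2 forbids it.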
Following on to the previous comment:

Pinning py-numpy at some version less than 1.26, e.g. py-numpy@1.25.2, and unpinning py-setuptools is viable. There is duplicate concretization with respect to py-setuptools for a few packages (concretizing at py-setuptools@63.4.3), notably py-matplotlib and py-numpy (!).

Can't seem to come up with anything that doesn't have duplicate concretizations for py-setuptools.
The culprit is probably numpy@1.25. If you look at the package definition, it limits setuptools to @:63. Try to use py-numpy@1.26 (and remove the pinning of py-shapely entirely - 1.8.0 is ancient).
py-numpy@1.26 -- is this comment still applicable? I have been attempting to adhere to it:

configs/common/packages.yaml:
# py-numpy@1.26 causes many build problems with older Python packages
# also check Nautilus site config when making changes here
# https://github.com/JCSDA/spack-stack/issues/1276
py-numpy:
  require:
  - '@:1.23.5'
I was able to build a decent stack on my dev system (Oracle Linux 9) with gcc@13. It's just a suggestion to see if it helps your problem.
py-setuptools@69.2.0 and py-numpy@1.26.4 concretizes without duplicates. I'd seen this prior, but was trying to adhere to the warning message in configs/common/packages.yaml.

That said, with:
- https://github.com/rickgrubin-noaa/spack-stack/tree/SI-1052 which sets up the versions noted above, and
- https://github.com/rickgrubin-noaa/spack/tree/SI-1052 which contains spack packages for:
  - py-metcalcpy
  - py-metdataio
  - py-metplotpy
  - py-eofs
  - py-imutils
  - py-opencv-python

envs for py-metcalcpy and py-metplotpy are successfully created:

py-metcalcpy | /scratch1/NCEPDEV/nems/Richard.Grubin/envs/metcalcpy.hera.intel
py-metplotpy | /scratch1/NCEPDEV/nems/Richard.Grubin/envs/metplotpy.hera.intel

whereas py-metdataio | /scratch1/NCEPDEV/nems/Richard.Grubin/envs/metdataio.hera.intel fails to install metdataio (see log.install there):
==> Installing py-metdataio-2.1.1-rn733io6rdezerbrtv7jiteimvg4wvoa [69/69]
==> No binary for py-metdataio-2.1.1-rn733io6rdezerbrtv7jiteimvg4wvoa found: installing from source
==> Fetching https://github.com/dtcenter/METdataio/archive/refs/tags/v2.1.1.tar.gz
==> No patches needed for py-metdataio
==> py-metdataio: Executing phase: 'install'
[...]
Running command Preparing metadata (pyproject.toml)
Preparing metadata (pyproject.toml): finished with status 'done'
ERROR: Exception:
Traceback (most recent call last):
File "/scratch1/NCEPDEV/nems/Richard.Grubin/envs/metdataio.hera.intel/install/intel/2021.5.0/py-pip-23.1.2-ddjfool/lib/python3.11/site-packages/pip/_internal/cli/base_command.py", line 169, in exc_logging_wrapper
status = run_func(*args)
^^^^^^^^^^^^^^^
File "/scratch1/NCEPDEV/nems/Richard.Grubin/envs/metdataio.hera.intel/install/intel/2021.5.0/py-pip-23.1.2-ddjfool/lib/python3.11/site-packages/pip/_internal/cli/req_command.py", line 248, in wrapper
return func(self, options, args)
^^^^^^^^^^^^^^^^^^^^^^^^^
File "/scratch1/NCEPDEV/nems/Richard.Grubin/envs/metdataio.hera.intel/install/intel/2021.5.0/py-pip-23.1.2-ddjfool/lib/python3.11/site-packages/pip/_internal/commands/install.py", line 377, in run
requirement_set = resolver.resolve(
^^^^^^^^^^^^^^^^^
File "/scratch1/NCEPDEV/nems/Richard.Grubin/envs/metdataio.hera.intel/install/intel/2021.5.0/py-pip-23.1.2-ddjfool/lib/python3.11/site-packages/pip/_internal/resolution/resolvelib/resolver.py", line 73, in resolve
collected = self.factory.collect_root_requirements(root_reqs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/scratch1/NCEPDEV/nems/Richard.Grubin/envs/metdataio.hera.intel/install/intel/2021.5.0/py-pip-23.1.2-ddjfool/lib/python3.11/site-packages/pip/_internal/resolution/resolvelib/factory.py", line 491, in collect_root_requirements
req = self._make_requirement_from_install_req(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/scratch1/NCEPDEV/nems/Richard.Grubin/envs/metdataio.hera.intel/install/intel/2021.5.0/py-pip-23.1.2-ddjfool/lib/python3.11/site-packages/pip/_internal/resolution/resolvelib/factory.py", line 453, in _make_requirement_from_install_req
cand = self._make_candidate_from_link(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/scratch1/NCEPDEV/nems/Richard.Grubin/envs/metdataio.hera.intel/install/intel/2021.5.0/py-pip-23.1.2-ddjfool/lib/python3.11/site-packages/pip/_internal/resolution/resolvelib/factory.py", line 206, in _make_candidate_from_link
self._link_candidate_cache[link] = LinkCandidate(
^^^^^^^^^^^^^^
File "/scratch1/NCEPDEV/nems/Richard.Grubin/envs/metdataio.hera.intel/install/intel/2021.5.0/py-pip-23.1.2-ddjfool/lib/python3.11/site-packages/pip/_internal/resolution/resolvelib/candidates.py", line 293, in __init__
super().__init__(
File "/scratch1/NCEPDEV/nems/Richard.Grubin/envs/metdataio.hera.intel/install/intel/2021.5.0/py-pip-23.1.2-ddjfool/lib/python3.11/site-packages/pip/_internal/resolution/resolvelib/candidates.py", line 156, in __init__
self.dist = self._prepare()
^^^^^^^^^^^^^^^
File "/scratch1/NCEPDEV/nems/Richard.Grubin/envs/metdataio.hera.intel/install/intel/2021.5.0/py-pip-23.1.2-ddjfool/lib/python3.11/site-packages/pip/_internal/resolution/resolvelib/candidates.py", line 225, in _prepare
dist = self._prepare_distribution()
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/scratch1/NCEPDEV/nems/Richard.Grubin/envs/metdataio.hera.intel/install/intel/2021.5.0/py-pip-23.1.2-ddjfool/lib/python3.11/site-packages/pip/_internal/resolution/resolvelib/candidates.py", line 304, in _prepare_distribution
return preparer.prepare_linked_requirement(self._ireq, parallel_builds=True)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/scratch1/NCEPDEV/nems/Richard.Grubin/envs/metdataio.hera.intel/install/intel/2021.5.0/py-pip-23.1.2-ddjfool/lib/python3.11/site-packages/pip/_internal/operations/prepare.py", line 516, in prepare_linked_requirement
return self._prepare_linked_requirement(req, parallel_builds)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/scratch1/NCEPDEV/nems/Richard.Grubin/envs/metdataio.hera.intel/install/intel/2021.5.0/py-pip-23.1.2-ddjfool/lib/python3.11/site-packages/pip/_internal/operations/prepare.py", line 631, in _prepare_linked_requirement
dist = _get_prepared_distribution(
^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/scratch1/NCEPDEV/nems/Richard.Grubin/envs/metdataio.hera.intel/install/intel/2021.5.0/py-pip-23.1.2-ddjfool/lib/python3.11/site-packages/pip/_internal/operations/prepare.py", line 69, in _get_prepared_distribution
abstract_dist.prepare_distribution_metadata(
File "/scratch1/NCEPDEV/nems/Richard.Grubin/envs/metdataio.hera.intel/install/intel/2021.5.0/py-pip-23.1.2-ddjfool/lib/python3.11/site-packages/pip/_internal/distributions/sdist.py", line 61, in prepare_distribution_metadata
self.req.prepare_metadata()
File "/scratch1/NCEPDEV/nems/Richard.Grubin/envs/metdataio.hera.intel/install/intel/2021.5.0/py-pip-23.1.2-ddjfool/lib/python3.11/site-packages/pip/_internal/req/req_install.py", line 555, in prepare_metadata
self.metadata_directory = generate_metadata(
^^^^^^^^^^^^^^^^^^
File "/scratch1/NCEPDEV/nems/Richard.Grubin/envs/metdataio.hera.intel/install/intel/2021.5.0/py-pip-23.1.2-ddjfool/lib/python3.11/site-packages/pip/_internal/operations/build/metadata.py", line 35, in generate_metadata
distinfo_dir = backend.prepare_metadata_for_build_wheel(metadata_dir)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/scratch1/NCEPDEV/nems/Richard.Grubin/envs/metdataio.hera.intel/install/intel/2021.5.0/py-pip-23.1.2-ddjfool/lib/python3.11/site-packages/pip/_internal/utils/misc.py", line 713, in prepare_metadata_for_build_wheel
return super().prepare_metadata_for_build_wheel(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/scratch1/NCEPDEV/nems/Richard.Grubin/envs/metdataio.hera.intel/install/intel/2021.5.0/py-pip-23.1.2-ddjfool/lib/python3.11/site-packages/pip/_vendor/pyproject_hooks/_impl.py", line 186, in prepare_metadata_for_build_wheel
return self._call_hook('prepare_metadata_for_build_wheel', {
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/scratch1/NCEPDEV/nems/Richard.Grubin/envs/metdataio.hera.intel/install/intel/2021.5.0/py-pip-23.1.2-ddjfool/lib/python3.11/site-packages/pip/_vendor/pyproject_hooks/_impl.py", line 321, in _call_hook
raise BackendUnavailable(data.get('traceback', ''))
pip._vendor.pyproject_hooks._impl.BackendUnavailable: Traceback (most recent call last):
File "/scratch1/NCEPDEV/nems/Richard.Grubin/envs/metdataio.hera.intel/install/intel/2021.5.0/py-pip-23.1.2-ddjfool/lib/python3.11/site-packages/pip/_vendor/pyproject_hooks/_in_process/_in_process.py", line 77, in _build_backend
obj = import_module(mod_path)
^^^^^^^^^^^^^^^^^^^^^^^
File "/scratch1/NCEPDEV/nems/Richard.Grubin/envs/metdataio.hera.intel/install/intel/2021.5.0/python-3.11.7-m2gbofx/lib/python3.11/importlib/__init__.py", line 126, in import_module
return _bootstrap._gcd_import(name[level:], package, level)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "<frozen importlib._bootstrap>", line 1204, in _gcd_import
File "<frozen importlib._bootstrap>", line 1176, in _find_and_load
File "<frozen importlib._bootstrap>", line 1126, in _find_and_load_unlocked
File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
File "<frozen importlib._bootstrap>", line 1204, in _gcd_import
File "<frozen importlib._bootstrap>", line 1176, in _find_and_load
File "<frozen importlib._bootstrap>", line 1140, in _find_and_load_unlocked
ModuleNotFoundError: No module named 'setuptools'
Not sure where to start! setuptools is installed in that env, the package is a simple install (effectively identical to metcalcpy and metplotpy), and it successfully installs into a venv from the shell level.

I'm new to creating spack packages, and perhaps have done something incorrectly with metdataio.
does metdataio have py-setuptools listed as a build-time dependency?
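For context, a spack PythonPackage recipe declares setuptools as an explicit build-type dependency. A minimal sketch of what a py-metdataio package.py could look like -- URL and version taken from the install log above; the docstring, checksum, and python bound are assumptions, and this is not the actual recipe:

```python
# Sketch of a py-metdataio recipe -- hypothetical, for illustration only
from spack.package import *


class PyMetdataio(PythonPackage):
    """METdataio: METplus database and I/O tools (assumed description)."""

    homepage = "https://github.com/dtcenter/METdataio"
    url = "https://github.com/dtcenter/METdataio/archive/refs/tags/v2.1.1.tar.gz"

    version("2.1.1", sha256="...")  # checksum elided

    # Without a build-time setuptools dependency, pip's PEP 517 backend
    # cannot import setuptools, failing with "No module named 'setuptools'"
    # as in the traceback above.
    depends_on("py-setuptools", type="build")
```

If the dependency is already declared, the next place to look would be the build backend named in the project's pyproject.toml.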
@mkavulich -- a test env can be built for you; questions, please:
- is hera an acceptable host?
- is a full unified-dev env appropriate?

Thanks.
@mkavulich -- how would you like to proceed?
@rickgrubin-noaa Sorry for the delay in getting back to you. Hera is acceptable for testing. I am not sure about the differences between environments; is unified-dev just the standard build of spack-stack used for UFS purposes?
@mkavulich yes -- unified-dev is the stack against which UFS is built / tested.
In that case then unified-dev sounds correct. Thanks!
Building the unified-dev env revealed that py-pyhdf required a version upgrade (to 0.11.4) in order to accommodate a newer version of numpy (>= 1.25.0). With this update in place:
% module use /scratch1/NCEPDEV/nems/role.epic/spack-stack/spack-stack-1.8.0-dtc-met/envs/ue-intel-2021.5.0/install/modulefiles/Core
% module load stack-intel/2021.5.0
% module load stack-intel-oneapi-mpi/2021.5.1
% module load ufs-weather-model-env/1.0.0
% module load py-metcalcpy py-metdataio py-metplotpy
% module list
Currently Loaded Modules:
1) intel/2022.1.2 35) esmf/8.6.1 69) py-xlrd/2.0.1 103) proj/9.2.1
2) stack-intel/2021.5.0 36) fms/2024.02 70) py-xlsxwriter/3.1.7 104) py-pyproj/3.6.0
3) impi/2022.1.2 37) libjpeg/2.1.0 71) py-pandas/2.0.3 105) py-traitlets/5.9.0
4) stack-intel-oneapi-mpi/2021.5.1 38) jasper/2.0.32 72) libyaml/0.2.5 106) py-xarray/2023.7.0
5) glibc/2.28 39) libpng/1.6.37 73) py-pyyaml/6.0.1 107) py-metpy/1.0.1
6) bacio/2.4.1 40) w3emc/2.10.0 74) ufs-pyenv/1.0.0 108) py-lazy-loader/0.4
7) nghttp2/1.57.0 41) g2/3.5.1 75) ufs-weather-model-env/1.0.0 109) py-networkx/3.1
8) zlib-ng/2.1.6 42) g2tmpl/1.13.0 76) opencv/4.8.0 110) py-tifffile/2023.8.30
9) curl/8.7.1 43) ip/5.0.0 77) py-eofs/1.4.1 111) py-scikit-image/0.22.0
10) cmake/3.27.9 44) gftl/1.14.0 78) alsa-lib/1.2.3.2 112) py-joblib/1.2.0
11) git/2.18.0 45) gftl-shared/1.9.0 79) yasm/1.3.0 113) py-threadpoolctl/3.1.0
12) hdf5/1.14.3 46) fargparse/1.8.0 80) ffmpeg/6.1.1 114) py-scikit-learn/1.4.0
13) snappy/1.1.10 47) yafyaml/1.4.0 81) py-pillow/10.3.0 115) py-metcalcpy/2.1.0
14) zstd/1.5.2 48) pflogger/1.14.0 82) py-imageio/2.34.0 116) libgpg-error/1.49
15) c-blosc/1.21.5 49) mapl/2.46.3-esmf-8.6.1 83) py-pybind11/2.11.0 117) libgcrypt/1.10.3
16) netcdf-c/4.9.2 50) scotch/7.0.4 84) py-contourpy/1.0.7 118) libxslt/1.1.33
17) nccmp/1.9.0.1 51) openblas/0.3.24 85) py-cycler/0.11.0 119) py-lxml/4.9.2
18) netcdf-fortran/4.6.1 52) py-numpy/1.26.4 86) py-fonttools/4.39.4 120) py-pycparser/2.21
19) parallel-netcdf/1.12.3 53) py-cftime/1.0.3.4 87) py-kiwisolver/1.4.5 121) py-cffi/1.15.1
20) parallelio/2.6.2 54) py-setuptools/69.2.0 88) py-packaging/23.1 122) py-cryptography/38.0.1
21) tar/1.26 55) py-cython/0.29.36 89) py-pyparsing/3.1.2 123) py-pymysql/0.9.2
22) gettext/0.21.1 56) py-f90nml/1.4.3 90) qhull/2020.2 124) py-metdataio/2.1.1
23) libxcrypt/4.4.35 57) py-markupsafe/2.1.3 91) py-matplotlib/3.7.4 125) py-decorator/5.1.1
24) sqlite/3.43.2 58) py-jinja2/3.1.2 92) py-scipy/1.12.0 126) py-fastjsonschema/2.16.3
25) util-linux-uuid/2.38.1 59) py-netcdf4/1.5.8 93) py-imutils/0.5.4 127) py-attrs/21.4.0
26) python/3.11.7 60) py-defusedxml/0.7.1 94) py-kaleido/0.2.1 128) py-pyrsistent/0.19.3
27) python-venv/1.0 61) py-odfpy/1.4.1 95) py-pint/0.19.2 129) py-jsonschema/4.17.3
28) py-pip/23.1.2 62) py-et-xmlfile/1.0.1 96) py-platformdirs/3.10.0 130) py-jupyter-core/5.3.0
29) wget/1.14 63) py-openpyxl/3.1.2 97) py-certifi/2023.7.22 131) py-nbformat/5.8.0
30) base-env/1.0.0 64) py-six/1.16.0 98) py-charset-normalizer/3.3.0 132) py-plotly/2.2.0
31) cprnc/1.0.3 65) py-python-dateutil/2.8.2 99) py-idna/3.4 133) py-metplotpy/2.1.0
32) crtm-fix/2.4.0.1_emc 66) py-pytz/2023.3 100) py-urllib3/1.26.12
33) git-lfs/2.10.0 67) py-pyxlsb/1.0.10 101) py-requests/2.31.0
34) crtm/2.4.0.1 68) py-tzdata/2023.3 102) py-pooch/1.7.0
@mkavulich please give it a try.
@rickgrubin-noaa Sorry it's taking so long, I'm trying to track down a data set needed to run the right tests, I'll let you know as soon as I'm able to run the tests on the new build.
Package name
METcalcpy, METplotpy, METdataio
Package version/tag
v2.1
Build options
none
Installation timeframe
Please install on Hera for testing, and then include in the next release.
Other information
The DTC Agile Framework group is seeking to add plotting of verification statistics to the Short-Range Weather App workflow. This work relies on the three packages of the METplus Analysis Suite: METcalcpy, METplotpy, and METdataio. The latest versions of all these tools are v2.1, as they are released on the same cadence as all MET products.
We do not need these tools installed on WCOSS2, though as I understand it they have already been installed independently for use by EMC, so they should pass all necessary checks if that was desired eventually.
WCOSS2
WCOSS2: General questions
No response
WCOSS2: Installation and testing
No response
WCOSS2: Technical & security review list
WCOSS2: Additional comments
No response