ccaefch0523 opened 4 months ago
You don't need to have a rule excluding cp2k - instead it ought to be in the gpu environment.
Don't use my `myriad.yaml` directly; instead, you need to modify it to use `include_concrete` instead of:

```yaml
include:
- $HPC_SPACK_ROOT/spacksites/spack-env-templates/dev1/build/base.yaml
- $HPC_SPACK_ROOT/spacksites/spack-env-templates/dev1/build/gpu.yaml
- $HPC_SPACK_ROOT/spacksites/spack-env-templates/dev1/build/gpu-on-gpu.yaml
```
so that you make separate concretised environments for base, gpu-on-gpu and gpu, and then include those in `myriad.yaml`.

(There's a question about which of the environments should `include_concrete` which others, so it ends up more like chaining than what I did.)
For example, you concretise and build the base environment, then the gpu-on-gpu environment separately (and on a gpu node), then the gpu environment that includes the gpu-on-gpu environment, and finally the myriad environment that includes everything and which has anything extra that is only for myriad.
That's if we keep the split of how the environments are divided up the same as I initially did - if another split makes more sense, do that instead.
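As a sketch of what that looks like in the 0.22 syntax (the environment directory paths here are placeholders, not the real site layout), `myriad.yaml` would swap the `include:` list of yaml templates for an `include_concrete:` list of already-concretised environments:

```yaml
# Hypothetical myriad spack.yaml (paths are placeholders).
# include_concrete takes paths to environments that have already
# been concretised, unlike include, which takes config/yaml files.
spack:
  specs: []          # plus anything extra that is only for myriad
  include_concrete:
  - /path/to/environments/base
  - /path/to/environments/gpu-on-gpu
  - /path/to/environments/gpu
```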
CPU Gromacs should go in base if we're keeping the split the same since CPU Gromacs gets installed everywhere.
I got the most recent gromacs from spack develop: https://github.com/UCL-ARC/hpc-spack/issues/44#issuecomment-2181056707
so we can have:

```
gromacs@2024.2 +double
gromacs@2024.2 +cuda cuda_arch=80
gromacs@2023 +double +plumed
gromacs@2023 +cuda cuda_arch=80 +plumed
```
Testing `--include-concrete`
I created a `base` environment, added `perl` and `python`, concretised, and then created another environment `testincluded` that includes `base`:

```
spack env create --include-concrete base testincluded
```
```
spack env list
==> 3 environments
    base  base_env1  testincluded
```
Deleted `base_env1`:

```
spack env remove base_env1
==> Really remove environment base_env1? [y/N] y
==> Successfully removed environment 'base_env1'
spack env list
==> 2 environments
    base  testincluded
```
Added `nedit` and `zlib` to env `base` but did not concretise it:

```
spack -e base find -c
==> In environment base
==> 4 root specs
 -  nedit  [+] perl  [+] python  [+] zlib
==> Concretized roots
-- linux-rhel7-cascadelake / gcc@11.2.1 -------------------------
perl@5.38.0  zlib@1.3.1
-- linux-rhel7-cascadelake / gcc@12.3.0 -------------------------
python@3.11.6
==> Installed packages
-- linux-rhel7-cascadelake / gcc@11.2.1 -------------------------
berkeley-db@18.1.40  gcc-runtime@11.2.1  glibc@2.17     libxml2@2.10.3  pigz@2.8       tar@1.30
bzip2@1.0.8          gdbm@1.23           gmake@4.4.1    ncurses@6.4     pkgconf@1.9.5  xz@5.4.6
diffutils@3.9        gettext@0.22.5      libiconv@1.17  perl@5.38.0     readline@8.2   zlib@1.3.1
-- linux-rhel7-cascadelake / gcc@12.3.0 -------------------------
ca-certificates-mozilla@2023-05-30  libbsd@0.12.1  libxcrypt@4.4.35  sqlite@3.43.2
expat@2.6.2                         libffi@3.4.6   openssl@3.3.0     util-linux-uuid@2.38.1
gcc-runtime@12.3.0                  libmd@1.0.4    python@3.11.6
==> 29 installed packages
```
After concretising `base`, I still don't see `zlib` and `nedit` in the `testincluded` env, so the change is not reflected in the environment that includes it:
```
spack -e testincluded find -c
==> In environment testincluded
==> No root specs
==> Included specs
perl  python
==> Concretized roots
==> Installed packages
-- linux-rhel7-cascadelake / gcc@11.2.1 -------------------------
berkeley-db@18.1.40  gcc-runtime@11.2.1  glibc@2.17     libxml2@2.10.3  pigz@2.8       tar@1.30
bzip2@1.0.8          gdbm@1.23           gmake@4.4.1    ncurses@6.4     pkgconf@1.9.5  xz@5.4.6
diffutils@3.9        gettext@0.22.5      libiconv@1.17  perl@5.38.0     readline@8.2   zlib@1.3.1
-- linux-rhel7-cascadelake / gcc@12.3.0 -------------------------
ca-certificates-mozilla@2023-05-30  libbsd@0.12.1  libxcrypt@4.4.35  sqlite@3.43.2
expat@2.6.2                         libffi@3.4.6   openssl@3.3.0     util-linux-uuid@2.38.1
gcc-runtime@12.3.0                  libmd@1.0.4    python@3.11.6
==> 29 installed packages
```
I need to reconcretise the env `testincluded`:

```
spack -e testincluded concretize
```
```
(spacksite: fc-myriad-s0.22-test) [ccspapp@build01 hpc-spack]$ spack -e testincluded find -c
==> In environment testincluded
==> No root specs
==> Included specs
nedit  perl  python  zlib
==> Concretized roots
==> Installed packages
-- linux-rhel7-cascadelake / gcc@11.2.1 -------------------------
autoconf@2.69        flex@2.6.3         kbproto@1.0.7         libxau@1.0.11       perl@5.38.0         xproto@7.0.31
automake@1.16.5      fontsproto@2.1.3   libfontenc@1.1.8      libxcb@1.16         pigz@2.8            xtrans@1.5.0
bdftopcf@1.1         gcc-runtime@11.2.1 libice@1.1.1          libxdmcp@1.1.4      pkgconf@1.9.5       xz@5.4.6
berkeley-db@18.1.40  gdbm@1.23          libiconv@1.17         libxfont@1.5.4      readline@8.2        zlib@1.3.1
bzip2@1.0.8          gettext@0.22.5     libpthread-stubs@0.5  libxml2@2.10.3      renderproto@0.11.1
compositeproto@0.4.2 glibc@2.17         libsigsegv@2.14       m4@1.4.19           tar@1.30
diffutils@3.9        gmake@4.4.1        libsm@1.2.4           mkfontdir@1.0.7     xbitmaps@1.1.3
findutils@4.9.0      gperf@3.1          libtool@2.4.7         mkfontscale@1.2.3   xcb-proto@1.16.0
fixesproto@5.0       inputproto@2.3.2   libx11@1.8.7          ncurses@6.4         xextproto@7.3.0
-- linux-rhel7-cascadelake / gcc@12.3.0 -------------------------
bison@3.8.2                         gcc-runtime@12.3.0   libxcrypt@4.4.35  sqlite@3.43.2
ca-certificates-mozilla@2023-05-30  libbsd@0.12.1        nasm@2.15.05      util-linux-uuid@2.38.1
cmake@3.27.9                        libffi@3.4.6         nghttp2@1.57.0    util-macros@1.19.3
curl@8.7.1                          libjpeg-turbo@3.0.0  openssl@3.3.0
expat@2.6.2                         libmd@1.0.4          python@3.11.6
freetype@2.13.2                     libpng@1.6.39        python-venv@1.0
==> 70 installed packages
```
I still need to test removing a spec from `base`.
Testing creating an independent environment via an environment file `../spack.yaml` placed in a directory:

```
spack env create -d new-code ../spack.yaml
```
```
==> Updating view at /lustre/scratch/scratch/ccspapp/spack/0.22/hpc-spack/fctestenv/new-code/.spack-env/view
==> Created independent environment in: /lustre/scratch/scratch/ccspapp/spack/0.22/hpc-spack/fctestenv/new-code
==> Activate with: spack env activate new-code
(spacksite: fc-myriad-s0.22-test) [ccspapp@build01 fctestenv]$ spack env list
==> 3 environments
    base  base_env1  testincluded
(spacksite: fc-myriad-s0.22-test) [ccspapp@build01 fctestenv]$ ls new-code
```

The new env exists in the directory `new-code`.
```
(spacksite: fc-myriad-s0.22-test) [ccspapp@build01 fctestenv]$ spack -e new-code find
==> In environment /lustre/scratch/scratch/ccspapp/spack/0.22/hpc-spack/fctestenv/new-code
==> No root specs
==> Included specs
perl  python
==> Installed packages
-- linux-rhel7-cascadelake / gcc@11.2.1 -------------------------
berkeley-db@18.1.40  gcc-runtime@11.2.1  glibc@2.17     libxml2@2.10.3  pigz@2.8       tar@1.30
bzip2@1.0.8          gdbm@1.23           gmake@4.4.1    ncurses@6.4     pkgconf@1.9.5  xz@5.4.6
diffutils@3.9        gettext@0.22.5      libiconv@1.17  perl@5.38.0     readline@8.2   zlib@1.3.1
-- linux-rhel7-cascadelake / gcc@12.3.0 -------------------------
ca-certificates-mozilla@2023-05-30  libbsd@0.12.1  libxcrypt@4.4.35  sqlite@3.43.2
expat@2.6.2                         libffi@3.4.6   openssl@3.3.0     util-linux-uuid@2.38.1
gcc-runtime@12.3.0                  libmd@1.0.4    python@3.11.6
==> 29 installed packages
```
Our existing specs are split into four parts (myriad, base, gpu, gpu-on-gpu). The process for doing this and the yaml files for the four parts need to be created, and the yaml files added to the repo.
Q: which environments should include which of the others to completely create this structure?
Once this is working, then we can add the final packages to whichever of those files is appropriate.
Creating an env for GPU Gromacs/Plumed (versions listed below) that includes `base` (`spacksites/spack-env-templates/dev1/build/myriad.yaml` needs editing):

```
gromacs@2024.2 +cuda cuda_arch=80 +plumed
gromacs@2023 +cuda cuda_arch=80 +plumed
```
```
spack env create --include-concrete base gromacsgpu
==> Updating view at /lustre/shared/ucl/apps/spack/0.22/fc-myriad-s0.22-test/spack/var/spack/environments/gromacsgpu/.spack-env/view
==> Created environment gromacsgpu in: /lustre/shared/ucl/apps/spack/0.22/fc-myriad-s0.22-test/spack/var/spack/environments/gromacsgpu
==> Activate with: spack env activate gromacsgpu
```
Adding Gromacs:

```
spack -e gromacsgpu add gromacs@2023 +cuda+plumed cuda_arch=80
```
```
spack -e gromacsgpu find -c
==> In environment gromacsgpu
==> 1 root specs
 -  gromacs@2023 +cuda+plumed cuda_arch=80
==> Included specs
-- no arch / gcc@12.3.0 -----------------------------------------
beast2@2.7.4%gcc@12.3.0     gzip@1.13%gcc@12.3.0                      netcdf-c@4.9.2%gcc@12.3.0 ~mpi                                          python@3.11.6%gcc@12.3.0
bedtools2@2.31.0%gcc@12.3.0 hdf5@1.14.3%gcc@12.3.0 +cxx+fortran+hl~mpi  netcdf-fortran@4.6.1%gcc@12.3.0                                       samtools@1.17%gcc@12.3.0
bwa@0.7.17%gcc@12.3.0       hdf5@1.14.3%gcc@12.3.0 +fortran+hl+mpi      openmpi@4.1.6%gcc@12.3.0 fabrics=cma,ofi,psm2,ucx schedulers=sge      vcftools@0.1.16%gcc@12.3.0
gatk@4.4.0.0%gcc@12.3.0     htslib@1.17%gcc@12.3.0                    picard@2.26.2%gcc@12.3.0
==> Concretized roots
-- linux-rhel7-cascadelake / gcc@11.2.1 -------------------------
gromacs@2023
==> Installed packages
-- linux-rhel7-cascadelake / gcc@11.2.1 -------------------------
autoconf@2.69    berkeley-db@18.1.40  diffutils@3.9    gcc-runtime@11.2.1  gettext@0.22.5  gmake@4.4.1    libsigsegv@2.14  libxml2@2.10.3  ncurses@6.4  pigz@2.8       readline@8.2  xz@5.4.6
automake@1.16.5  bzip2@1.0.8          findutils@4.9.0  gdbm@1.23           glibc@2.17      libiconv@1.17  libtool@2.4.7    m4@1.4.19       perl@5.38.0  pkgconf@1.9.5  tar@1.30      zlib@1.3.1
-- linux-rhel7-cascadelake / gcc@12.3.0 -------------------------
bison@3.8.2                         cmake@3.27.9  expat@2.6.2         libbsd@0.12.1  libmd@1.0.4       nghttp2@1.57.0  python@3.11.6    sqlite@3.43.2  util-macros@1.19.3
ca-certificates-mozilla@2023-05-30  curl@8.7.1    gcc-runtime@12.3.0  libffi@3.4.6   libxcrypt@4.4.35  openssl@3.3.0   python-venv@1.0  util-linux-uuid@2.38.1
==> 41 installed packages
```
I have altered these yaml files to be the `include_concrete` versions:

```
modified:   spacksites/spack-env-templates/dev1/build/base.yaml
modified:   spacksites/spack-env-templates/dev1/build/gpu-on-gpu.yaml
modified:   spacksites/spack-env-templates/dev1/build/gpu.yaml
modified:   spacksites/spack-env-templates/dev1/build/myriad.yaml
```
Right now I have myriad.yaml including all of the other envs, but gpu including base and gpu-on-gpu including gpu. Need to test how that works.
I have also added the casteps and cpu gromacses to base.yaml and the gpu gromacses to gpu.yaml. (Don't know if that needs to be gpu-on-gpu.yaml).
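The chaining currently set up could be sketched like this (the environment directory paths are placeholders, and which env should include which is still the open question above):

```yaml
# Sketch of the current chain (placeholder paths).
# gpu.yaml includes the concretised base environment...
spack:
  include_concrete:
  - /path/to/environments/base
# ...gpu-on-gpu.yaml would then include the concretised gpu
# environment, and myriad.yaml includes all of the others:
#   include_concrete:
#   - /path/to/environments/gpu
```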
Starting with (in the existing `hk-initial-stack` site):

```
spack env create base /home/ccspapp/Scratch/spack/0.22/hpc-spack/spacksites/spack-env-templates/dev1/build/base.yaml
==> Created environment base in: /lustre/shared/ucl/apps/spack/0.22/hk-initial-stack/spack/var/spack/environments/base
==> Activate with: spack env activate base
```
```
spack -e base concretize -f
...
spack -e base install
```
castep and gromacs aren't in the build cache yet so there are some things to build, starting with gsl.
```
==> Installing gsl-2.7.1-efih7w3hrez4pzjrvad5m3qsxks5suk4 [15/98]
==> No binary for gsl-2.7.1-efih7w3hrez4pzjrvad5m3qsxks5suk4 found: installing from source
==> Fetching https://mirror.spack.io/_source-cache/archive/dc/dcb0fbd43048832b757ff9942691a8dd70026d5da0ff85601e52687f6deeb34b.tar.gz
==> No patches needed for gsl
==> gsl: Executing phase: 'autoreconf'
==> gsl: Executing phase: 'configure'
==> gsl: Executing phase: 'build'
==> gsl: Executing phase: 'install'
==> gsl: Successfully installed gsl-2.7.1-efih7w3hrez4pzjrvad5m3qsxks5suk4
  Stage: 3.84s.  Autoreconf: 0.00s.  Configure: 23.99s.  Build: 1m 47.60s.  Install: 4.36s.  Post-install: 1.65s.  Total: 2m 22.14s
```
GROMACS 2024.2 on its own and Plumed 2.9.0 were fine.
```
==> Installing gromacs-2024.2-twuqinqf5khu2f33546sglpff5d5n6lj [93/98]
==> No binary for gromacs-2024.2-twuqinqf5khu2f33546sglpff5d5n6lj found: installing from source
==> Fetching https://mirror.spack.io/_source-cache/archive/80/802a7e335f2e895770f57b159e4ec368ebb0ff2ce6daccf706c6e8025c36852b.tar.gz
==> Ran patch() for gromacs
==> gromacs: Executing phase: 'cmake'
==> gromacs: Executing phase: 'build'
==> gromacs: Executing phase: 'install'
==> gromacs: Successfully installed gromacs-2024.2-twuqinqf5khu2f33546sglpff5d5n6lj
  Stage: 13.22s.  Cmake: 1m 10.63s.  Build: 5m 8.55s.  Install: 6.87s.  Post-install: 3.51s.  Total: 6m 59.72s
==> Installing plumed-2.9.0-qfmsmjb6zdox6u5unwdkosxgn3bzqoul [95/98]
==> No binary for plumed-2.9.0-qfmsmjb6zdox6u5unwdkosxgn3bzqoul found: installing from source
==> Fetching https://mirror.spack.io/_source-cache/archive/61/612d2387416b5f82dd8545709921440370e144fd46cef633654cf0ee43bac5f8.tar.gz
==> Ran patch() for plumed
==> plumed: Executing phase: 'autoreconf'
==> plumed: Executing phase: 'configure'
==> plumed: Executing phase: 'build'
==> plumed: Executing phase: 'install'
==> plumed: Successfully installed plumed-2.9.0-qfmsmjb6zdox6u5unwdkosxgn3bzqoul
  Stage: 14.17s.  Autoreconf: 7.77s.  Configure: 56.21s.  Build: 17m 59.95s.  Install: 45.18s.  Post-install: 1.76s.  Total: 20m 7.35s
```
Need to check on the patches for GROMACS 2023 (looks minor):
```
==> Installing gromacs-2023-i3nsau23qwhkg3c44ehsor3izfiacdqf [97/98]
==> No binary for gromacs-2023-i3nsau23qwhkg3c44ehsor3izfiacdqf found: installing from source
==> Fetching https://mirror.spack.io/_source-cache/archive/ac/ac92c6da72fbbcca414fd8a8d979e56ecf17c4c1cdabed2da5cfb4e7277b7ba8.tar.gz
NOTE: shell only version, useful when plumed is cross compiled
NOTE: shell only version, useful when plumed is cross compiled
PLUMED patching tool
MD engine: gromacs-2023
PLUMED location: /lustre/shared/ucl/apps/spack/0.22/hk-initial-stack/spack/opt/spack/__spack_path_placeholder__/__spack_path_placeholder__/__spack_path_placeholder__/__spack_path_placeholder__/__spack_path_placeholder__/__spack_path_placeholder__/__spack_path_placeholder__/linux-rhel7-cascadelake/gcc-12.3.0/plumed-2.9.0-qfmsmjb6zdox6u5unwdkosxgn3bzqoul/lib/plumed
diff file: /lustre/shared/ucl/apps/spack/0.22/hk-initial-stack/spack/opt/spack/__spack_path_placeholder__/__spack_path_placeholder__/__spack_path_placeholder__/__spack_path_placeholder__/__spack_path_placeholder__/__spack_path_placeholder__/__spack_path_placeholder__/linux-rhel7-cascadelake/gcc-12.3.0/plumed-2.9.0-qfmsmjb6zdox6u5unwdkosxgn3bzqoul/lib/plumed/patches/gromacs-2023.diff
sourcing config file: /lustre/shared/ucl/apps/spack/0.22/hk-initial-stack/spack/opt/spack/__spack_path_placeholder__/__spack_path_placeholder__/__spack_path_placeholder__/__spack_path_placeholder__/__spack_path_placeholder__/__spack_path_placeholder__/__spack_path_placeholder__/linux-rhel7-cascadelake/gcc-12.3.0/plumed-2.9.0-qfmsmjb6zdox6u5unwdkosxgn3bzqoul/lib/plumed/patches/gromacs-2023.config
Executing plumed_before_patch function
PLUMED can be incorporated into gromacs using the standard patching procedure.
Patching must be done in the gromacs root directory _before_ the cmake command is invoked.
On clusters you may want to patch gromacs using the static version of plumed, in this case
building gromacs can result in multiple errors. One possible solution is to configure gromacs
with these additional options:
cmake -DBUILD_SHARED_LIBS=OFF -DGMX_PREFER_STATIC_LIBS=ON
To enable PLUMED in a gromacs simulation one should use
mdrun with an extra -plumed flag. The flag can be used to
specify the name of the PLUMED input file, e.g.:
gmx mdrun -plumed plumed.dat
For more information on gromacs you should visit http://www.gromacs.org
Linking Plumed.h, Plumed.inc, and Plumed.cmake (shared mode)
Patching with on-the-fly diff from stored originals
patching file ./cmake/gmxVersionInfo.cmake
Hunk #1 FAILED at 257.
1 out of 1 hunk FAILED -- saving rejects to file ./cmake/gmxVersionInfo.cmake.rej
patching file ./src/gromacs/CMakeLists.txt
patching file ./src/gromacs/mdlib/expanded.cpp
patching file ./src/gromacs/mdlib/expanded.h
patching file ./src/gromacs/mdlib/sim_util.cpp
patching file ./src/gromacs/mdrun/legacymdrunoptions.cpp
patching file ./src/gromacs/mdrun/legacymdrunoptions.h
patching file ./src/gromacs/mdrun/md.cpp
patching file ./src/gromacs/mdrun/minimize.cpp
patching file ./src/gromacs/mdrun/replicaexchange.cpp
patching file ./src/gromacs/mdrun/replicaexchange.h
patching file ./src/gromacs/mdrun/rerun.cpp
patching file ./src/gromacs/mdrun/runner.cpp
patching file ./src/gromacs/modularsimulator/expandedensembleelement.cpp
patching file ./src/gromacs/taskassignment/decidegpuusage.cpp
patching file ./src/gromacs/taskassignment/include/gromacs/taskassignment/decidegpuusage.h
PLUMED is compiled with MPI support so you can configure gromacs-2023 with MPI
==> Ran patch() for gromacs
==> gromacs: Executing phase: 'cmake'
==> gromacs: Executing phase: 'build'
==> gromacs: Executing phase: 'install'
==> gromacs: Successfully installed gromacs-2023-i3nsau23qwhkg3c44ehsor3izfiacdqf
  Stage: 12.95s.  Cmake: 49.70s.  Build: 4m 56.29s.  Install: 6.54s.  Post-install: 1.67s.  Total: 6m 8.45s
```
Plumed 2.9.0: "Patch for GROMACS 2023 (preliminary, in particular for replica-exchange, expanded ensemble, hrex features)." Plumed 2.9.2: "Patch for GROMACS 2023 updated to the latest version"
=> we want the 2.9.2 plumed for GROMACS 2023.x instead.
The actual failure was only the version patching in cmake as in https://github.com/plumed/plumed2/issues/960#issuecomment-1625413258 but other functionality may not be there.
Have added a `repos/ucl` package for the most recent plumed, updated the gromacs package in there, and added knowledge of plumed 2.9.2 to it.
Now want these:
- gromacs@2024.3 +double
- gromacs@2023.5 +double +plumed
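If the concretiser does not pick up plumed 2.9.2 from the `repos/ucl` package on its own, the dependency could be pinned explicitly with Spack's `^` syntax (a sketch, not tested here; the spec variants are the ones listed above):

```yaml
# Sketch: force the newer plumed for the 2023.x build (assumption:
# plumed@2.9.2 is visible via the repos/ucl package).
specs:
- gromacs@2024.3 +double
- gromacs@2023.5 +double +plumed ^plumed@2.9.2
```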
OK, we now have these in my `base` packages:

```
==> plumed: Successfully installed plumed-2.9.2-gf2kz5qj4deszk3dtpqvsaaq7uio3a45
  Stage: 1m 54.07s.  Autoreconf: 13.22s.  Configure: 4m 25.75s.  Build: 20m 29.53s.  Install: 1m 19.79s.  Post-install: 5.50s.  Total: 28m 53.62s
==> gromacs: Successfully installed gromacs-2024.3-ngcyv5lxlqtz4dkmsbwhsjd74qu3xjia
  Stage: 3m 19.39s.  Cmake: 2m 18.77s.  Build: 5m 32.52s.  Install: 36.05s.  Post-install: 8.22s.  Total: 11m 59.54s
==> gromacs: Successfully installed gromacs-2023.5-hh7utvxt52oak37vwcbyayu3sdlhv2bf
  Stage: 1m 23.56s.  Cmake: 1m 59.95s.  Build: 5m 21.73s.  Install: 35.91s.  Post-install: 3.51s.  Total: 9m 27.45s
```
No patching issues.
LAMMPS:

```
lammps@20240829 +mpi +python +amoeba +asphere +bocs +body +bpm +brownian +cg-dna +cg-spica +class2 +colloid +colvars +compress +coreshell +dielectric +diffraction +dipole +dpd-basic +dpd-meso +dpd-react +dpd-smooth +drude +eff +electrode +extra-compute +extra-dump +extra-fix +extra-molecule +extra-pair +fep +granular +interlayer +kspace +lepton +machdyn +manybody +mc +meam +mesont +misc +ml-iap +ml-pod +ml-snap +ml-uf3 +mofff +molecule +openmp +opt +orient +peri +phonon +plugin +poems +qeq +reaction +reaxff +replica +rigid +rheo +shock +sph +spin +srd +tally +uef +voronoi +yaff
```
Added `+ml-uf3` and `+rheo` since they are available in the 2024 version.

Check if we want `+kokkos` for the CUDA version or `+gpu` (or both as two separate installs, since we have had requests for how to build lammps+kokkos):
```python
if "~kokkos" in spec:
    # LAMMPS can be build with the GPU package OR the KOKKOS package
    # Using both in a single build is discouraged.
    # +cuda only implies that one of the two is used
    # by default it will use the GPU package if kokkos wasn't enabled
```
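Going by that package logic, the two-separate-installs option could be sketched as follows (the long variant list above would be appended to each; `cuda_arch=80` is copied from the Gromacs GPU builds here and is my assumption for lammps):

```yaml
# Sketch: two CUDA lammps installs, one per acceleration package.
# With +cuda, '~kokkos' selects the GPU package and '+kokkos' KOKKOS.
specs:
- lammps@20240829 +cuda cuda_arch=80 ~kokkos   # GPU-package build
- lammps@20240829 +cuda cuda_arch=80 +kokkos   # KOKKOS build
```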
NAMD:

Default is the TCL interface rather than Python, and that is how we currently build.

```
namd@2.14
namd@3.0
```

CUDA:

```
# myriad
namd@2.14 +cuda cuda_arch="70,80" +single_node_gpu
```
CPU lammps built, CPU NAMD 2.14 built, but charmpp 7.0.0 for NAMD 3.0 failed:
```
==> Installing charmpp-7.0.0-ajpis4lxw5ielvvr22qcbakeby4va4xt [70/100]
==> No binary for charmpp-7.0.0-ajpis4lxw5ielvvr22qcbakeby4va4xt found: installing from source
==> Using cached archive: /lustre/shared/ucl/apps/spack/0.22/hk-initial-stack/spack/var/spack/cache/_source-cache/archive/9c/9c247b421bb157bdf9bc0ced3e25738c7a1dc1f7ec57b7943a7faf97f7e4fb2e.tar.gz
==> No patches needed for charmpp
==> charmpp: Executing phase: 'install'
==> Error: ProcessError: Command exited with status 2:
    './build' 'LIBS' 'netlrts-linux-x86_64' 'gcc' 'gfortran' '-j6' '--destination=/lustre/shared/ucl/apps/spack/0.22/hk-initial-stack/spack/opt/spack/__spack_path_placeholder__/__spack_path_placeholder__/__spack_path_placeholder__/__spack_path_placeholder__/__spack_path_placeholder__/__spack_path_placeholder__/__spack_path_placeholder__/linux-rhel7-cascadelake/gcc-12.3.0/charmpp-7.0.0-ajpis4lxw5ielvvr22qcbakeby4va4xt' 'smp' '--build-shared' '--with-production'

7 errors found in build log:
     1528    [ 96%] Building CXX object src/libs/ck-libs/ampi/CMakeFiles/moduleampif.dir/ddt.C.o
     1529    [ 96%] Building CXX object src/libs/ck-libs/ampi/CMakeFiles/moduleampif.dir/mpich-alltoall.C.o
     1530    [ 96%] Building CXX object src/libs/ck-libs/ampi/CMakeFiles/moduleampif.dir/ampi_mpix.C.o
     1531    [ 96%] Building CXX object src/libs/ck-libs/ampi/CMakeFiles/moduleampif.dir/ampi_noimpl.C.o
     1532    [ 96%] Linking CXX static library ../../../../lib/libmoduleampi.a
     1533    [ 96%] Linking CXX static library ../../../../lib/libmoduleampif.a
  >> 1534    Fatal Error by charmc in directory /lustre/shared/ucl/apps/spack/0.22/hk-initial-stack/spack/opt/spack/__spack_path_placeholder__/__spack_path_placeholder__/__spack_path_placeholder__/__spack_path_placeholder__/__spack_path_placeholder__/__spack_path_placeholder__/__spack_path_placeholder__/linux-rhel7-cascadelake/gcc-12.3.0/charmpp-7.0.0-ajpis4lxw5ielvvr22qcbakeby4va4xt/src/libs/ck-libs/ampi
     1535    Trying to link, but no object files or library archives were specified
     1536    charmc exiting...
  >> 1537    make[2]: *** [src/libs/ck-libs/ampi/CMakeFiles/moduleampi.dir/build.make:547: lib/libmoduleampi.a] Error 1
  >> 1538    make[1]: *** [CMakeFiles/Makefile2:3470: src/libs/ck-libs/ampi/CMakeFiles/moduleampi.dir/all] Error 2
     1539    make[1]: *** Waiting for unfinished jobs....
  >> 1540    Fatal Error by charmc in directory /lustre/shared/ucl/apps/spack/0.22/hk-initial-stack/spack/opt/spack/__spack_path_placeholder__/__spack_path_placeholder__/__spack_path_placeholder__/__spack_path_placeholder__/__spack_path_placeholder__/__spack_path_placeholder__/__spack_path_placeholder__/linux-rhel7-cascadelake/gcc-12.3.0/charmpp-7.0.0-ajpis4lxw5ielvvr22qcbakeby4va4xt/src/libs/ck-libs/ampi
     1541    Trying to link, but no object files or library archives were specified
     1542    charmc exiting...
  >> 1543    make[2]: *** [src/libs/ck-libs/ampi/CMakeFiles/moduleampif.dir/build.make:579: lib/libmoduleampif.a] Error 1
  >> 1544    make[1]: *** [CMakeFiles/Makefile2:3501: src/libs/ck-libs/ampi/CMakeFiles/moduleampif.dir/all] Error 2
  >> 1545    make: *** [Makefile:136: all] Error 2

See build log for details:
  /lustre/shared/ucl/apps/spack/0.22/hk-initial-stack/build_stage/ccspapp/spack-stage-charmpp-7.0.0-ajpis4lxw5ielvvr22qcbakeby4va4xt/spack-build-out.txt

==> Warning: Skipping build of namd-3.0-sthub4raczji5xmf63ngdux3lzrmtpea since charmpp-7.0.0-ajpis4lxw5ielvvr22qcbakeby4va4xt failed
```
I created a new spacksite on Myriad using Spack 0.22 on the build01 node. GPG key trust is done, so I can use the buildcache, but then I got an empty listing from the buildcache, so I tried updating the index without the `-d` flag and it works!

Experimenting with the `myriad.yaml` generated by Heather (see https://github.com/UCL-ARC/hpc-spack/issues/56). I activated my env `myproject`, added Gromacs to the spec in it, and then concretised. Gromacs is added to the root specs.

Do I need to define a rule to exclude installing/concretising cp2k in my env?