Closed by mkandes 11 months ago
We need to begin by updating the currently available GROMACS Spack package to include the latest versions of GROMACS.
[mkandes@login01 ~]$ spack --version
0.17.3
[mkandes@login01 ~]$ which spack
alias spack='spack --config-scope /home/mkandes/.spack/0.17.3/gpu/b/'
spack ()
{
: this is a shell function from: /cm/shared/apps/spack/0.17.3/gpu/b/share/spack/setup-env.sh;
: the real spack script is here: /cm/shared/apps/spack/0.17.3/gpu/b/bin/spack;
_spack_shell_wrapper "$@";
return $?
}
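The `spack` command here is a shell function that forwards to the real script, with an alias layering a site config scope on top. The same wrapper pattern can be sketched generically (the `tool` and `_real_tool` names below are hypothetical stand-ins, not part of the Spack setup):

```shell
# Generic sketch of the wrapper pattern used by setup-env.sh: a
# function that forwards all arguments to the real command and
# preserves its exit status.
_real_tool() { echo "ran with: $*"; }

tool() {
    _real_tool "$@"
    return $?
}

tool --version   # → ran with: --version
```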
[mkandes@login01 ~]$ spack info gromacs
CMakePackage: gromacs
Description:
GROMACS (GROningen MAchine for Chemical Simulations) is a molecular
dynamics package primarily designed for simulations of proteins, lipids
and nucleic acids. It was originally developed in the Biophysical
Chemistry department of University of Groningen, and is now maintained
by contributors in universities and research centers across the world.
GROMACS is one of the fastest and most popular software packages
available and can run on CPUs as well as GPUs. It is free, open source
released under the GNU General Public License. Starting from version
4.6, GROMACS is released under the GNU Lesser General Public License.
Homepage: http://www.gromacs.org
Maintainers: @junghans @marvinbernhardt
Externally Detectable:
False
Tags:
None
Preferred version:
2021.3 https://ftp.gromacs.org/gromacs/gromacs-2021.3.tar.gz
Safe versions:
master [git] https://github.com/gromacs/gromacs.git on branch master
2021.3 https://ftp.gromacs.org/gromacs/gromacs-2021.3.tar.gz
2021.2 https://ftp.gromacs.org/gromacs/gromacs-2021.2.tar.gz
2021.1 https://ftp.gromacs.org/gromacs/gromacs-2021.1.tar.gz
2021 https://ftp.gromacs.org/gromacs/gromacs-2021.tar.gz
2020.6 https://ftp.gromacs.org/gromacs/gromacs-2020.6.tar.gz
2020.5 https://ftp.gromacs.org/gromacs/gromacs-2020.5.tar.gz
2020.4 https://ftp.gromacs.org/gromacs/gromacs-2020.4.tar.gz
2020.3 https://ftp.gromacs.org/gromacs/gromacs-2020.3.tar.gz
2020.2 https://ftp.gromacs.org/gromacs/gromacs-2020.2.tar.gz
2020.1 https://ftp.gromacs.org/gromacs/gromacs-2020.1.tar.gz
2020 https://ftp.gromacs.org/gromacs/gromacs-2020.tar.gz
2019.6 https://ftp.gromacs.org/gromacs/gromacs-2019.6.tar.gz
2019.5 https://ftp.gromacs.org/gromacs/gromacs-2019.5.tar.gz
2019.4 https://ftp.gromacs.org/gromacs/gromacs-2019.4.tar.gz
2019.3 https://ftp.gromacs.org/gromacs/gromacs-2019.3.tar.gz
2019.2 https://ftp.gromacs.org/gromacs/gromacs-2019.2.tar.gz
2019.1 https://ftp.gromacs.org/gromacs/gromacs-2019.1.tar.gz
2019 https://ftp.gromacs.org/gromacs/gromacs-2019.tar.gz
2018.8 https://ftp.gromacs.org/gromacs/gromacs-2018.8.tar.gz
2018.5 https://ftp.gromacs.org/gromacs/gromacs-2018.5.tar.gz
2018.4 https://ftp.gromacs.org/gromacs/gromacs-2018.4.tar.gz
2018.3 https://ftp.gromacs.org/gromacs/gromacs-2018.3.tar.gz
2018.2 https://ftp.gromacs.org/gromacs/gromacs-2018.2.tar.gz
2018.1 https://ftp.gromacs.org/gromacs/gromacs-2018.1.tar.gz
2018 https://ftp.gromacs.org/gromacs/gromacs-2018.tar.gz
2016.6 https://ftp.gromacs.org/gromacs/gromacs-2016.6.tar.gz
2016.5 https://ftp.gromacs.org/gromacs/gromacs-2016.5.tar.gz
2016.4 https://ftp.gromacs.org/gromacs/gromacs-2016.4.tar.gz
2016.3 https://ftp.gromacs.org/gromacs/gromacs-2016.3.tar.gz
5.1.5 https://ftp.gromacs.org/gromacs/gromacs-5.1.5.tar.gz
5.1.4 https://ftp.gromacs.org/gromacs/gromacs-5.1.4.tar.gz
5.1.2 https://ftp.gromacs.org/gromacs/gromacs-5.1.2.tar.gz
4.6.7 https://ftp.gromacs.org/gromacs/gromacs-4.6.7.tar.gz
4.5.5 https://ftp.gromacs.org/gromacs/gromacs-4.5.5.tar.gz
Deprecated versions:
None
Variants:
Name [Default] When Allowed values Description
============================== ==== ==================== ================
blas [off] -- on, off Enables an
external BLAS
library
build_type [RelWithDebInfo] [, ] Debug, Release, The build type
RelWithDebInfo, to build
MinSizeRel,
Reference,
RelWithAssert,
Profile
cuda [off] -- on, off Enable CUDA
support
cycle_subcounters [off] -- on, off Enables cycle
subcounters
double [off] -- on, off Produces a
double precision
version of the
executables
hwloc [on] -- on, off Use the hwloc
portable
hardware
locality library
ipo [off] -- on, off CMake
interprocedural
optimization
lapack [off] -- on, off Enables an
external LAPACK
library
mdrun_only [off] -- on, off Enables the
build of a cut-
down version of
libgromacs
and/or the mdrun
program
mpi [on] -- on, off Activate MPI
support (disable
for Thread-MPI
support)
nosuffix [off] -- on, off Disable default
suffixes
opencl [off] -- on, off Enable OpenCL
support
openmp [on] -- on, off Enables OpenMP
at configure
time
plumed [off] -- on, off Enable PLUMED
support
relaxed_double_precision [off] -- on, off GMX_RELAXED_DOUB
LE_PRECISION,
use only for
Fujitsu PRIMEHPC
shared [on] -- on, off Enables the
build of shared
libraries
sycl [off] -- on, off Enable SYCL
support
Installation Phases:
cmake build install
Build Dependencies:
blas cmake cuda fftw-api hwloc lapack mpi plumed sycl
Link Dependencies:
blas cuda fftw-api hwloc lapack mpi plumed sycl
Run Dependencies:
None
Virtual Packages:
None
[mkandes@login01 ~]$
The current Spack package available in the sdsc-0.17.3 deployment branch is from the builtin package repo.
[mkandes@login01 ~]$ spack find -lvdN gromacs
==> 1 installed package
-- linux-rocky8-cascadelake / gcc@10.2.0 ------------------------
tdngw6l builtin.gromacs@2020.4+blas+cuda~cycle_subcounters~double+hwloc~ipo+lapack~mdrun_only+mpi~nosuffix~opencl+openmp+plumed~relaxed_double_precision+shared~sycl build_type=RelWithDebInfo
blza2ps builtin.cuda@11.2.2~dev
2q4yola builtin.libxml2@2.9.12~python
5a3xt3s builtin.libiconv@1.16 libs=shared,static
5xho2dj builtin.xz@5.2.5~pic libs=shared,static
2c5fvip builtin.zlib@1.2.11+optimize+pic+shared
7ahyh5v builtin.fftw@3.3.10~mpi~openmp~pfft_patches precision=double,float
okiyq35 builtin.hwloc@2.6.0~cairo+cuda~gl~libudev+libxml2~netloc~nvml~opencl+pci~rocm+shared
x3u2rw4 builtin.libpciaccess@0.16
5jrknc3 builtin.ncurses@6.2~symlinks+termlib abi=none
lsmegf6 builtin.openblas@0.3.18~bignuma~consistent_fpcsr~ilp64+locking+pic+shared threads=none
gzzscfu sdsc.openmpi@4.1.3~atomics+cuda~cxx~cxx_exceptions~gpfs~internal-hwloc~java+legacylaunchers+lustre~memchecker+pmi+pmix+romio~rsh~singularity+static+vt+wrapper-rpath cuda_arch=70,80 fabrics=ucx schedulers=slurm
ne2joyw builtin.libevent@2.1.8~openssl
73aggpy builtin.lustre@2.15.2
34rinp4 builtin.numactl@2.0.14 patches=4e1d78cbbb85de625bad28705e748856033eaafab92a66dffd383a3d7e00cc94,62fc8a8bf7665a60e8f4c93ebbd535647cebf74198f7afafec4c085a8825c006,ff37630df599cfabf0740518b91ec8daaf18e8f288b19adaae5364dc1f6b2296
43uenl2 builtin.pmix@3.2.1~docs+pmi_backwards_compatibility~restful
nflzb3l builtin.slurm@21.08.8~gtk~hdf5~hwloc~mariadb~pmix+readline~restd sysconfdir=PREFIX/etc
msro2p7 builtin.ucx@1.10.1~assertions~cm+cma+cuda+dc~debug+dm+gdrcopy+ib-hw-tm~java~knem~logging+mlx5-dv+optimizations~parameter_checking+pic+rc~rocm+thread_multiple+ud~xpmem cuda_arch=70,80
nefrbxp builtin.gdrcopy@2.2
k7hbyv2 builtin.rdma-core@43.0~ipo build_type=RelWithDebInfo
fxev5vk builtin.plumed@2.6.3+gsl+mpi+shared arrayfire=none optional_modules=all
aji2yx5 builtin.gsl@2.7~external-cblas
[mkandes@login01 ~]$
Download the builtin package files into a local test repo and add only the new version numbers and their hashes.
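Each new `version()` entry in package.py needs the sha256 checksum of its release tarball; `spack checksum gromacs <version>` can generate these, or the hash can be computed by hand. A minimal sketch of the manual route, using a stand-in file (in practice this would be the downloaded gromacs-*.tar.gz from ftp.gromacs.org):

```shell
# Stand-in file; the real workflow hashes the actual release tarball.
printf 'placeholder tarball contents' > /tmp/gromacs-placeholder.tar.gz

# Print the 64-character sha256 digest that goes into the
# version('...', sha256='...') line in package.py.
sha256sum /tmp/gromacs-placeholder.tar.gz | awk '{print $1}'
```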
[mkandes@login01 packages]$ ls
amber
[mkandes@login01 packages]$ pwd
/home/mkandes/.spack/0.17.3/gpu/b/var/spack/repos/mkandes/packages
[mkandes@login01 packages]$ mkdir gromacs
[mkandes@login01 packages]$ cd gromacs/
[mkandes@login01 gromacs]$ wget https://raw.githubusercontent.com/sdsc/spack/67e74f0e081b45fa82d7ff097a03e4cd5b3757ea/var/spack/repos/builtin/packages/gromacs/package.py
--2023-09-06 19:41:56-- https://raw.githubusercontent.com/sdsc/spack/67e74f0e081b45fa82d7ff097a03e4cd5b3757ea/var/spack/repos/builtin/packages/gromacs/package.py
Resolving raw.githubusercontent.com (raw.githubusercontent.com)... 185.199.109.133, 185.199.108.133, 185.199.110.133, ...
Connecting to raw.githubusercontent.com (raw.githubusercontent.com)|185.199.109.133|:443... connected.
HTTP request sent, awaiting response... 200 OK
Length: 20761 (20K) [text/plain]
Saving to: ‘package.py’
package.py 100%[===================>] 20.27K --.-KB/s in 0s
2023-09-06 19:41:56 (158 MB/s) - ‘package.py’ saved [20761/20761]
[mkandes@login01 gromacs]$ wget https://raw.githubusercontent.com/sdsc/spack/67e74f0e081b45fa82d7ff097a03e4cd5b3757ea/var/spack/repos/builtin/packages/gromacs/gmxDetectCpu-cmake-3.14.patch
--2023-09-06 19:42:13-- https://raw.githubusercontent.com/sdsc/spack/67e74f0e081b45fa82d7ff097a03e4cd5b3757ea/var/spack/repos/builtin/packages/gromacs/gmxDetectCpu-cmake-3.14.patch
Resolving raw.githubusercontent.com (raw.githubusercontent.com)... 185.199.108.133, 185.199.111.133, 185.199.109.133, ...
Connecting to raw.githubusercontent.com (raw.githubusercontent.com)|185.199.108.133|:443... connected.
HTTP request sent, awaiting response... 200 OK
Length: 728 [text/plain]
Saving to: ‘gmxDetectCpu-cmake-3.14.patch’
gmxDetectCpu-cmake- 100%[===================>] 728 --.-KB/s in 0s
2023-09-06 19:42:14 (52.4 MB/s) - ‘gmxDetectCpu-cmake-3.14.patch’ saved [728/728]
[mkandes@login01 gromacs]$ wget https://raw.githubusercontent.com/sdsc/spack/67e74f0e081b45fa82d7ff097a03e4cd5b3757ea/var/spack/repos/builtin/packages/gromacs/gmxDetectSimd-cmake-3.14.patch
--2023-09-06 19:42:24-- https://raw.githubusercontent.com/sdsc/spack/67e74f0e081b45fa82d7ff097a03e4cd5b3757ea/var/spack/repos/builtin/packages/gromacs/gmxDetectSimd-cmake-3.14.patch
Resolving raw.githubusercontent.com (raw.githubusercontent.com)... 185.199.110.133, 185.199.109.133, 185.199.111.133, ...
Connecting to raw.githubusercontent.com (raw.githubusercontent.com)|185.199.110.133|:443... connected.
HTTP request sent, awaiting response... 200 OK
Length: 600 [text/plain]
Saving to: ‘gmxDetectSimd-cmake-3.14.patch’
gmxDetectSimd-cmake 100%[===================>] 600 --.-KB/s in 0s
2023-09-06 19:42:25 (62.0 MB/s) - ‘gmxDetectSimd-cmake-3.14.patch’ saved [600/600]
[mkandes@login01 gromacs]$ vi package.py
[mkandes@login01 gromacs]$ vi package.py
[mkandes@login01 gromacs]$ spack info gromacs
CMakePackage: gromacs
Description:
GROMACS (GROningen MAchine for Chemical Simulations) is a molecular
dynamics package primarily designed for simulations of proteins, lipids
and nucleic acids. It was originally developed in the Biophysical
Chemistry department of University of Groningen, and is now maintained
by contributors in universities and research centers across the world.
GROMACS is one of the fastest and most popular software packages
available and can run on CPUs as well as GPUs. It is free, open source
released under the GNU General Public License. Starting from version
4.6, GROMACS is released under the GNU Lesser General Public License.
Homepage: http://www.gromacs.org
Maintainers: @junghans @marvinbernhardt
Externally Detectable:
False
Tags:
None
Preferred version:
2023.1 https://ftp.gromacs.org/gromacs/gromacs-2023.1.tar.gz
Safe versions:
master [git] https://github.com/gromacs/gromacs.git on branch master
2023.1 https://ftp.gromacs.org/gromacs/gromacs-2023.1.tar.gz
2023 https://ftp.gromacs.org/gromacs/gromacs-2023.tar.gz
2022.6 https://ftp.gromacs.org/gromacs/gromacs-2022.6.tar.gz
2022.5 https://ftp.gromacs.org/gromacs/gromacs-2022.5.tar.gz
2022.4 https://ftp.gromacs.org/gromacs/gromacs-2022.4.tar.gz
2022.3 https://ftp.gromacs.org/gromacs/gromacs-2022.3.tar.gz
2022.2 https://ftp.gromacs.org/gromacs/gromacs-2022.2.tar.gz
2022.1 https://ftp.gromacs.org/gromacs/gromacs-2022.1.tar.gz
2022 https://ftp.gromacs.org/gromacs/gromacs-2022.tar.gz
2021.7 https://ftp.gromacs.org/gromacs/gromacs-2021.7.tar.gz
2021.6 https://ftp.gromacs.org/gromacs/gromacs-2021.6.tar.gz
2021.5 https://ftp.gromacs.org/gromacs/gromacs-2021.5.tar.gz
2021.4 https://ftp.gromacs.org/gromacs/gromacs-2021.4.tar.gz
2021.3 https://ftp.gromacs.org/gromacs/gromacs-2021.3.tar.gz
2021.2 https://ftp.gromacs.org/gromacs/gromacs-2021.2.tar.gz
2021.1 https://ftp.gromacs.org/gromacs/gromacs-2021.1.tar.gz
2021 https://ftp.gromacs.org/gromacs/gromacs-2021.tar.gz
2020.6 https://ftp.gromacs.org/gromacs/gromacs-2020.6.tar.gz
2020.5 https://ftp.gromacs.org/gromacs/gromacs-2020.5.tar.gz
2020.4 https://ftp.gromacs.org/gromacs/gromacs-2020.4.tar.gz
2020.3 https://ftp.gromacs.org/gromacs/gromacs-2020.3.tar.gz
2020.2 https://ftp.gromacs.org/gromacs/gromacs-2020.2.tar.gz
2020.1 https://ftp.gromacs.org/gromacs/gromacs-2020.1.tar.gz
2020 https://ftp.gromacs.org/gromacs/gromacs-2020.tar.gz
2019.6 https://ftp.gromacs.org/gromacs/gromacs-2019.6.tar.gz
2019.5 https://ftp.gromacs.org/gromacs/gromacs-2019.5.tar.gz
2019.4 https://ftp.gromacs.org/gromacs/gromacs-2019.4.tar.gz
2019.3 https://ftp.gromacs.org/gromacs/gromacs-2019.3.tar.gz
2019.2 https://ftp.gromacs.org/gromacs/gromacs-2019.2.tar.gz
2019.1 https://ftp.gromacs.org/gromacs/gromacs-2019.1.tar.gz
2019 https://ftp.gromacs.org/gromacs/gromacs-2019.tar.gz
2018.8 https://ftp.gromacs.org/gromacs/gromacs-2018.8.tar.gz
2018.5 https://ftp.gromacs.org/gromacs/gromacs-2018.5.tar.gz
2018.4 https://ftp.gromacs.org/gromacs/gromacs-2018.4.tar.gz
2018.3 https://ftp.gromacs.org/gromacs/gromacs-2018.3.tar.gz
2018.2 https://ftp.gromacs.org/gromacs/gromacs-2018.2.tar.gz
2018.1 https://ftp.gromacs.org/gromacs/gromacs-2018.1.tar.gz
2018 https://ftp.gromacs.org/gromacs/gromacs-2018.tar.gz
2016.6 https://ftp.gromacs.org/gromacs/gromacs-2016.6.tar.gz
2016.5 https://ftp.gromacs.org/gromacs/gromacs-2016.5.tar.gz
2016.4 https://ftp.gromacs.org/gromacs/gromacs-2016.4.tar.gz
2016.3 https://ftp.gromacs.org/gromacs/gromacs-2016.3.tar.gz
5.1.5 https://ftp.gromacs.org/gromacs/gromacs-5.1.5.tar.gz
5.1.4 https://ftp.gromacs.org/gromacs/gromacs-5.1.4.tar.gz
5.1.2 https://ftp.gromacs.org/gromacs/gromacs-5.1.2.tar.gz
4.6.7 https://ftp.gromacs.org/gromacs/gromacs-4.6.7.tar.gz
4.5.5 https://ftp.gromacs.org/gromacs/gromacs-4.5.5.tar.gz
Deprecated versions:
None
Variants:
Name [Default] When Allowed values Description
============================== ==== ==================== ==============================================================================
blas [off] -- on, off Enables an external BLAS library
build_type [RelWithDebInfo] [, ] Debug, Release, The build type to build
RelWithDebInfo,
MinSizeRel,
Reference,
RelWithAssert,
Profile
cuda [off] -- on, off Enable CUDA support
cycle_subcounters [off] -- on, off Enables cycle subcounters
double [off] -- on, off Produces a double precision version of the executables
hwloc [on] -- on, off Use the hwloc portable hardware locality library
ipo [off] -- on, off CMake interprocedural optimization
lapack [off] -- on, off Enables an external LAPACK library
mdrun_only [off] -- on, off Enables the build of a cut-down version of libgromacs and/or the mdrun program
mpi [on] -- on, off Activate MPI support (disable for Thread-MPI support)
nosuffix [off] -- on, off Disable default suffixes
opencl [off] -- on, off Enable OpenCL support
openmp [on] -- on, off Enables OpenMP at configure time
plumed [off] -- on, off Enable PLUMED support
relaxed_double_precision [off] -- on, off GMX_RELAXED_DOUBLE_PRECISION, use only for Fujitsu PRIMEHPC
shared [on] -- on, off Enables the build of shared libraries
sycl [off] -- on, off Enable SYCL support
Installation Phases:
cmake build install
Build Dependencies:
blas cmake cuda fftw-api hwloc lapack mpi plumed sycl
Link Dependencies:
blas cuda fftw-api hwloc lapack mpi plumed sycl
Run Dependencies:
None
Virtual Packages:
None
[mkandes@login01 gromacs]$
Testing the spec for the latest version of the 2022 series, which is 2022.6. The plumed variant has been disabled because this GROMACS version is not supported by PLUMED; 2022.3 is not supported either.
[mkandes@login02 ~]$ spack spec -l gromacs@2022.6 % gcc@10.2.0 +blas +cuda ~cycle_subcounters ~double +hwloc ~ipo +lapack ~mdrun_only +mpi ~nosuffix ~opencl +openmp ~plumed ~relaxed_double_precision +shared ~sycl ^openblas@0.3.18/$(spack find --format '{hash:7}' openblas@0.3.18 % ${SPACK_COMPILER} ~ilp64 threads=none) ^fftw@3.3.10/$(spack find --format '{hash:7}' fftw@3.3.10 % ${SPACK_COMPILER} ~mpi ~openmp) ^openmpi@4.1.3/$(spack find --format '{hash:7}' openmpi@4.1.3 % ${SPACK_COMPILER})
Input spec
--------------------------------
gromacs@2022.6%gcc@10.2.0+blas+cuda~cycle_subcounters~double+hwloc~ipo+lapack~mdrun_only+mpi~nosuffix~opencl+openmp~plumed~relaxed_double_precision+shared~sycl
^fftw@3.3.10%gcc@10.2.0 cflags="-O2 -march=native" cxxflags="-O2 -march=native" fflags="-O2 -march=native" ~mpi~openmp~pfft_patches precision=double,float arch=linux-rocky8-cascadelake
^openblas@0.3.18%gcc@10.2.0 cflags="-O2 -march=native" cxxflags="-O2 -march=native" fflags="-O2 -march=native" ~bignuma~consistent_fpcsr~ilp64+locking+pic+shared threads=none arch=linux-rocky8-cascadelake
^openmpi@4.1.3%gcc@10.2.0 cflags="-O2 -march=native" cxxflags="-O2 -march=native" fflags="-O2 -march=native" ~atomics+cuda~cxx~cxx_exceptions~gpfs~internal-hwloc~java+legacylaunchers+lustre~memchecker+pmi+pmix+romio~rsh~singularity+static+vt+wrapper-rpath cuda_arch=70,80 fabrics=ucx schedulers=slurm arch=linux-rocky8-cascadelake
^cuda@11.2.2%gcc@10.2.0 cflags="-O2 -march=native" cxxflags="-O2 -march=native" fflags="-O2 -march=native" ~dev arch=linux-rocky8-cascadelake
^libxml2@2.9.12%gcc@10.2.0 cflags="-O2 -march=native" cxxflags="-O2 -march=native" fflags="-O2 -march=native" ~python arch=linux-rocky8-cascadelake
^libiconv@1.16%gcc@10.2.0 cflags="-O2 -march=native" cxxflags="-O2 -march=native" fflags="-O2 -march=native" libs=shared,static arch=linux-rocky8-cascadelake
^xz@5.2.5%gcc@10.2.0 cflags="-O2 -march=native" cxxflags="-O2 -march=native" fflags="-O2 -march=native" ~pic libs=shared,static arch=linux-rocky8-cascadelake
^zlib@1.2.11%gcc@10.2.0 cflags="-O2 -march=native" cxxflags="-O2 -march=native" fflags="-O2 -march=native" +optimize+pic+shared arch=linux-rocky8-cascadelake
^hwloc@2.6.0%gcc@10.2.0 cflags="-O2 -march=native" cxxflags="-O2 -march=native" fflags="-O2 -march=native" ~cairo+cuda~gl~libudev+libxml2~netloc~nvml~opencl+pci~rocm+shared arch=linux-rocky8-cascadelake
^libpciaccess@0.16%gcc@10.2.0 cflags="-O2 -march=native" cxxflags="-O2 -march=native" fflags="-O2 -march=native" arch=linux-rocky8-cascadelake
^ncurses@6.2%gcc@10.2.0 cflags="-O2 -march=native" cxxflags="-O2 -march=native" fflags="-O2 -march=native" ~symlinks+termlib abi=none arch=linux-rocky8-cascadelake
^libevent@2.1.8%gcc@10.2.0 cflags="-O2 -march=native" cxxflags="-O2 -march=native" fflags="-O2 -march=native" ~openssl arch=linux-rocky8-cascadelake
^lustre@2.15.2%gcc@10.2.0 cflags="-O2 -march=native" cxxflags="-O2 -march=native" fflags="-O2 -march=native" arch=linux-rocky8-cascadelake
^numactl@2.0.14%gcc@10.2.0 cflags="-O2 -march=native" cxxflags="-O2 -march=native" fflags="-O2 -march=native" patches=4e1d78cbbb85de625bad28705e748856033eaafab92a66dffd383a3d7e00cc94,62fc8a8bf7665a60e8f4c93ebbd535647cebf74198f7afafec4c085a8825c006,ff37630df599cfabf0740518b91ec8daaf18e8f288b19adaae5364dc1f6b2296 arch=linux-rocky8-cascadelake
^pmix@3.2.1%gcc@10.2.0 cflags="-O2 -march=native" cxxflags="-O2 -march=native" fflags="-O2 -march=native" ~docs+pmi_backwards_compatibility~restful arch=linux-rocky8-cascadelake
^slurm@21.08.8%gcc@10.2.0 cflags="-O2 -march=native" cxxflags="-O2 -march=native" fflags="-O2 -march=native" ~gtk~hdf5~hwloc~mariadb~pmix+readline~restd sysconfdir=PREFIX/etc arch=linux-rocky8-cascadelake
^ucx@1.10.1%gcc@10.2.0 cflags="-O2 -march=native" cxxflags="-O2 -march=native" fflags="-O2 -march=native" ~assertions~cm+cma+cuda+dc~debug+dm+gdrcopy+ib-hw-tm~java~knem~logging+mlx5-dv+optimizations~parameter_checking+pic+rc~rocm+thread_multiple+ud~xpmem cuda_arch=70,80 arch=linux-rocky8-cascadelake
^gdrcopy@2.2%gcc@10.2.0 cflags="-O2 -march=native" cxxflags="-O2 -march=native" fflags="-O2 -march=native" arch=linux-rocky8-cascadelake
^rdma-core@43.0%gcc@10.2.0 cflags="-O2 -march=native" cxxflags="-O2 -march=native" fflags="-O2 -march=native" ~ipo build_type=RelWithDebInfo arch=linux-rocky8-cascadelake
Concretized
--------------------------------
r77p3et gromacs@2022.6%gcc@10.2.0 cflags="-O2 -march=native" cxxflags="-O2 -march=native" fflags="-O2 -march=native" +blas+cuda~cycle_subcounters~double+hwloc~ipo+lapack~mdrun_only+mpi~nosuffix~opencl+openmp~plumed~relaxed_double_precision+shared~sycl build_type=RelWithDebInfo arch=linux-rocky8-cascadelake
2577x2s ^cmake@3.21.4%gcc@10.2.0 cflags="-O2 -march=native" cxxflags="-O2 -march=native" fflags="-O2 -march=native" ~doc+ncurses~openssl+ownlibs~qt build_type=Release arch=linux-rocky8-cascadelake
5jrknc3 ^ncurses@6.2%gcc@10.2.0 cflags="-O2 -march=native" cxxflags="-O2 -march=native" fflags="-O2 -march=native" ~symlinks+termlib abi=none arch=linux-rocky8-cascadelake
blza2ps ^cuda@11.2.2%gcc@10.2.0 cflags="-O2 -march=native" cxxflags="-O2 -march=native" fflags="-O2 -march=native" ~dev arch=linux-rocky8-cascadelake
2q4yola ^libxml2@2.9.12%gcc@10.2.0 cflags="-O2 -march=native" cxxflags="-O2 -march=native" fflags="-O2 -march=native" ~python arch=linux-rocky8-cascadelake
5a3xt3s ^libiconv@1.16%gcc@10.2.0 cflags="-O2 -march=native" cxxflags="-O2 -march=native" fflags="-O2 -march=native" libs=shared,static arch=linux-rocky8-cascadelake
5xho2dj ^xz@5.2.5%gcc@10.2.0 cflags="-O2 -march=native" cxxflags="-O2 -march=native" fflags="-O2 -march=native" ~pic libs=shared,static arch=linux-rocky8-cascadelake
2c5fvip ^zlib@1.2.11%gcc@10.2.0 cflags="-O2 -march=native" cxxflags="-O2 -march=native" fflags="-O2 -march=native" +optimize+pic+shared arch=linux-rocky8-cascadelake
7ahyh5v ^fftw@3.3.10%gcc@10.2.0 cflags="-O2 -march=native" cxxflags="-O2 -march=native" fflags="-O2 -march=native" ~mpi~openmp~pfft_patches precision=double,float arch=linux-rocky8-cascadelake
okiyq35 ^hwloc@2.6.0%gcc@10.2.0 cflags="-O2 -march=native" cxxflags="-O2 -march=native" fflags="-O2 -march=native" ~cairo+cuda~gl~libudev+libxml2~netloc~nvml~opencl+pci~rocm+shared arch=linux-rocky8-cascadelake
x3u2rw4 ^libpciaccess@0.16%gcc@10.2.0 cflags="-O2 -march=native" cxxflags="-O2 -march=native" fflags="-O2 -march=native" arch=linux-rocky8-cascadelake
lsmegf6 ^openblas@0.3.18%gcc@10.2.0 cflags="-O2 -march=native" cxxflags="-O2 -march=native" fflags="-O2 -march=native" ~bignuma~consistent_fpcsr~ilp64+locking+pic+shared threads=none arch=linux-rocky8-cascadelake
gzzscfu ^openmpi@4.1.3%gcc@10.2.0 cflags="-O2 -march=native" cxxflags="-O2 -march=native" fflags="-O2 -march=native" ~atomics+cuda~cxx~cxx_exceptions~gpfs~internal-hwloc~java+legacylaunchers+lustre~memchecker+pmi+pmix+romio~rsh~singularity+static+vt+wrapper-rpath cuda_arch=70,80 fabrics=ucx schedulers=slurm arch=linux-rocky8-cascadelake
ne2joyw ^libevent@2.1.8%gcc@10.2.0 cflags="-O2 -march=native" cxxflags="-O2 -march=native" fflags="-O2 -march=native" ~openssl arch=linux-rocky8-cascadelake
73aggpy ^lustre@2.15.2%gcc@10.2.0 cflags="-O2 -march=native" cxxflags="-O2 -march=native" fflags="-O2 -march=native" arch=linux-rocky8-cascadelake
34rinp4 ^numactl@2.0.14%gcc@10.2.0 cflags="-O2 -march=native" cxxflags="-O2 -march=native" fflags="-O2 -march=native" patches=4e1d78cbbb85de625bad28705e748856033eaafab92a66dffd383a3d7e00cc94,62fc8a8bf7665a60e8f4c93ebbd535647cebf74198f7afafec4c085a8825c006,ff37630df599cfabf0740518b91ec8daaf18e8f288b19adaae5364dc1f6b2296 arch=linux-rocky8-cascadelake
43uenl2 ^pmix@3.2.1%gcc@10.2.0 cflags="-O2 -march=native" cxxflags="-O2 -march=native" fflags="-O2 -march=native" ~docs+pmi_backwards_compatibility~restful arch=linux-rocky8-cascadelake
nflzb3l ^slurm@21.08.8%gcc@10.2.0 cflags="-O2 -march=native" cxxflags="-O2 -march=native" fflags="-O2 -march=native" ~gtk~hdf5~hwloc~mariadb~pmix+readline~restd sysconfdir=PREFIX/etc arch=linux-rocky8-cascadelake
msro2p7 ^ucx@1.10.1%gcc@10.2.0 cflags="-O2 -march=native" cxxflags="-O2 -march=native" fflags="-O2 -march=native" ~assertions~cm+cma+cuda+dc~debug+dm+gdrcopy+ib-hw-tm~java~knem~logging+mlx5-dv+optimizations~parameter_checking+pic+rc~rocm+thread_multiple+ud~xpmem cuda_arch=70,80 arch=linux-rocky8-cascadelake
nefrbxp ^gdrcopy@2.2%gcc@10.2.0 cflags="-O2 -march=native" cxxflags="-O2 -march=native" fflags="-O2 -march=native" arch=linux-rocky8-cascadelake
k7hbyv2 ^rdma-core@43.0%gcc@10.2.0 cflags="-O2 -march=native" cxxflags="-O2 -march=native" fflags="-O2 -march=native" ~ipo build_type=RelWithDebInfo arch=linux-rocky8-cascadelake
[mkandes@login02 ~]$
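The long `spack spec` command above pins each dependency to an already-installed instance by splicing its short hash in via command substitution. A stubbed sketch of that pattern (the `spack` function below is a mock that returns a fixed hash, standing in for the real `spack find --format '{hash:7}' …` on the cluster):

```shell
# Mock of `spack find --format '{hash:7}' ...`; on the real system
# this prints the 7-character hash of the matching installed package.
spack() { echo 'lsmegf6'; }

SPACK_COMPILER='gcc@10.2.0'

# Compose a pinned-dependency fragment exactly as in the spec above.
SPACK_DEPENDENCIES="^openblas@0.3.18/$(spack find --format '{hash:7}' openblas@0.3.18 % ${SPACK_COMPILER} ~ilp64 threads=none)"

echo "${SPACK_DEPENDENCIES}"   # → ^openblas@0.3.18/lsmegf6
```

Pinning by hash this way guarantees the concretizer reuses the exact existing openblas, fftw, and openmpi installs rather than rebuilding them.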
Running the test build spec.
[mkandes@login01 specs]$ pwd
/home/mkandes/.spack/0.17.3/gpu/b/etc/spack/sdsc/expanse/0.17.3/gpu/b/specs
[mkandes@login01 specs]$ ls
amber@22.o24832236.exp-15-58 amber@22.o24832341.exp-15-58 amber@22.o24832541.exp-15-58 amber@22.o24837084.exp-15-58 AmberTools23.tar.bz2
amber@22.o24832242.exp-15-58 amber@22.o24832352.exp-15-58 amber@22.o24832674.exp-15-58 amber@22.sh gromacs@2022.6.sh
amber@22.o24832319.exp-15-58 amber@22.o24832435.exp-15-58 amber@22.o24832858.exp-15-58 Amber22.tar.bz2 spec.2461822
[mkandes@login01 specs]$ cat gromacs@2022.6.sh
#!/usr/bin/env bash
#SBATCH --job-name=gromacs@2022.6
#SBATCH --account=use300
#SBATCH --partition=ind-gpu-shared
#SBATCH --nodes=1
#SBATCH --ntasks-per-node=1
#SBATCH --cpus-per-task=10
#SBATCH --mem=93G
#SBATCH --gpus=1
#SBATCH --time=48:00:00
#SBATCH --output=%x.o%j.%N
declare -xr LOCAL_TIME="$(date +'%Y%m%dT%H%M%S%z')"
declare -xir UNIX_TIME="$(date +'%s')"
declare -xr SYSTEM_NAME='expanse'
declare -xr SPACK_VERSION='0.17.3'
declare -xr SPACK_INSTANCE_NAME='gpu'
declare -xr SPACK_INSTANCE_VERSION='b'
declare -xr SPACK_INSTANCE_DIR="/cm/shared/apps/spack/${SPACK_VERSION}/${SPACK_INSTANCE_NAME}/${SPACK_INSTANCE_VERSION}"
declare -xr SLURM_JOB_SCRIPT="$(scontrol show job ${SLURM_JOB_ID} | awk -F= '/Command=/{print $2}')"
declare -xr SLURM_JOB_MD5SUM="$(md5sum ${SLURM_JOB_SCRIPT})"
declare -xr SCHEDULER_MODULE='slurm'
declare -xr COMPILER_MODULE='gcc/10.2.0'
declare -xr MPI_MODULE='openmpi/4.1.3'
declare -xr CUDA_MODULE='cuda/11.2.2'
declare -xr CMAKE_MODULE='cmake/3.21.4'
echo "${UNIX_TIME} ${SLURM_JOB_ID} ${SLURM_JOB_MD5SUM} ${SLURM_JOB_DEPENDENCY}"
echo ""
cat "${SLURM_JOB_SCRIPT}"
module purge
module load "${SCHEDULER_MODULE}"
. "${SPACK_INSTANCE_DIR}/share/spack/setup-env.sh"
module use "${SPACK_ROOT}/share/spack/lmod/linux-rocky8-x86_64/Core"
module load "${COMPILER_MODULE}"
module load "${MPI_MODULE}"
module load "${CUDA_MODULE}"
module load "${CMAKE_MODULE}"
module list
shopt -s expand_aliases
source ~/.bashrc
declare -xr SPACK_PACKAGE='gromacs@2022.6'
declare -xr SPACK_COMPILER='gcc@10.2.0'
declare -xr SPACK_VARIANTS='+blas +cuda ~cycle_subcounters ~double +hwloc ~ipo +lapack ~mdrun_only +mpi ~nosuffix ~opencl +openmp ~plumed ~relaxed_double_precision +shared ~sycl'
declare -xr SPACK_DEPENDENCIES="^openblas@0.3.18/$(spack find --format '{hash:7}' openblas@0.3.18 % ${SPACK_COMPILER} ~ilp64 threads=none) ^fftw@3.3.10/$(spack find --format '{hash:7}' fftw@3.3.10 % ${SPACK_COMPILER} ~mpi ~openmp) ^openmpi@4.1.3/$(spack find --format '{hash:7}' openmpi@4.1.3 % ${SPACK_COMPILER})"
declare -xr SPACK_SPEC="${SPACK_PACKAGE} % ${SPACK_COMPILER} ${SPACK_VARIANTS} ${SPACK_DEPENDENCIES}"
printenv
spack config get compilers
spack config get config
spack config get mirrors
spack config get modules
spack config get packages
spack config get repos
spack config get upstreams
time -p spack spec --long --namespaces --types --reuse $(echo "${SPACK_SPEC}")
if [[ "${?}" -ne 0 ]]; then
echo 'ERROR: spack concretization failed.'
fi
time -p spack install -v --jobs "${SLURM_CPUS_PER_TASK}" --fail-fast --yes-to-all --reuse $(echo "${SPACK_SPEC}")
if [[ "${?}" -ne 0 ]]; then
echo 'ERROR: spack install failed.'
exit 1
fi
spack module lmod refresh --delete-tree -y
#sbatch --dependency="afterok:${SLURM_JOB_ID}" ''
sleep 30
[mkandes@login01 specs]$ sbatch gromacs@2022.6.sh
Submitted batch job 25087629
[mkandes@login01 specs]$
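The job script above logs its own contents by recovering its path from `scontrol show job` and extracting the `Command=` field with awk. That field extraction can be checked offline against a sample line (the path here is illustrative):

```shell
# A line in the format that `scontrol show job <id>` emits; the awk
# filter below is the same one used in the job script.
sample='Command=/home/mkandes/specs/gromacs@2022.6.sh'

echo "${sample}" | awk -F= '/Command=/{print $2}'   # → /home/mkandes/specs/gromacs@2022.6.sh
```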
Build appears to have succeeded.
-rw-r--r-- 1 mkandes use300 9.6M Sep 6 20:25 gromacs@2022.6.o25087637.exp-2-57
[mkandes@login01 specs]$ less gromacs@2022.6.o25087637.exp-2-57
[mkandes@login01 specs]$ module avail
----------------------------------------------- /home/mkandes/.spack/0.17.3/gpu/b/share/spack/lmod/linux-rocky8-x86_64 -----------------------------------------------
openmpi/4.1.3-gzzscfu/gcc/10.2.0/gromacs/2022.6/r77p3et-omp
-------------------------------------------- /cm/shared/apps/spack/0.17.3/cpu/b/share/spack/lmod/linux-rocky8-x86_64/Core --------------------------------------------
anaconda3/2021.05/q4munrg gcc/10.2.0/npcyll4 intel/19.1.3.304/6pv46so pigz/2.6/bgymyil ucx/1.10.1/wla3unl
aocc/3.2.0/io3s466 gh/2.0.0/mkz3uxl matlab/2022b/lefe4oq rclone/1.56.2/mldjorr
aria2/1.35.0/q32jtg2 git-lfs/2.11.0/kmruniy mercurial/5.8/qmgrjvl sratoolkit/2.10.9/rn4humf
entrezdirect/10.7.20190114/6pkkpx2 git/2.31.1/ldetm5y parallel/20210922/sqru6rr subversion/1.14.0/qpzq6zs
----------------------------------------------------------------------- /cm/local/modulefiles ------------------------------------------------------------------------
cmjob cuda-dcgm/3.1.3.1 docker/20.10.21 lua/5.4.4 openmpi/mlnx/gcc/64/4.1.5a1 shared (L) singularitypro/3.9 slurm/expanse/21.08.8 (L)
----------------------------------------------------------------- /cm/shared/apps/access/modulefiles -----------------------------------------------------------------
accessusage/0.5-1 cue-login-env
----------------------------------------------------------------------- /usr/share/modulefiles -----------------------------------------------------------------------
DefaultModules (L) cpu/0.17.1 (c) cpu/0.17.3b (c,L,D) gpu/0.17.1 (g) gpu/0.17.3b (g,D) nostack/0.17.1 (e) nostack/0.17.3b (e,D)
cpu/0.15.4 (c) cpu/0.17.3a (c) gpu/0.15.4 (g) gpu/0.17.3a (g) nostack/0.15.4 (e) nostack/0.17.3a (e)
----------------------------------------------------------------------- /cm/shared/modulefiles -----------------------------------------------------------------------
AMDuProf/3.4.475 default-environment sdsc/1.0 (L) slurm/expanse/current slurm/expanse/21.08.8 (D)
Where:
L: Module is loaded
c: built natively for AMD Rome
e: not architecture specific
g: built natively for Intel Skylake
D: Default Module
Module defaults are chosen based on Find First Rules due to Name/Version/Version modules found in the module tree.
See https://lmod.readthedocs.io/en/latest/060_locating.html for details.
Use "module spider" to find all possible modules and extensions.
Use "module keyword key1 key2 ..." to search for all possible modules matching any of the "keys".
[mkandes@login01 specs]$
[mkandes@login01 specs]$ tail -n 20 gromacs@2022.6.o25087637.exp-2-57
==> [2023-09-06-20:22:49.404629] FILTER FILE: /home/mkandes/.spack/0.17.3/gpu/b/opt/spack/linux-rocky8-cascadelake/gcc-10.2.0/gromacs-2022.6-r77p3etedasdtuaqdutybbla6imzjoyg/share/cmake/gromacs_mpi/gromacs_mpi-config-version.cmake [replacing "\/cm\/shared\/apps\/spack\/0\.17\.3\/gpu\/b\/lib\/spack\/env\/gcc\/gfortran"]
==> [2023-09-06-20:22:49.499700] FILTER FILE: /home/mkandes/.spack/0.17.3/gpu/b/opt/spack/linux-rocky8-cascadelake/gcc-10.2.0/gromacs-2022.6-r77p3etedasdtuaqdutybbla6imzjoyg/share/cmake/gromacs_mpi/libgromacs-relwithdebinfo.cmake [replacing "\/cm\/shared\/apps\/spack\/0\.17\.3\/gpu\/b\/lib\/spack\/env\/gcc\/gfortran"]
==> [2023-09-06-20:22:49.588055] FILTER FILE: /home/mkandes/.spack/0.17.3/gpu/b/opt/spack/linux-rocky8-cascadelake/gcc-10.2.0/gromacs-2022.6-r77p3etedasdtuaqdutybbla6imzjoyg/share/cmake/gromacs_mpi/gromacs_mpi-config.cmake [replacing "\/cm\/shared\/apps\/spack\/0\.17\.3\/gpu\/b\/lib\/spack\/env\/gcc\/gfortran"]
==> [2023-09-06-20:22:49.688068] FILTER FILE: /home/mkandes/.spack/0.17.3/gpu/b/opt/spack/linux-rocky8-cascadelake/gcc-10.2.0/gromacs-2022.6-r77p3etedasdtuaqdutybbla6imzjoyg/share/cmake/gromacs_mpi/gromacs-hints_mpi.cmake [replacing "\/cm\/shared\/apps\/spack\/0\.17\.3\/gpu\/b\/lib\/spack\/env\/cc"]
==> [2023-09-06-20:22:49.768033] FILTER FILE: /home/mkandes/.spack/0.17.3/gpu/b/opt/spack/linux-rocky8-cascadelake/gcc-10.2.0/gromacs-2022.6-r77p3etedasdtuaqdutybbla6imzjoyg/share/cmake/gromacs_mpi/libgromacs.cmake [replacing "\/cm\/shared\/apps\/spack\/0\.17\.3\/gpu\/b\/lib\/spack\/env\/cc"]
==> [2023-09-06-20:22:49.833035] FILTER FILE: /home/mkandes/.spack/0.17.3/gpu/b/opt/spack/linux-rocky8-cascadelake/gcc-10.2.0/gromacs-2022.6-r77p3etedasdtuaqdutybbla6imzjoyg/share/cmake/gromacs_mpi/gromacs_mpi-config-version.cmake [replacing "\/cm\/shared\/apps\/spack\/0\.17\.3\/gpu\/b\/lib\/spack\/env\/cc"]
==> [2023-09-06-20:22:49.902619] FILTER FILE: /home/mkandes/.spack/0.17.3/gpu/b/opt/spack/linux-rocky8-cascadelake/gcc-10.2.0/gromacs-2022.6-r77p3etedasdtuaqdutybbla6imzjoyg/share/cmake/gromacs_mpi/libgromacs-relwithdebinfo.cmake [replacing "\/cm\/shared\/apps\/spack\/0\.17\.3\/gpu\/b\/lib\/spack\/env\/cc"]
==> [2023-09-06-20:22:50.012587] FILTER FILE: /home/mkandes/.spack/0.17.3/gpu/b/opt/spack/linux-rocky8-cascadelake/gcc-10.2.0/gromacs-2022.6-r77p3etedasdtuaqdutybbla6imzjoyg/share/cmake/gromacs_mpi/gromacs_mpi-config.cmake [replacing "\/cm\/shared\/apps\/spack\/0\.17\.3\/gpu\/b\/lib\/spack\/env\/cc"]
==> [2023-09-06-20:22:50.110378] FILTER FILE: /home/mkandes/.spack/0.17.3/gpu/b/opt/spack/linux-rocky8-cascadelake/gcc-10.2.0/gromacs-2022.6-r77p3etedasdtuaqdutybbla6imzjoyg/share/cmake/gromacs_mpi/gromacs-hints_mpi.cmake [replacing "\-Wl\,\-\-enable\-new\-dtags"]
==> [2023-09-06-20:22:50.184303] FILTER FILE: /home/mkandes/.spack/0.17.3/gpu/b/opt/spack/linux-rocky8-cascadelake/gcc-10.2.0/gromacs-2022.6-r77p3etedasdtuaqdutybbla6imzjoyg/share/cmake/gromacs_mpi/libgromacs.cmake [replacing "\-Wl\,\-\-enable\-new\-dtags"]
==> [2023-09-06-20:22:50.252984] FILTER FILE: /home/mkandes/.spack/0.17.3/gpu/b/opt/spack/linux-rocky8-cascadelake/gcc-10.2.0/gromacs-2022.6-r77p3etedasdtuaqdutybbla6imzjoyg/share/cmake/gromacs_mpi/gromacs_mpi-config-version.cmake [replacing "\-Wl\,\-\-enable\-new\-dtags"]
==> [2023-09-06-20:22:50.315711] FILTER FILE: /home/mkandes/.spack/0.17.3/gpu/b/opt/spack/linux-rocky8-cascadelake/gcc-10.2.0/gromacs-2022.6-r77p3etedasdtuaqdutybbla6imzjoyg/share/cmake/gromacs_mpi/libgromacs-relwithdebinfo.cmake [replacing "\-Wl\,\-\-enable\-new\-dtags"]
==> [2023-09-06-20:22:50.361399] FILTER FILE: /home/mkandes/.spack/0.17.3/gpu/b/opt/spack/linux-rocky8-cascadelake/gcc-10.2.0/gromacs-2022.6-r77p3etedasdtuaqdutybbla6imzjoyg/share/cmake/gromacs_mpi/gromacs_mpi-config.cmake [replacing "\-Wl\,\-\-enable\-new\-dtags"]
==> gromacs: Successfully installed gromacs-2022.6-r77p3etedasdtuaqdutybbla6imzjoyg
Fetch: 8.07s. Build: 22m 24.08s. Total: 22m 32.15s.
[+] /home/mkandes/.spack/0.17.3/gpu/b/opt/spack/linux-rocky8-cascadelake/gcc-10.2.0/gromacs-2022.6-r77p3etedasdtuaqdutybbla6imzjoyg
real 1490.17
user 2232.72
sys 3317.83
==> Regenerating lmod module files
[mkandes@login01 specs]$
I will test the build tomorrow.
The build was tested successfully with the standard water benchmark.
[mkandes@login01 gromacs]$ pwd
/home/mkandes/benchmarks/gromacs
[mkandes@login01 gromacs]$ grep Performance: gromacs-2022.6-r77p3et*
gromacs-2022.6-r77p3et-gcc-10.2.0-i62tgso-openmpi-4.1.3-gzzscfu-water-cut1.0_GMX50_bare-3072-1-node-10-mpi-1-omp-1v100.o25091837.0.exp-7-59:Performance: 3.527 6.804
gromacs-2022.6-r77p3et-gcc-10.2.0-i62tgso-openmpi-4.1.3-gzzscfu-water-cut1.0_GMX50_bare-3072-1-node-1-mpi-10-omp-1v100.o25092621.0.exp-7-59:Performance: 3.358 7.146
gromacs-2022.6-r77p3et-gcc-10.2.0-i62tgso-openmpi-4.1.3-gzzscfu-water-cut1.0_GMX50_bare-3072-1-node-20-mpi-2-omp-4v100.o25091669.0.exp-7-59:Performance: 7.156 3.354
gromacs-2022.6-r77p3et-gcc-10.2.0-i62tgso-openmpi-4.1.3-gzzscfu-water-cut1.0_GMX50_bare-3072-1-node-2-mpi-5-omp-1v100.o25092206.0.exp-7-59:Performance: 2.876 8.346
gromacs-2022.6-r77p3et-gcc-10.2.0-i62tgso-openmpi-4.1.3-gzzscfu-water-cut1.0_GMX50_bare-3072-1-node-40-mpi-1-omp-4v100.o25091734.0.exp-7-59:Performance: 6.787 3.536
gromacs-2022.6-r77p3et-gcc-10.2.0-i62tgso-openmpi-4.1.3-gzzscfu-water-cut1.0_GMX50_bare-3072-1-node-5-mpi-2-omp-1v100.o25092021.0.exp-7-59:Performance: 3.242 7.404
gromacs-2022.6-r77p3et-gcc-10.2.0-i62tgso-openmpi-4.1.3-gzzscfu-water-cut1.0_GMX50_bare-3072-1-node-8-mpi-5-omp-4v100.o25091599.0.exp-7-59:Performance: 5.906 4.063
[mkandes@login01 gromacs]$
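The two numbers on each `Performance:` line are ns/day and hours/ns. A small sketch (assuming the `grep Performance:` output format shown above; the filenames in the usage note are the real logs, but the helper name is my own) to rank the runs by ns/day:

```shell
# Hedged sketch: rank water-benchmark runs by ns/day. Input is the
# "grep Performance: <logs>" output above, i.e. lines of the form
#   <filename>:Performance: <ns/day> <hours/ns>
sort_by_ns_per_day() {
  awk -F'Performance:' '{
    sub(/:$/, "", $1)           # drop the trailing colon after the filename
    split($2, f, " ")           # f[1] is ns/day, f[2] is hours/ns
    print f[1] "\t" $1
  }' | sort -rn                 # numeric sort, best ns/day first
}
```

For example, `grep Performance: gromacs-2022.6-r77p3et* | sort_by_ns_per_day` would list the 20-MPI x 2-OMP run (7.156 ns/day) first.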
Resync local fork with the sdsc-0.17.3 upstream branch.
[mkandes@login01 packages]$ pwd
/home/mkandes/software/spack/repos/mkandes/var/spack/repos/sdsc/packages
[mkandes@login01 packages]$ git log
commit 76f12648347505b4768cffae4ae951fbd36b7b75 (HEAD -> sdsc-0.17.3, origin/sdsc-0.17.3, origin/HEAD)
Author: Marty Kandes <mkandes@sdsc.edu>
Date: Wed Aug 30 12:00:01 2023 -0700
Add elpa@2021.05.001 % gcc@10.2.0 ^openmpi@4.1.3 + ^intel-mpi@2019.10.317
commit 494c75a06253920a9e9c61f2bfc5f5ee6f0a1fae (upstream/sdsc-0.17.3)
Author: Marty Kandes <mkandes@sdsc.edu>
Date: Fri Aug 25 14:36:18 2023 -0700
Remove nwchem@7.0.2 % aocc@3.2.0 ^openmpi@4.1.3 from exp/0.17.3/cpu/b
We've continued to run into a runtime communication-related problem with
our AOCC-based builds [1]. As such, we've decided to remove them from
the aocc@3.2.0 ^openmpi@4.1.3 dependency chain in expanse/0.17.3/cpu/b
for the time being. We will revisit in the future.
[1] https://github.com/sdsc/spack/issues/62
commit 29e2b21be9f48013768480725549df8ee1ee36ce
Author: Marty Kandes <mkandes@sdsc.edu>
Date: Fri Aug 25 06:54:26 2023 -0700
[mkandes@login01 packages]$ git branch
* sdsc-0.17.3
sdsc-0.17.3-gh-33-spec-abinit
sdsc-0.17.3-gh-46-pkg-spec-wannier90
sdsc-0.17.3-gh-86-pkg-spec-pyscf
[mkandes@login01 packages]$ git fetch origin
[mkandes@login01 packages]$ git upstreams
git: 'upstreams' is not a git command. See 'git --help'.
[mkandes@login01 packages]$ git log
commit 76f12648347505b4768cffae4ae951fbd36b7b75 (HEAD -> sdsc-0.17.3, origin/sdsc-0.17.3, origin/HEAD)
Author: Marty Kandes <mkandes@sdsc.edu>
Date: Wed Aug 30 12:00:01 2023 -0700
Add elpa@2021.05.001 % gcc@10.2.0 ^openmpi@4.1.3 + ^intel-mpi@2019.10.317
commit 494c75a06253920a9e9c61f2bfc5f5ee6f0a1fae (upstream/sdsc-0.17.3)
Author: Marty Kandes <mkandes@sdsc.edu>
Date: Fri Aug 25 14:36:18 2023 -0700
Remove nwchem@7.0.2 % aocc@3.2.0 ^openmpi@4.1.3 from exp/0.17.3/cpu/b
We've continued to run into a runtime communication-related problem with
our AOCC-based builds [1]. As such, we've decided to remove them from
the aocc@3.2.0 ^openmpi@4.1.3 dependency chain in expanse/0.17.3/cpu/b
for the time being. We will revisit in the future.
[1] https://github.com/sdsc/spack/issues/62
commit 29e2b21be9f48013768480725549df8ee1ee36ce
Author: Marty Kandes <mkandes@sdsc.edu>
Date: Fri Aug 25 06:54:26 2023 -0700
[mkandes@login01 packages]$ git fetch --tags upstream
remote: Enumerating objects: 16, done.
remote: Counting objects: 100% (8/8), done.
remote: Total 16 (delta 8), reused 8 (delta 8), pack-reused 8
Unpacking objects: 100% (16/16), 6.54 KiB | 4.00 KiB/s, done.
From https://github.com/sdsc/spack
494c75a062..75dc94e08f sdsc-0.17.3 -> upstream/sdsc-0.17.3
[mkandes@login01 packages]$ git merge upstream/sdsc-0.17.3
Updating 76f1264834..75dc94e08f
Fast-forward
.../elpa@2021.05.001.o24925383.exp-15-56 | 704 ++++++++++++++++++++
.../elpa@2021.05.001.o24925338.exp-15-56 | 740 +++++++++++++++++++++
2 files changed, 1444 insertions(+)
create mode 100644 etc/spack/sdsc/expanse/0.17.3/cpu/b/specs/gcc@10.2.0/intel-mpi@2019.10.317/elpa@2021.05.001.o24925383.exp-15-56
create mode 100644 etc/spack/sdsc/expanse/0.17.3/cpu/b/specs/gcc@10.2.0/openmpi@4.1.3/elpa@2021.05.001.o24925338.exp-15-56
[mkandes@login01 packages]$ git log
commit 75dc94e08f95d22e3a9414410de016f2dc6295d9 (HEAD -> sdsc-0.17.3, upstream/sdsc-0.17.3)
Author: Marty Kandes <mkandes@sdsc.edu>
Date: Wed Aug 30 12:24:16 2023 -0700
Deploy elpa@2021.05.001 % gcc@10.2.0 ^openmpi@4.1.3 + ^intel-mpi@2019.10.317
commit cc47ce9571fa1b5dabc73765bd98a41678deedc4
Merge: 494c75a062 76f1264834
Author: Marty Kandes <mkandes@sdsc.edu>
Date: Wed Aug 30 12:04:24 2023 -0700
Merge pull request #99 from mkandes/sdsc-0.17.3
Add elpa@2021.05.001 % gcc@10.2.0 ^openmpi@4.1.3 + ^intel-mpi@2019.10.317
commit 76f12648347505b4768cffae4ae951fbd36b7b75 (origin/sdsc-0.17.3, origin/HEAD)
Author: Marty Kandes <mkandes@sdsc.edu>
Date: Wed Aug 30 12:00:01 2023 -0700
Add elpa@2021.05.001 % gcc@10.2.0 ^openmpi@4.1.3 + ^intel-mpi@2019.10.317
[mkandes@login01 packages]$ git push
Username for 'https://github.com': mkandes
Password for 'https://mkandes@github.com':
Enumerating objects: 40, done.
Counting objects: 100% (40/40), done.
Delta compression using up to 64 threads
Compressing objects: 100% (12/12), done.
Writing objects: 100% (16/16), 12.87 KiB | 6.44 MiB/s, done.
Total 16 (delta 7), reused 0 (delta 0), pack-reused 0
remote: Resolving deltas: 100% (7/7), completed with 6 local objects.
To https://github.com/mkandes/spack.git
76f1264834..75dc94e08f sdsc-0.17.3 -> sdsc-0.17.3
[mkandes@login01 packages]$
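The resync sequence above (fetch the shared upstream, fast-forward the local deployment branch, push the fork) can be condensed into one helper. This is a sketch under the remote layout shown in the session (`origin` = personal fork, `upstream` = sdsc/spack); the `--ff-only` flag is an addition of mine, not something the session used — it makes the merge fail instead of creating a merge commit if the histories have diverged, whereas the session's plain `git merge` simply happened to fast-forward.

```shell
# Sketch of the fork-resync flow above. Assumes remotes named "origin"
# (personal fork) and "upstream" (sdsc/spack), as in the session.
resync_fork_branch() {
  branch="$1"                               # e.g. sdsc-0.17.3
  git fetch --tags upstream &&              # pull new upstream commits/tags
  git checkout "$branch" &&
  git merge --ff-only "upstream/$branch" && # fail loudly if diverged
  git push origin "$branch"                 # bring the fork back in sync
}
```

Usage: `resync_fork_branch sdsc-0.17.3`.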
Stage changes for commit.
[mkandes@login01 packages]$ cp ~/.spack/0.17.3/gpu/b/
config.yaml opt/ share/ var/
etc/ repos.yaml upstreams.yaml
[mkandes@login01 packages]$ cp ~/.spack/0.17.3/gpu/b/var/spack/
cache/ repos/ stage/
[mkandes@login01 packages]$ cp -rp ~/.spack/0.17.3/gpu/b/var/spack/repos/mkandes/packages/gromacs/ ./
[mkandes@login01 packages]$ git add gromacs/
[mkandes@login01 packages]$ cd ~/software/spack/repos/mkandes/etc/spack/
defaults/ licenses/ sdsc/
[mkandes@login01 packages]$ cd ~/software/spack/repos/mkandes/etc/spack/sdsc/expanse/0.17.3/gpu/b/
[mkandes@login01 b]$ cd specs/
[mkandes@login01 specs]$ cd gcc@10.2.0/openmpi@4.1.3/
[mkandes@login01 openmpi@4.1.3]$ pwd
/home/mkandes/software/spack/repos/mkandes/etc/spack/sdsc/expanse/0.17.3/gpu/b/specs/gcc@10.2.0/openmpi@4.1.3
[mkandes@login01 openmpi@4.1.3]$ cp -rp ~/.spack/0.17.3/gpu/b/etc/spack/sdsc/expanse/0.17.3/gpu/b/specs/
cp: missing destination file operand after '/home/mkandes/.spack/0.17.3/gpu/b/etc/spack/sdsc/expanse/0.17.3/gpu/b/specs/'
Try 'cp --help' for more information.
[mkandes@login01 openmpi@4.1.3]$ cp -rp ~/.spack/0.17.3/gpu/b/etc/spack/sdsc/expanse/0.17.3/gpu/b/specs/gromacs@2022.6.sh ./
[mkandes@login01 openmpi@4.1.3]$ vi gromacs@2022.6.sh
[mkandes@login01 openmpi@4.1.3]$ vi gromacs@2020.4.sh
[mkandes@login01 openmpi@4.1.3]$ vi gromacs@2022.6.sh
[mkandes@login01 openmpi@4.1.3]$ git add gromacs@2022.6.sh
[mkandes@login01 openmpi@4.1.3]$ git add gromacs@2020.4.sh
[mkandes@login01 openmpi@4.1.3]$ cd ../../../../
[mkandes@login01 gpu]$ ls
b
[mkandes@login01 gpu]$ cd ../
[mkandes@login01 0.17.3]$ cd ../
[mkandes@login01 expanse]$ ls
0.17.3
[mkandes@login01 expanse]$ cd 0.17.3/gpu/b/
[mkandes@login01 b]$ ls
specs SPECS.md yamls
[mkandes@login01 b]$ git status
On branch sdsc-0.17.3
Your branch is up to date with 'origin/sdsc-0.17.3'.
Changes to be committed:
(use "git restore --staged <file>..." to unstage)
modified: specs/gcc@10.2.0/openmpi@4.1.3/gromacs@2020.4.sh
new file: specs/gcc@10.2.0/openmpi@4.1.3/gromacs@2022.6.sh
new file: ../../../../../../../var/spack/repos/sdsc/packages/gromacs/gmxDetectCpu-cmake-3.14.patch
new file: ../../../../../../../var/spack/repos/sdsc/packages/gromacs/gmxDetectSimd-cmake-3.14.patch
new file: ../../../../../../../var/spack/repos/sdsc/packages/gromacs/package.py
[mkandes@login01 b]$
I again forgot to first create a pkg/spec branch off of the main deployment branch.
[mkandes@login01 b]$ git commit
[sdsc-0.17.3 c28dc2338b] Add gromacs@2022.6 % gcc@10.2.0 ^openmpi@4.1.3 to expanse/0.17.3/gpu/b
5 files changed, 535 insertions(+)
create mode 100644 etc/spack/sdsc/expanse/0.17.3/gpu/b/specs/gcc@10.2.0/openmpi@4.1.3/gromacs@2022.6.sh
create mode 100644 var/spack/repos/sdsc/packages/gromacs/gmxDetectCpu-cmake-3.14.patch
create mode 100644 var/spack/repos/sdsc/packages/gromacs/gmxDetectSimd-cmake-3.14.patch
create mode 100644 var/spack/repos/sdsc/packages/gromacs/package.py
[mkandes@login01 b]$ git log
commit c28dc2338b80582c191157e7b310f33abe135072 (HEAD -> sdsc-0.17.3)
Author: Marty Kandes <mkandes@sdsc.edu>
Date: Thu Sep 7 12:02:21 2023 -0700
Add gromacs@2022.6 % gcc@10.2.0 ^openmpi@4.1.3 to expanse/0.17.3/gpu/b
A user requested an updated version of GROMACS for their work [1][2][3].
We decided it may be useful to deploy one for all users on the system in
the expanse/0.17.3/gpu/b production instance. At the time of this commit,
GROMACS v2022.6 was the latest release from the 2022.x version series,
which was released on Tuesday, July 11th, 2023 [4].
The build was tested successfully via my shared Spack instance
configuration for expanse/0.17.3/gpu/b. And the standard water benchmark
performance was observed within expected ns/day ranges. We will likely
deploy the same version in expanse/0.17.3/cpu/b as well to mirror the
instance configurations and options as closely as possible.
[1] https://github.com/sdsc/spack/issues/100
[2] https://sdsc.zendesk.com/agent/tickets/29944
[3] https://sdsc.zendesk.com/agent/tickets/26289
[4] https://manual.gromacs.org
commit 75dc94e08f95d22e3a9414410de016f2dc6295d9 (upstream/sdsc-0.17.3, origin/sdsc-0.17.3, origin/HEAD)
Author: Marty Kandes <mkandes@sdsc.edu>
Date: Wed Aug 30 12:24:16 2023 -0700
Deploy elpa@2021.05.001 % gcc@10.2.0 ^openmpi@4.1.3 + ^intel-mpi@2019.10.317
commit cc47ce9571fa1b5dabc73765bd98a41678deedc4
Merge: 494c75a062 76f1264834
Author: Marty Kandes <mkandes@sdsc.edu>
Date: Wed Aug 30 12:04:24 2023 -0700
Merge pull request #99 from mkandes/sdsc-0.17.3
Add elpa@2021.05.001 % gcc@10.2.0 ^openmpi@4.1.3 + ^intel-mpi@2019.10.317
commit 76f12648347505b4768cffae4ae951fbd36b7b75
Author: Marty Kandes <mkandes@sdsc.edu>
Date: Wed Aug 30 12:00:01 2023 -0700
Add elpa@2021.05.001 % gcc@10.2.0 ^openmpi@4.1.3 + ^intel-mpi@2019.10.317
commit 494c75a06253920a9e9c61f2bfc5f5ee6f0a1fae
Author: Marty Kandes <mkandes@sdsc.edu>
Date: Fri Aug 25 14:36:18 2023 -0700
[mkandes@login01 b]$ git branch
* sdsc-0.17.3
sdsc-0.17.3-gh-33-spec-abinit
sdsc-0.17.3-gh-46-pkg-spec-wannier90
sdsc-0.17.3-gh-86-pkg-spec-pyscf
[mkandes@login01 b]$
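The pkg/spec topic branch I noted forgetting above would follow the naming convention already visible in the `git branch` output. A hypothetical helper sketching that step (the helper name is illustrative, and the gh-100 issue number is taken from the commit message's [1] reference):

```shell
# Illustrative helper: cut a pkg/spec topic branch off the sdsc-0.17.3
# deployment branch before committing, following the existing
# sdsc-0.17.3-gh-NN-pkg-spec-<name> convention listed by "git branch".
new_pkg_spec_branch() {
  pkg="$1"; issue="$2"
  git checkout -b "sdsc-0.17.3-gh-${issue}-pkg-spec-${pkg}" sdsc-0.17.3
}
```

For example, `new_pkg_spec_branch gromacs 100` would have created sdsc-0.17.3-gh-100-pkg-spec-gromacs.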
Pull request created and merged into the sdsc/spack upstream's sdsc-0.17.3 deployment branch: https://github.com/sdsc/spack/pull/101
Resynced the expanse/0.17.3/gpu/b production instance with the sdsc/spack sdsc-0.17.3 deployment branch.
[spack_gpu@login02 ~]$ srun --partition=ind-gpu-shared --reservation=root_73 --account=use300 --nodes=1 --ntasks-per-node=1 --cpus-per-task=10 --mem=93G --gpus=1 --time=12:00:00 --pty --wait=0 /bin/bash
[spack_gpu@exp-15-57 ~]$ cd /cm/shared/apps/spack/0.17.3/gpu/b/
[spack_gpu@exp-15-57 b]$ git log
commit 67e74f0e081b45fa82d7ff097a03e4cd5b3757ea (HEAD -> sdsc-0.17.3, origin/sdsc-0.17.3, origin/HEAD)
Author: Marty Kandes <mkandes@sdsc.edu>
Date: Fri Jul 14 18:23:11 2023 -0700
Update modules.yaml for expanse/0.17.3/cpu/b to whitelist hadoop
commit bd28762c3a74ea7b511b6c4abbedb3ab50b6ec48
Author: Marty Kandes <mkandes@sdsc.edu>
Date: Tue Jul 11 16:56:09 2023 -0700
Deploy spark@3.4.0 % gcc@10.2.0 into prod within expanse/0.17.3/cpu/b
commit 553657e0d4e92fb543587583665818cef36d4b94
Merge: e9bb2189ca 7e3d045306
Author: Marty Kandes <mkandes@sdsc.edu>
Date: Mon Jul 10 17:15:28 2023 -0700
Merge pull request #90 from mkandes/sdsc-0.17.3
Add custom spark package to sdsc package repo in sdsc-0.17.3
commit 7e3d0453067e6d31c67922c47683148a9c5192e6
Author: Marty Kandes <mkandes@sdsc.edu>
Date: Mon Jul 10 17:11:31 2023 -0700
Add custom spark package to sdsc package repo in sdsc-0.17.3
The custom change here is simply to include the latest version of
Spark at the time of this writing, which is Spark v3.4.0, and the sha256
hash associated with its downloadable tarball from the Spark project.
commit e9bb2189ca01894c673113228dd061cc046bc948
Merge: c33518253b 90f124605b
Author: Marty Kandes <mkandes@sdsc.edu>
Date: Mon Jul 10 16:26:01 2023 -0700
Merge pull request #89 from mkandes/sdsc-0.17.3
Add spark@3.4.0 % gcc@10.2.0 to expanse/0.17.3/cpu/b
commit 90f124605b39fb79339dd2efe492ec1437b20b79
Author: Marty Kandes <mkandes@sdsc.edu>
Date: Mon Jul 10 16:18:34 2023 -0700
Add spark@3.4.0 % gcc@10.2.0 to expanse/0.17.3/cpu/b
commit c33518253b5cd6096e34c63c6fea4a7cd6618c7b
[spack_gpu@exp-15-57 b]$ git status
On branch sdsc-0.17.3
Your branch is up to date with 'origin/sdsc-0.17.3'.
Changes to be committed:
(use "git restore --staged <file>..." to unstage)
deleted: var/spack/repos/sdsc/packages/amber/aarch64.patch
deleted: var/spack/repos/sdsc/packages/amber/nvhpc-boost.patch
deleted: var/spack/repos/sdsc/packages/amber/nvhpc.patch
deleted: var/spack/repos/sdsc/packages/amber/package.py
deleted: var/spack/repos/sdsc/packages/amber/ppc64le.patch
Changes not staged for commit:
(use "git add/rm <file>..." to update what will be committed)
(use "git restore <file>..." to discard changes in working directory)
modified: etc/spack/sdsc/expanse/0.17.3/gpu/b/specs/gcc@10.2.0/cuda@10.2.89.sh
deleted: etc/spack/sdsc/expanse/0.17.3/gpu/b/specs/gcc@10.2.0/openmpi@4.1.3/amber@22.sh
Untracked files:
(use "git add <file>..." to include in what will be committed)
etc/spack/compilers.yaml
etc/spack/licenses/intel/
etc/spack/modules.yaml
etc/spack/packages.yaml
etc/spack/sdsc/expanse/0.17.3/gpu/b/specs/gcc@10.2.0/cuda@10.2.89.o24865655.exp-15-57
etc/spack/sdsc/expanse/0.17.3/gpu/b/specs/gcc@10.2.0/intel-mpi@2019.10.317/amber@22.sh
var/spack/repos/sdsc/packages/amber/
[spack_gpu@exp-15-57 b]$ git stash
Saved working directory and index state WIP on sdsc-0.17.3: 67e74f0e08 Update modules.yaml for expanse/0.17.3/cpu/b to whitelist hadoop
[spack_gpu@exp-15-57 b]$ git pull
remote: Enumerating objects: 176, done.
remote: Counting objects: 100% (90/90), done.
remote: Compressing objects: 100% (9/9), done.
remote: Total 176 (delta 76), reused 89 (delta 76), pack-reused 86
Receiving objects: 100% (176/176), 51.21 KiB | 440.00 KiB/s, done.
Resolving deltas: 100% (76/76), completed with 15 local objects.
From https://github.com/sdsc/spack
67e74f0e08..eacf6bf7ce sdsc-0.17.3 -> origin/sdsc-0.17.3
Updating 67e74f0e08..eacf6bf7ce
Fast-forward
etc/spack/sdsc/expanse/0.17.3/cpu/b/specs/aocc@3.2.0/openmpi@4.1.3/lammps@20210310.sh | 4 +-
etc/spack/sdsc/expanse/0.17.3/cpu/b/specs/aocc@3.2.0/openmpi@4.1.3/nwchem@7.0.2.o24830973.exp-15-56 | 782 +++++++++++++++++++++++++++++++++++++++++++++++++++++
etc/spack/sdsc/expanse/0.17.3/cpu/b/specs/aocc@3.2.0/openmpi@4.1.3/nwchem@7.0.2.sh | 8 +-
.../expanse/0.17.3/cpu/b/specs/gcc@10.2.0/intel-mpi@2019.10.317/elpa@2021.05.001.o24925383.exp-15-56 | 704 +++++++++++++++++++++++++++++++++++++++++++++++
etc/spack/sdsc/expanse/0.17.3/cpu/b/specs/gcc@10.2.0/intel-mpi@2019.10.317/elpa@2021.05.001.sh | 76 ++++++
etc/spack/sdsc/expanse/0.17.3/cpu/b/specs/gcc@10.2.0/openmpi@4.1.3/abinit@9.4.2.o24818063.exp-15-56 | 761 +++++++++++++++++++++++++++++++++++++++++++++++++++
etc/spack/sdsc/expanse/0.17.3/cpu/b/specs/gcc@10.2.0/openmpi@4.1.3/abinit@9.4.2.sh | 76 ++++++
etc/spack/sdsc/expanse/0.17.3/cpu/b/specs/gcc@10.2.0/openmpi@4.1.3/elpa@2021.05.001.o24925338.exp-15-56 | 740 ++++++++++++++++++++++++++++++++++++++++++++++++++
etc/spack/sdsc/expanse/0.17.3/cpu/b/specs/gcc@10.2.0/openmpi@4.1.3/elpa@2021.05.001.sh | 14 +-
etc/spack/sdsc/expanse/0.17.3/cpu/b/specs/gcc@10.2.0/openmpi@4.1.3/wannier90@3.1.0.o24815821.exp-15-56 | 690 ++++++++++++++++++++++++++++++++++++++++++++++
etc/spack/sdsc/expanse/0.17.3/cpu/b/specs/gcc@10.2.0/openmpi@4.1.3/wannier90@3.1.0.sh | 76 ++++++
etc/spack/sdsc/expanse/0.17.3/cpu/b/specs/gcc@10.2.0/py-pyscf@2.2.0.o24825642.exp-15-56 | 750 ++++++++++++++++++++++++++++++++++++++++++++++++++
etc/spack/sdsc/expanse/0.17.3/cpu/b/specs/gcc@10.2.0/py-pyscf@2.2.0.sh | 76 ++++++
etc/spack/sdsc/expanse/0.17.3/gpu/b/specs/gcc@10.2.0/openmpi@4.1.3/gromacs@2020.4.sh | 1 +
etc/spack/sdsc/expanse/0.17.3/gpu/b/specs/gcc@10.2.0/openmpi@4.1.3/gromacs@2022.6.sh | 81 ++++++
var/spack/repos/sdsc/packages/gromacs/gmxDetectCpu-cmake-3.14.patch | 12 +
var/spack/repos/sdsc/packages/gromacs/gmxDetectSimd-cmake-3.14.patch | 11 +
var/spack/repos/sdsc/packages/gromacs/package.py | 430 +++++++++++++++++++++++++++++
var/spack/repos/sdsc/packages/py-pyscf/package.py | 58 ++++
var/spack/repos/sdsc/packages/wannier90/make.sys | 7 +
var/spack/repos/sdsc/packages/wannier90/package.py | 199 ++++++++++++++
21 files changed, 5544 insertions(+), 12 deletions(-)
create mode 100644 etc/spack/sdsc/expanse/0.17.3/cpu/b/specs/aocc@3.2.0/openmpi@4.1.3/nwchem@7.0.2.o24830973.exp-15-56
create mode 100644 etc/spack/sdsc/expanse/0.17.3/cpu/b/specs/gcc@10.2.0/intel-mpi@2019.10.317/elpa@2021.05.001.o24925383.exp-15-56
create mode 100644 etc/spack/sdsc/expanse/0.17.3/cpu/b/specs/gcc@10.2.0/intel-mpi@2019.10.317/elpa@2021.05.001.sh
create mode 100644 etc/spack/sdsc/expanse/0.17.3/cpu/b/specs/gcc@10.2.0/openmpi@4.1.3/abinit@9.4.2.o24818063.exp-15-56
create mode 100644 etc/spack/sdsc/expanse/0.17.3/cpu/b/specs/gcc@10.2.0/openmpi@4.1.3/abinit@9.4.2.sh
create mode 100644 etc/spack/sdsc/expanse/0.17.3/cpu/b/specs/gcc@10.2.0/openmpi@4.1.3/elpa@2021.05.001.o24925338.exp-15-56
create mode 100644 etc/spack/sdsc/expanse/0.17.3/cpu/b/specs/gcc@10.2.0/openmpi@4.1.3/wannier90@3.1.0.o24815821.exp-15-56
create mode 100644 etc/spack/sdsc/expanse/0.17.3/cpu/b/specs/gcc@10.2.0/openmpi@4.1.3/wannier90@3.1.0.sh
create mode 100644 etc/spack/sdsc/expanse/0.17.3/cpu/b/specs/gcc@10.2.0/py-pyscf@2.2.0.o24825642.exp-15-56
create mode 100644 etc/spack/sdsc/expanse/0.17.3/cpu/b/specs/gcc@10.2.0/py-pyscf@2.2.0.sh
create mode 100644 etc/spack/sdsc/expanse/0.17.3/gpu/b/specs/gcc@10.2.0/openmpi@4.1.3/gromacs@2022.6.sh
create mode 100644 var/spack/repos/sdsc/packages/gromacs/gmxDetectCpu-cmake-3.14.patch
create mode 100644 var/spack/repos/sdsc/packages/gromacs/gmxDetectSimd-cmake-3.14.patch
create mode 100644 var/spack/repos/sdsc/packages/gromacs/package.py
create mode 100644 var/spack/repos/sdsc/packages/py-pyscf/package.py
create mode 100644 var/spack/repos/sdsc/packages/wannier90/make.sys
create mode 100644 var/spack/repos/sdsc/packages/wannier90/package.py
[spack_gpu@exp-15-57 b]$ git stash pop
On branch sdsc-0.17.3
Your branch is up to date with 'origin/sdsc-0.17.3'.
Changes not staged for commit:
(use "git add/rm <file>..." to update what will be committed)
(use "git restore <file>..." to discard changes in working directory)
modified: etc/spack/sdsc/expanse/0.17.3/gpu/b/specs/gcc@10.2.0/cuda@10.2.89.sh
deleted: etc/spack/sdsc/expanse/0.17.3/gpu/b/specs/gcc@10.2.0/openmpi@4.1.3/amber@22.sh
modified: var/spack/repos/sdsc/packages/amber/package.py
Untracked files:
(use "git add <file>..." to include in what will be committed)
etc/spack/compilers.yaml
etc/spack/licenses/intel/
etc/spack/modules.yaml
etc/spack/packages.yaml
etc/spack/sdsc/expanse/0.17.3/gpu/b/specs/gcc@10.2.0/cuda@10.2.89.o24865655.exp-15-57
etc/spack/sdsc/expanse/0.17.3/gpu/b/specs/gcc@10.2.0/intel-mpi@2019.10.317/amber@22.sh
no changes added to commit (use "git add" and/or "git commit -a")
Dropped refs/stash@{0} (2090ba0bdaf3abd62831a102b15837a1be8e46bb)
[spack_gpu@exp-15-57 b]$ git log
commit eacf6bf7ce9e6d34b214f8b91bcee94e2b53d784 (HEAD -> sdsc-0.17.3, origin/sdsc-0.17.3, origin/HEAD)
Merge: 75dc94e08f c28dc2338b
Author: Marty Kandes <mkandes@sdsc.edu>
Date: Thu Sep 7 12:15:17 2023 -0700
Merge pull request #101 from mkandes/sdsc-0.17.3
Add gromacs@2022.6 % gcc@10.2.0 ^openmpi@4.1.3 to expanse/0.17.3/gpu/b
commit c28dc2338b80582c191157e7b310f33abe135072
Author: Marty Kandes <mkandes@sdsc.edu>
Date: Thu Sep 7 12:02:21 2023 -0700
Add gromacs@2022.6 % gcc@10.2.0 ^openmpi@4.1.3 to expanse/0.17.3/gpu/b
A user requested an updated version of GROMACS for their work [1][2][3].
We decided it may be useful to deploy one for all users on the system in
the expanse/0.17.3/gpu/b production instance. At the time of this commit,
GROMACS v2022.6 was the latest release from the 2022.x version series,
which was released on Tuesday, July 11th, 2023 [4].
The build was tested successfully via my shared Spack instance
configuration for expanse/0.17.3/gpu/b. And the standard water benchmark
performance was observed within expected ns/day ranges. We will likely
deploy the same version in expanse/0.17.3/cpu/b as well to mirror the
instance configurations and options as closely as possible.
[1] https://github.com/sdsc/spack/issues/100
[2] https://sdsc.zendesk.com/agent/tickets/29944
[3] https://sdsc.zendesk.com/agent/tickets/26289
[4] https://manual.gromacs.org
commit 75dc94e08f95d22e3a9414410de016f2dc6295d9
Author: Marty Kandes <mkandes@sdsc.edu>
Date: Wed Aug 30 12:24:16 2023 -0700
Deploy elpa@2021.05.001 % gcc@10.2.0 ^openmpi@4.1.3 + ^intel-mpi@2019.10.317
commit cc47ce9571fa1b5dabc73765bd98a41678deedc4
Merge: 494c75a062 76f1264834
Author: Marty Kandes <mkandes@sdsc.edu>
Date: Wed Aug 30 12:04:24 2023 -0700
Merge pull request #99 from mkandes/sdsc-0.17.3
Add elpa@2021.05.001 % gcc@10.2.0 ^openmpi@4.1.3 + ^intel-mpi@2019.10.317
[spack_gpu@exp-15-57 b]$
Running spec build for gromacs@2022.6.
[spack_gpu@exp-15-57 b]$ cd etc/spack/sdsc/expanse/0.17.3/gpu/b/specs/gcc@10.2.0/openmpi@4.1.3/
[spack_gpu@exp-15-57 openmpi@4.1.3]$ sbatch gromacs@2022.6.sh
Submitted batch job 25094304
[spack_gpu@exp-15-57 openmpi@4.1.3]$ squeue -u $USER
JOBID PARTITION NAME USER ST TIME NODES NODELIST(REASON)
25094215 ind-gpu-s bash spack_gp R 8:52 1 exp-15-57
25094304 ind-gpu-s gromacs@ spack_gp R 0:03 1 exp-15-57
[spack_gpu@exp-15-57 openmpi@4.1.3]$
Build completed successfully. Asked the user to also test the build for their work.
[mkandes@login01 ~]$ module load gpu/0.17.3b
[mkandes@login01 ~]$ module load gcc/10.2.0
[mkandes@login01 ~]$ module load openmpi/4.1.3
[mkandes@login01 ~]$ module load gromacs/2022.6
[mkandes@login01 ~]$ module list
Currently Loaded Modules:
1) shared 7) cuda/11.2.2/blza2ps
2) slurm/expanse/21.08.8 8) ucx/1.10.1/msro2p7
3) sdsc/1.0 9) openmpi/4.1.3/gzzscfu
4) DefaultModules 10) fftw/3.3.10/7ahyh5v
5) gpu/0.17.3b (g) 11) openblas/0.3.18/lsmegf6
6) gcc/10.2.0/i62tgso 12) gromacs/2022.6/rspmhnj-omp
Where:
g: built natively for Intel Skylake
[mkandes@login01 ~]$
Deployment complete. https://github.com/sdsc/spack/commit/33591e7f421c4f95ee8ba4e50e8ab6dcd5beb835
Waiting for feedback from the user who requested the update.
The user confirmed gromacs@2022.6 will work for them. As such, we will go ahead and deploy the same version to expanse/0.17.3/cpu/b. To begin this deployment, we resync the expanse/0.17.3/cpu/b instance with the sdsc/spack repo's sdsc-0.17.3 deployment branch, which has undergone significant changes since the tscc/0.17.3/cpu and tscc/0.17.3/gpu instances from TSCC2 were merged in.
[spack_cpu@login02 ~]$ !6235
srun --partition=ind-shared --reservation=root_73 --account=use300 --nodes=1 --nodelist=exp-15-56 --ntasks-per-node=1 --cpus-per-task=16 --mem=32G --time=12:00:00 --pty --wait=0 /bin/bash
[spack_cpu@exp-15-56 ~]$ cd /cm/shared/apps/spack/0.17.3/cpu/b/
[spack_cpu@exp-15-56 b]$ ls
bin DEPLOYMENT.md LICENSE-MIT pytest.ini var
CHANGELOG.md etc NOTICE README.md
CONTRIBUTING.md lib opt SECURITY.md
COPYRIGHT LICENSE-APACHE pyproject.toml share
[spack_cpu@exp-15-56 b]$ git log
commit 75dc94e08f95d22e3a9414410de016f2dc6295d9 (HEAD -> sdsc-0.17.3, origin/sdsc-0.17.3, origin/HEAD)
Author: Marty Kandes <mkandes@sdsc.edu>
Date: Wed Aug 30 12:24:16 2023 -0700
Deploy elpa@2021.05.001 % gcc@10.2.0 ^openmpi@4.1.3 + ^intel-mpi@2019.10.317
commit cc47ce9571fa1b5dabc73765bd98a41678deedc4
Merge: 494c75a062 76f1264834
Author: Marty Kandes <mkandes@sdsc.edu>
Date: Wed Aug 30 12:04:24 2023 -0700
Merge pull request #99 from mkandes/sdsc-0.17.3
Add elpa@2021.05.001 % gcc@10.2.0 ^openmpi@4.1.3 + ^intel-mpi@2019.10.317
commit 76f12648347505b4768cffae4ae951fbd36b7b75
Author: Marty Kandes <mkandes@sdsc.edu>
Date: Wed Aug 30 12:00:01 2023 -0700
Add elpa@2021.05.001 % gcc@10.2.0 ^openmpi@4.1.3 + ^intel-mpi@2019.10.317
commit 494c75a06253920a9e9c61f2bfc5f5ee6f0a1fae
[spack_cpu@exp-15-56 b]$ git status
On branch sdsc-0.17.3
Your branch is up to date with 'origin/sdsc-0.17.3'.
Untracked files:
(use "git add <file>..." to include in what will be committed)
etc/spack/compilers.yaml
etc/spack/licenses/aocc/
etc/spack/licenses/intel/
etc/spack/modules.yaml
etc/spack/packages.yaml
nothing added to commit but untracked files present (use "git add" to track)
[spack_cpu@exp-15-56 b]$ git stash
No local changes to save
[spack_cpu@exp-15-56 b]$ git pull
remote: Enumerating objects: 58666, done.
remote: Counting objects: 100% (15222/15222), done.
remote: Compressing objects: 100% (36/36), done.
remote: Total 58666 (delta 15181), reused 15218 (delta 15180), pack-reused 43444
Receiving objects: 100% (58666/58666), 74.73 MiB | 16.57 MiB/s, done.
Resolving deltas: 100% (24343/24343), completed with 2121 local objects.
From https://github.com/sdsc/spack
+ 75dc94e08f...c1ff2f6acc sdsc-0.17.3 -> origin/sdsc-0.17.3 (forced update)
18c21d0c32..eee8fdc438 develop -> origin/develop
* [new tag] 2021.08.19 -> 2021.08.19
* [new tag] llnl-tce-craype -> llnl-tce-craype
* [new tag] develop-2023-05-14 -> develop-2023-05-14
* [new tag] develop-2023-05-18 -> develop-2023-05-18
* [new tag] develop-2023-05-21 -> develop-2023-05-21
* [new tag] develop-2023-06-04 -> develop-2023-06-04
* [new tag] develop-2023-06-11 -> develop-2023-06-11
* [new tag] develop-2023-06-25 -> develop-2023-06-25
* [new tag] develop-2023-07-02 -> develop-2023-07-02
* [new tag] develop-2023-07-09 -> develop-2023-07-09
* [new tag] develop-2023-07-23 -> develop-2023-07-23
* [new tag] develop-2023-07-30 -> develop-2023-07-30
* [new tag] develop-2023-08-06 -> develop-2023-08-06
* [new tag] develop-2023-08-13 -> develop-2023-08-13
* [new tag] develop-2023-08-20 -> develop-2023-08-20
* [new tag] develop-2023-08-27 -> develop-2023-08-27
* [new tag] develop-2023-09-03 -> develop-2023-09-03
* [new tag] develop-2023-09-10 -> develop-2023-09-10
hint: You have divergent branches and need to specify how to reconcile them.
hint: You can do so by running one of the following commands sometime before
hint: your next pull:
hint:
hint: git config pull.rebase false # merge
hint: git config pull.rebase true # rebase
hint: git config pull.ff only # fast-forward only
hint:
hint: You can replace "git config" with "git config --global" to set a default
hint: preference for all repositories. You can also pass --rebase, --no-rebase,
hint: or --ff-only on the command line to override the configured default per
hint: invocation.
fatal: Need to specify how to reconcile divergent branches.
[spack_cpu@exp-15-56 b]$ git config pull.rebase true
[spack_cpu@exp-15-56 b]$ git pull
Successfully rebased and updated refs/heads/sdsc-0.17.3.
[spack_cpu@exp-15-56 b]$ git stash pop
No stash entries found.
[spack_cpu@exp-15-56 b]$ git log
commit c1ff2f6acc2feb86ca332059dede7ef723f0a7d9 (HEAD -> sdsc-0.17.3, origin/sdsc-0.17.3, origin/HEAD)
Author: Marty Kandes <mkandes@sdsc.edu>
Date: Mon Sep 25 13:29:18 2023 -0700
Add lammps@20210310 % gcc@8.4.0 ^openmp@4.1.3 ^cuda@10.2.89 to exp/0.17.3/gpu/b
We'd previously come across runtime problems [1] after building this
version of LAMMPS against the gcc@10.2.0 ^openmpi@4.1.3 ^cuda@11.2.2
package dependency chain available in expanse/0.17.3/gpu/b. However,
after moving to the new gcc@8.4.0 ^openmpi@4.1.3 ^cuda@10.2.89 package
dependency chain deployed for amber@22, this has resolved the runtime
problems. The standard LJ benchmarks are now running successfully on
both 1 and 4 NVIDIA V100 GPUs.
[1]
Setting up Verlet run ...
Unit style : lj
Current step : 10
Time step : 0.005
[spack_cpu@exp-15-56 b]$
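For reference, the one-time per-repository setting used above to get past the divergent-branch failure, wrapped as a sketch. The commented `pull.ff only` alternative is an assumption on my part, not something the session used — it refuses non-fast-forward pulls outright, which can be safer for a deployment clone that should never carry local commits.

```shell
# Sketch: per-repository pull policy; run inside the clone
# (e.g. /cm/shared/apps/spack/0.17.3/cpu/b).
set_pull_policy() {
  git config pull.rebase true   # what the session did: rebase on pull
  # git config pull.ff only     # assumed alternative: fast-forward or fail
}
```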
gromacs@2022.6 appears to have been built and installed within expanse/0.17.3/cpu/b successfully. However, there was an indexing error when rebuilding the module environment. And, perhaps more importantly, mpich was installed too?
[+] /cm/shared/apps/spack/0.17.3/cpu/b/opt/spack/linux-rocky8-zen2/gcc-10.2.0/hwloc-2.6.0-7rqkdv4vgf63waqaftjer77mqpbwrrok
==> Installing mpich-3.4.2-casioch3cmacbuoabpjerqc42gtrjogr
==> No binary for mpich-3.4.2-casioch3cmacbuoabpjerqc42gtrjogr found: installing from source
==> Fetching https://mirror.spack.io/_source-cache/archive/5c/5c19bea8b84e8d74cca5f047e82b147ff3fba096144270e3911ad623d6c587bf.tar.gz
==> No patches needed for mpich
==> mpich: Executing phase: 'autoreconf'
==> mpich: Executing phase: 'configure'
==> mpich: Executing phase: 'build'
==> mpich: Executing phase: 'install'
==> mpich: Successfully installed mpich-3.4.2-casioch3cmacbuoabpjerqc42gtrjogr
Fetch: 0.30s. Build: 6m 36.82s. Total: 6m 37.12s.
[+] /cm/shared/apps/spack/0.17.3/cpu/b/opt/spack/linux-rocky8-zen2/gcc-10.2.0/mpich-3.4.2-casioch3cmacbuoabpjerqc42gtrjogr
==> Installing gromacs-2022.6-udrmhrs24qj7vofczpvyaoitkcrzvjst
==> No binary for gromacs-2022.6-udrmhrs24qj7vofczpvyaoitkcrzvjst found: installing from source
==> Fetching https://ftp.gromacs.org/gromacs/gromacs-2022.6.tar.gz
==> Ran patch() for gromacs
==> gromacs: Executing phase: 'cmake'
==> gromacs: Executing phase: 'build'
==> gromacs: Executing phase: 'install'
==> gromacs: Successfully installed gromacs-2022.6-udrmhrs24qj7vofczpvyaoitkcrzvjst
Fetch: 6.89s. Build: 6m 41.13s. Total: 6m 48.02s.
[+] /cm/shared/apps/spack/0.17.3/cpu/b/opt/spack/linux-rocky8-zen2/gcc-10.2.0/gromacs-2022.6-udrmhrs24qj7vofczpvyaoitkcrzvjst
real 820.00
user 2844.46
sys 6374.86
==> Warning: Could not write module file [/cm/shared/apps/spack/0.17.3/cpu/b/share/spack/lmod/linux-rocky8-x86_64/openmpi/4.1.3-oq3qvsv/gcc/10.2.0/amber/22/c6gwmih-omp.lua]
==> Warning: --> list index out of range <--
==> Regenerating lmod module files
==> OpenFOAM bashrc env: /cm/shared/apps/spack/0.17.3/cpu/b/opt/spack/linux-rocky8-zen2/aocc-3.2.0/openfoam-2106-jz42us227mirxrhqjvojlaut2giuh74j/etc/bashrc
[spack_cpu@exp-15-56 openmpi@4.1.3]$ spack find -lvd gromacs@2022.6
==> 1 installed package
-- linux-rocky8-zen2 / gcc@10.2.0 -------------------------------
udrmhrs gromacs@2022.6+blas~cuda~cycle_subcounters~double+hwloc~ipo+lapack~mdrun_only+mpi~nosuffix~opencl+openmp~plumed~relaxed_double_precision+shared~sycl build_type=RelWithDebInfo
qogw3ss fftw@3.3.10~mpi~openmp~pfft_patches precision=double,float
7rqkdv4 hwloc@2.6.0~cairo~cuda~gl~libudev+libxml2~netloc~nvml~opencl+pci~rocm+shared
ykynzrw libpciaccess@0.16
mgovjpj libxml2@2.9.12~python
zduoj2d libiconv@1.16 libs=shared,static
paz7hxz xz@5.2.5~pic libs=shared,static
ws4iari zlib@1.2.11+optimize+pic+shared
5lhvslt ncurses@6.2~symlinks+termlib abi=none
casioch mpich@3.4.2~argobots+fortran+hwloc+hydra+libxml2+pci+romio~slurm~verbs+wrapperrpath device=ch4 netmod=ofi pmi=pmi
62bd6km libfabric@1.13.2~debug~kdreg fabrics=sockets,tcp,udp
fgk2tlu openblas@0.3.18~bignuma~consistent_fpcsr~ilp64+locking+pic+shared threads=none
[spack_cpu@exp-15-56 openmpi@4.1.3]$
[spack_cpu@exp-15-56 openmpi@4.1.3]$ spack uninstall mpich@3.4.2 % gcc@10.2.0
==> Will not uninstall mpich@3.4.2%gcc@10.2.0/casioch
The following packages depend on it:
-- linux-rocky8-zen2 / gcc@10.2.0 -------------------------------
udrmhrs gromacs@2022.6
==> Error: There are still dependents.
use `spack uninstall --dependents` to remove dependents too
[spack_cpu@exp-15-56 openmpi@4.1.3]$ spack uninstall --dependents mpich@3.4.2 % gcc@10.2.0
==> The following packages will be uninstalled:
-- linux-rocky8-zen2 / gcc@10.2.0 -------------------------------
udrmhrs gromacs@2022.6 casioch mpich@3.4.2
==> Do you want to proceed? [y/N] y
==> Successfully uninstalled gromacs@2022.6%gcc@10.2.0+blas~cuda~cycle_subcounters~double+hwloc~ipo+lapack~mdrun_only+mpi~nosuffix~opencl+openmp~plumed~relaxed_double_precision+shared~sycl build_type=RelWithDebInfo arch=linux-rocky8-zen2/udrmhrs
==> Successfully uninstalled mpich@3.4.2%gcc@10.2.0~argobots+fortran+hwloc+hydra+libxml2+pci+romio~slurm~verbs+wrapperrpath device=ch4 netmod=ofi pmi=pmi arch=linux-rocky8-zen2/casioch
[spack_cpu@exp-15-56 openmpi@4.1.3]$
Let's try that again.
[spack_cpu@exp-15-56 openmpi@4.1.3]$ sbatch gromacs@2022.6.sh
Submitted batch job 25387651
[spack_cpu@exp-15-56 openmpi@4.1.3]$ cat gromacs@2022.6.sh
#!/usr/bin/env bash
#SBATCH --job-name=gromacs@2022.6
#SBATCH --account=use300
#SBATCH --reservation=root_73
#SBATCH --partition=ind-shared
#SBATCH --nodes=1
#SBATCH --ntasks-per-node=1
#SBATCH --cpus-per-task=16
#SBATCH --mem=32G
#SBATCH --time=00:30:00
#SBATCH --output=%x.o%j.%N
declare -xr LOCAL_TIME="$(date +'%Y%m%dT%H%M%S%z')"
declare -xir UNIX_TIME="$(date +'%s')"
declare -xr LOCAL_SCRATCH_DIR="/scratch/${USER}/job_${SLURM_JOB_ID}"
declare -xr TMPDIR="${LOCAL_SCRATCH_DIR}"
declare -xr SYSTEM_NAME='expanse'
declare -xr SPACK_VERSION='0.17.3'
declare -xr SPACK_INSTANCE_NAME='cpu'
declare -xr SPACK_INSTANCE_VERSION='b'
declare -xr SPACK_INSTANCE_DIR="/cm/shared/apps/spack/${SPACK_VERSION}/${SPACK_INSTANCE_NAME}/${SPACK_INSTANCE_VERSION}"
declare -xr SLURM_JOB_SCRIPT="$(scontrol show job ${SLURM_JOB_ID} | awk -F= '/Command=/{print $2}')"
declare -xr SLURM_JOB_MD5SUM="$(md5sum ${SLURM_JOB_SCRIPT})"
declare -xr SCHEDULER_MODULE='slurm'
echo "${UNIX_TIME} ${SLURM_JOB_ID} ${SLURM_JOB_MD5SUM} ${SLURM_JOB_DEPENDENCY}"
echo ""
cat "${SLURM_JOB_SCRIPT}"
module purge
module load "${SCHEDULER_MODULE}"
module list
. "${SPACK_INSTANCE_DIR}/share/spack/setup-env.sh"
declare -xr SPACK_PACKAGE='gromacs@2022.6'
declare -xr SPACK_COMPILER='gcc@10.2.0'
declare -xr SPACK_VARIANTS='+blas ~cuda ~cycle_subcounters ~double +hwloc ~ipo +lapack ~mdrun_only +mpi ~nosuffix ~opencl +openmp ~plumed ~relaxed_double_precision +shared ~sycl'
declare -xr SPACK_DEPENDENCIES="^openblas@0.3.18/$(spack find --format '{hash:7}' openblas@0.3.18 % ${SPACK_COMPILER} ~ilp64 threads=none) ^fftw@3.3.10/$(spack find --format '{hash:7}' fftw@3.3.10 % ${SPACK_COMPILER} ~mpi ~openmp) ^openmpi@4.1.3/$(spack find --format '{hash:7}' openmpi@4.1.3 % ${SPACK_COMPILER})"
declare -xr SPACK_SPEC="${SPACK_PACKAGE} % ${SPACK_COMPILER} ${SPACK_VARIANTS} ${SPACK_DEPENDENCIES}"
printenv
spack config get compilers
spack config get config
spack config get mirrors
spack config get modules
spack config get packages
spack config get repos
spack config get upstreams
time -p spack spec --long --namespaces --types "${SPACK_SPEC}"
if [[ "${?}" -ne 0 ]]; then
echo 'ERROR: spack concretization failed.'
exit 1
fi
time -p spack install --jobs "${SLURM_CPUS_PER_TASK}" --fail-fast --yes-to-all "${SPACK_SPEC}"
if [[ "${?}" -ne 0 ]]; then
echo 'ERROR: spack install failed.'
exit 1
fi
spack module lmod refresh --delete-tree -y
#sbatch --dependency="afterok:${SLURM_JOB_ID}" 'lammps@20210310.sh'
sleep 30
[spack_cpu@exp-15-56 openmpi@4.1.3]$
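The `SPACK_DEPENDENCIES` line in the script above pins each dependency to an exact installed instance by appending `/<hash>` to its spec, where the hash comes from `spack find --format '{hash:7}'`. The sketch below isolates that pattern; the `spack_find` stub (a hypothetical stand-in with hard-coded hashes taken from the `spack find -lvd` output above) replaces the real `spack find` call so the sketch runs outside the cluster:

```shell
#!/usr/bin/env bash
# Sketch of the hash-pinning pattern used in gromacs@2022.6.sh.
# spack_find is a stub standing in for:
#   spack find --format '{hash:7}' "<spec>"
spack_find() {
    case "$1" in
        openblas*) echo 'fgk2tlu' ;;
        fftw*)     echo 'qogw3ss' ;;
        openmpi*)  echo 'oq3qvsv' ;;
    esac
}

# Each "^package@version/<hash>" token forces the concretizer to
# reuse that exact installed instance as the dependency.
deps="^openblas@0.3.18/$(spack_find openblas@0.3.18)"
deps+=" ^fftw@3.3.10/$(spack_find fftw@3.3.10)"
deps+=" ^openmpi@4.1.3/$(spack_find openmpi@4.1.3)"

echo "gromacs@2022.6 % gcc@10.2.0 +mpi ${deps}"
```

Pinning by hash this way is what keeps a rebuild from silently pulling in a different MPI (e.g. mpich), since the openmpi instance is named explicitly rather than left to provider selection.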
Fixed the amber@22 module refresh by uninstalling and reinstalling the package. Note, however, that this required reverting to the original version of the custom sdsc package installed in expanse/0.17.3/cpu/b, which differs from the newer version of the package recently used for amber@22 in expanse/0.17.3/gpu/b. These packages and their installation options also need to be reconciled in a more general way with the upstream spack/spack package, which has better handling of the regularly released amber patches.
[spack_cpu@exp-15-56 amber]$ !6521
. /cm/shared/apps/spack/0.17.3/cpu/b/share/spack/setup-env.sh
[spack_cpu@exp-15-56 amber]$ spack -d module lmod refresh -y amber@22
==> [2023-09-26-15:39:37.122923] Imported module from built-in commands
==> [2023-09-26-15:39:37.126099] Imported module from built-in commands
==> [2023-09-26-15:39:37.127164] Reading config file /cm/shared/apps/spack/0.17.3/cpu/b/etc/spack/defaults/config.yaml
==> [2023-09-26-15:39:37.144881] Reading config file /home/spack_cpu/.spack/config.yaml
==> [2023-09-26-15:39:37.146405] Reading config file /cm/shared/apps/spack/0.17.3/cpu/b/etc/spack/defaults/bootstrap.yaml
==> [2023-09-26-15:39:37.154910] DATABASE LOCK TIMEOUT: 3s
==> [2023-09-26-15:39:37.154950] PACKAGE LOCK TIMEOUT: No timeout
==> [2023-09-26-15:39:37.259169] Reading config file /cm/shared/apps/spack/0.17.3/cpu/b/etc/spack/defaults/repos.yaml
==> [2023-09-26-15:39:37.261374] Reading config file /cm/shared/apps/spack/0.17.3/cpu/b/etc/spack/repos.yaml
==> [2023-09-26-15:39:39.444475] Reading config file /cm/shared/apps/spack/0.17.3/cpu/b/etc/spack/defaults/modules.yaml
==> [2023-09-26-15:39:39.453551] Reading config file /cm/shared/apps/spack/0.17.3/cpu/b/etc/spack/defaults/linux/modules.yaml
==> [2023-09-26-15:39:39.456100] Reading config file /cm/shared/apps/spack/0.17.3/cpu/b/etc/spack/modules.yaml
==> [2023-09-26-15:39:40.196397] Warning: Skipping package at /cm/shared/apps/spack/0.17.3/cpu/b/var/spack/repos/sdsc/packages/amber.configure. "amber.configure" is not a valid Spack module name.
==> [2023-09-26-15:39:40.225620] Regenerating lmod module files
==> [2023-09-26-15:39:40.526482] WRITE: amber@22%gcc@10.2.0~cuda+mpi+openmp+update cuda_arch=none arch=linux-rocky8-zen2/c6gwmih [/cm/shared/apps/spack/0.17.3/cpu/b/share/spack/lmod/linux-rocky8-x86_64/openmpi/4.1.3-oq3qvsv/gcc/10.2.0/amber/22/c6gwmih-omp.lua]
==> [2023-09-26-15:39:40.749580] Reading config file /cm/shared/apps/spack/0.17.3/cpu/b/etc/spack/compilers.yaml
==> [2023-09-26-15:39:40.911868] '/cm/shared/apps/spack/0.17.3/cpu/b/opt/spack/linux-rocky8-zen2/gcc-10.2.0/python-3.8.12-7zdjza7xxfccxr5syuaomd2fpvp2x35i/bin/python3.8' '-c' '
import json
from distutils.sysconfig import (
get_config_vars,
get_config_h_filename,
get_makefile_filename,
get_python_inc,
get_python_lib,
)
config = get_config_vars()
config['"'"'config_h_filename'"'"'] = get_config_h_filename()
config['"'"'makefile_filename'"'"'] = get_makefile_filename()
config['"'"'python_inc'"'"'] = {}
config['"'"'python_lib'"'"'] = {}
for plat_specific in [True, False]:
plat_key = str(plat_specific).lower()
config['"'"'python_inc'"'"'][plat_key] = get_python_inc(plat_specific, prefix='"'"''"'"')
config['"'"'python_lib'"'"'][plat_key] = {}
for standard_lib in [True, False]:
lib_key = str(standard_lib).lower()
config['"'"'python_lib'"'"'][plat_key][lib_key] = get_python_lib(
plat_specific, standard_lib, prefix='"'"''"'"'
)
print(json.dumps(config))
'
==> [2023-09-26-15:39:41.113364] BLACKLISTED_AS_IMPLICIT : zlib@1.2.11%gcc@10.2.0+optimize+pic+shared arch=linux-rocky8-zen2/ws4iari
==> [2023-09-26-15:39:41.190308] Reading config file /cm/shared/apps/spack/0.17.3/cpu/b/etc/spack/defaults/packages.yaml
==> [2023-09-26-15:39:41.217248] Reading config file /cm/shared/apps/spack/0.17.3/cpu/b/etc/spack/packages.yaml
[spack_cpu@exp-15-56 amber]$
gromacs@2022.6 is now available in expanse/0.17.3/cpu/b and ready for testing.
[mkandes@login02 ~]$ module spider gromacs/2022.6/sbq2qrc-omp
----------------------------------------------------------------------------
gromacs/2022.6: gromacs/2022.6/sbq2qrc-omp
----------------------------------------------------------------------------
You will need to load all module(s) on any one of the lines below before the "gromacs/2022.6/sbq2qrc-omp" module is available to load.
cpu/0.17.3b gcc/10.2.0/npcyll4 openmpi/4.1.3/oq3qvsv
Help:
GROMACS (GROningen MAchine for Chemical Simulations) is a molecular
dynamics package primarily designed for simulations of proteins, lipids
and nucleic acids. It was originally developed in the Biophysical
Chemistry department of University of Groningen, and is now maintained
by contributors in universities and research centers across the world.
GROMACS is one of the fastest and most popular software packages
available and can run on CPUs as well as GPUs. It is free, open source
released under the GNU General Public License. Starting from version
4.6, GROMACS is released under the GNU Lesser General Public License.
[mkandes@login02 ~]$
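For testing, a user would load the module chain reported by `module spider` before invoking GROMACS. The sketch below assumes the Expanse module hierarchy shown above; the `gmx_mpi` binary name is an assumption based on the `+mpi ~nosuffix` variants (GROMACS appends the `_mpi` suffix to MPI-enabled builds by default):

```shell
# Load the dependency chain reported by `module spider`, then the
# gromacs module itself, and run a quick smoke test.
module purge
module load cpu/0.17.3b gcc/10.2.0/npcyll4 openmpi/4.1.3/oq3qvsv
module load gromacs/2022.6/sbq2qrc-omp

# With +mpi and ~nosuffix, the MPI-enabled binary is gmx_mpi.
srun --mpi=pmi2 --ntasks=1 gmx_mpi --version
```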
Standard acceptance benchmarks ran successfully. Deployment is complete, with the standard output and build spec committed back to the sdsc-0.17.3 deployment branch of the sdsc/spack repo. https://github.com/sdsc/spack/commit/778f7ffa1cae4983af7c4df8c10c78f70e01fbb5
https://sdsc.zendesk.com/agent/tickets/29944 https://sdsc.zendesk.com/agent/tickets/26289