Open KateFriedman-NOAA opened 2 years ago
Some comments about a few of these packages:
- gempack - that's not something we have a package for
- cmake - we don't usually handle this and prefer to use the system CMake. Orion has cmake/3.22.1 and Hera cmake/3.20.1. I would expect them to be sufficient.
- libpng - We call the module png and have 1.6.35 instead of 1.6.37. We can install 1.6.37 if necessary.
Otherwise the other packages can be installed.
@KateFriedman-NOAA
wgrib2/2.0.7 (I believe UPP needed this on WCOSS2 @WenMeng-NOAA can you confirm?) That's the correct version used by GFS V16.
In addition, for this purpose, on Hera, the compiler will still be intel/18.0.5.274. We are also adding an installation of the stack with intel/2021.
On Tue, Feb 8, 2022 at 1:40 PM Kyle Gerheiser @.***> wrote:
Some comments about a few of these packages:
- gempack - that's not something we have a package for
- cmake - we don't usually handle this and prefer to use the system CMake. Orion has cmake/3.22.1 and Hera cmake/3.20.1. I would expect them to be sufficient.
- libpng - We call the module png and have 1.6.35 instead of 1.6.37. We can install 1.6.37 if necessary.
Otherwise the other packages can be installed.
I have been working on UPP changes. I will provide the UPP tag for installing upp/8.1.0 when it is ready.
Thanks guys! We want to match the libraries in operations that will be on WCOSS2 with the ones on Hera and Orion (plus the additional libraries we need on the development platforms for the next gen systems) under hpc-stack so we can remove the old installations. I don't think we have to worry about gempak as that is not run in dev mode.
We will live with intel/18.0.5.274 for now, and intel/2021 when available.
Thanks!
The variable "Z_LIB" is not defined in module zlib/1.2.11 on the hpc-stack on Hera.
Some comments about a few of these packages:
- gempack - that's not something we have a package for
Noted, thanks! Won't need based on what @arunchawla-NOAA said above.
- cmake - we don't usually handle this and prefer to use the system CMake. Orion has cmake/3.22.1 and Hera cmake/3.20.1. I would expect them to be sufficient.
Also noted, thanks!
- libpng - We call the module png and have 1.6.35 instead of 1.6.37. We can install 1.6.37 if necessary.
Yes, please install 1.6.37. Looks like the NCO hpc-stack install calls this libpng to make life more interesting. I will adjust the module name for the non-WCOSS2 platforms to be png.
Otherwise the other packages can be installed.
Excellent, thanks!
The official name for the PNG library is libpng (http://www.libpng.org/pub/png/libpng.html); spack also knows it by this name. Maybe png should be changed to libpng in hpc-stack and the codes that use it.
@climbfuji that's a pretty good idea. Just re-name it libpng, install it as a new package and use that.
I would prefer it be renamed to libpng as well, since that is what we have on WCOSS2 and I'd prefer to use the same name everywhere. Thanks!
Based on the discussions above, may I confirm all modules across different computers must have the same name and version numbers? In other words, will we use the same build.ver and run.ver for all computers?
In other words, will we use the same build.ver and run.ver for all computers?
@YaliMao-NOAA Yes, we will use the same build.ver and run.ver files for all machines but we will need to accommodate some differences. Therefore, for the workflow side, I am now creating $target.ver files (e.g. orion.ver) which will set the hpc-stack module variables/versions to support those machines. I am going to source them after sourcing build.ver or run.ver, so the stack module names/versions get set for the specific machine. Here is what I just sent in the email thread about updating our Hera/Orion modulefiles to LUA format and hpc-stack:
orion.ver:
export hpc_ver=1.1.0
export hpc_intel_ver=2018.4
export hpc_impi_ver=2018.4
hera.ver:
export hpc_ver=1.1.0
export hpc_intel_ver=18.0.5.274
export hpc_impi_ver=2018.0.4
The top of one of the workflow modulefiles will now look like this for Orion:
prepend_path("MODULEPATH", "/apps/contrib/NCEP/libs/hpc-stack/modulefiles/stack")
load(pathJoin("hpc", os.getenv("hpc_ver")))
load(pathJoin("hpc-intel", os.getenv("hpc_intel_ver")))
load(pathJoin("hpc-impi", os.getenv("hpc_impi_ver")))
...and then the rest of the module versions will be set, just as on WCOSS2.
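For illustration, here is a minimal sketch (the script name, paths, and $target argument are hypothetical) of the sourcing order described above, where build.ver is sourced first and the machine-specific $target.ver then sets the hpc-stack variables:
#!/bin/bash
# Minimal sketch only -- not the actual workflow script.
target=${1:-orion}              # machine name supplied by the caller (orion, hera, ...)
source versions/build.ver       # common module versions shared by all machines
source versions/${target}.ver   # machine-specific hpc-stack versions (e.g. orion.ver, hera.ver)
# The workflow modulefiles then pick up hpc_ver, hpc_intel_ver, hpc_impi_ver via os.getenv(),
# as shown in the Lua snippet above.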
@kgerheiser We also load the following modules on WCOSS2; are they from hpc-stack, and if so, can we get these versions on Hera/Orion too? Thanks!
@KateFriedman-NOAA It's a smart solution to introduce $target.ver to differentiate the platforms. On Hera and Orion, will 'module purge' be kept and not changed to 'module reset'?
On Hera and Orion, will 'module purge' be kept and not changed to 'module reset'?
@YaliMao-NOAA Yes, module reset is only for WCOSS2 right now. Please keep using module purge elsewhere. Thanks!
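For scripts shared across machines, a minimal sketch of honoring that split (the $machine variable name is hypothetical) could look like:
# Sketch only: reset on WCOSS2, purge everywhere else
if [[ "${machine}" == "wcoss2" ]]; then
  module reset    # WCOSS2: restore the site default module set
else
  module purge    # Hera/Orion: clear all loaded modules
fi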
The module zlib/1.2.11 settings on WCOSS2 and Orion/Hera are not consistent. WCOSS2:
Wen.Meng@dlogin02 ~$ module show zlib/1.2.11
-----------------------------------------------------------------------------------------
/apps/prod/lmodules/intel/19.1.3.304/zlib/1.2.11:
-----------------------------------------------------------------------------------------
whatis("A free, general-purpose, legally unencumbered lossless data-compression library. ")
prepend_path("MANPATH","/apps/spack/zlib/1.2.11/intel/19.1.3.304/hjotqkckeoyt6j6tibalwzrlfljcjtdh/share/man")
prepend_path("LIBRARY_PATH","/apps/spack/zlib/1.2.11/intel/19.1.3.304/hjotqkckeoyt6j6tibalwzrlfljcjtdh/lib")
prepend_path("LD_LIBRARY_PATH","/apps/spack/zlib/1.2.11/intel/19.1.3.304/hjotqkckeoyt6j6tibalwzrlfljcjtdh/lib")
prepend_path("C_INCLUDE_PATH","/apps/spack/zlib/1.2.11/intel/19.1.3.304/hjotqkckeoyt6j6tibalwzrlfljcjtdh/include")
prepend_path("CPLUS_INCLUDE_PATH","/apps/spack/zlib/1.2.11/intel/19.1.3.304/hjotqkckeoyt6j6tibalwzrlfljcjtdh/include")
prepend_path("INCLUDE","/apps/spack/zlib/1.2.11/intel/19.1.3.304/hjotqkckeoyt6j6tibalwzrlfljcjtdh/include")
prepend_path("PKG_CONFIG_PATH","/apps/spack/zlib/1.2.11/intel/19.1.3.304/hjotqkckeoyt6j6tibalwzrlfljcjtdh/lib/pkgconfig")
prepend_path("CMAKE_PREFIX_PATH","/apps/spack/zlib/1.2.11/intel/19.1.3.304/hjotqkckeoyt6j6tibalwzrlfljcjtdh/")
setenv("ZLIB_ROOT","/apps/spack/zlib/1.2.11/intel/19.1.3.304/hjotqkckeoyt6j6tibalwzrlfljcjtdh")
setenv("ZLIB_INC","/apps/spack/zlib/1.2.11/intel/19.1.3.304/hjotqkckeoyt6j6tibalwzrlfljcjtdh/include")
setenv("ZLIB_LIB","/apps/spack/zlib/1.2.11/intel/19.1.3.304/hjotqkckeoyt6j6tibalwzrlfljcjtdh/lib/libz.a")
setenv("ZLIB_LIBDIR","/apps/spack/zlib/1.2.11/intel/19.1.3.304/hjotqkckeoyt6j6tibalwzrlfljcjtdh/lib")
setenv("ZLIB_VER","1.2.11")
setenv("ZLIB_SRC","/apps/spack/zlib/1.2.11/intel/19.1.3.304/hjotqkckeoyt6j6tibalwzrlfljcjtdh/src")
Hera:
[Wen.Meng@hfe10 modulefiles]$ module show zlib/1.2.11
--------------------------------------------------------------------------------------------------
/scratch2/NCEPDEV/nwprod/hpc-stack/libs/hpc-stack/modulefiles/compiler/intel/18.0.5.274/zlib/1.2.11.lua:
--------------------------------------------------------------------------------------------------
help([[]])
conflict("zlib")
prepend_path("LD_LIBRARY_PATH","/scratch2/NCEPDEV/nwprod/hpc-stack/libs/hpc-stack/intel-18.0.5.274/zlib/1.2.11/lib")
prepend_path("DYLD_LIBRARY_PATH","/scratch2/NCEPDEV/nwprod/hpc-stack/libs/hpc-stack/intel-18.0.5.274/zlib/1.2.11/lib")
prepend_path("CPATH","/scratch2/NCEPDEV/nwprod/hpc-stack/libs/hpc-stack/intel-18.0.5.274/zlib/1.2.11/include")
prepend_path("MANPATH","/scratch2/NCEPDEV/nwprod/hpc-stack/libs/hpc-stack/intel-18.0.5.274/zlib/1.2.11/share/man")
setenv("ZLIB_ROOT","/scratch2/NCEPDEV/nwprod/hpc-stack/libs/hpc-stack/intel-18.0.5.274/zlib/1.2.11")
setenv("ZLIB_INCLUDES","/scratch2/NCEPDEV/nwprod/hpc-stack/libs/hpc-stack/intel-18.0.5.274/zlib/1.2.11/include")
setenv("ZLIB_LIBRARIES","/scratch2/NCEPDEV/nwprod/hpc-stack/libs/hpc-stack/intel-18.0.5.274/zlib/1.2.11/lib")
setenv("ZLIB_VERSION","1.2.11")
whatis("Name: zlib")
whatis("Version: 1.2.11")
whatis("Category: library")
whatis("Description: Zlib library")
The UPP for GFSV16 needs "ZLIB_LIB" for GNU makefile building.
@WenMeng-NOAA you are looking at the wrong module on Orion (/apps/prod/lmodules/intel/19.1.3.304/zlib/1.2.11) compared to the hpc-stack zlib.
@kgerheiser Here is what I checked on Orion:
[wmeng@Orion-login-3 ~]$ module use /apps/contrib/NCEP/libs/hpc-stack/modulefiles/stack
[wmeng@Orion-login-3 ~]$ module load hpc/1.1.0
[wmeng@Orion-login-3 ~]$ module load hpc-intel/2018.4
[wmeng@Orion-login-3 ~]$ module load hpc-impi/2018.4
[wmeng@Orion-login-3 ~]$ module show zlib/1.2.11
--------------------------------------------------------------------------------------------
/apps/contrib/NCEP/libs/hpc-stack/modulefiles/compiler/intel/2018.4/zlib/1.2.11.lua:
--------------------------------------------------------------------------------------------
help([[]])
conflict("zlib")
prepend_path("LD_LIBRARY_PATH","/apps/contrib/NCEP/libs/hpc-stack/intel-2018.4/zlib/1.2.11/lib")
prepend_path("DYLD_LIBRARY_PATH","/apps/contrib/NCEP/libs/hpc-stack/intel-2018.4/zlib/1.2.11/lib")
prepend_path("CPATH","/apps/contrib/NCEP/libs/hpc-stack/intel-2018.4/zlib/1.2.11/include")
prepend_path("MANPATH","/apps/contrib/NCEP/libs/hpc-stack/intel-2018.4/zlib/1.2.11/share/man")
setenv("ZLIB_ROOT","/apps/contrib/NCEP/libs/hpc-stack/intel-2018.4/zlib/1.2.11")
setenv("ZLIB_INCLUDES","/apps/contrib/NCEP/libs/hpc-stack/intel-2018.4/zlib/1.2.11/include")
setenv("ZLIB_LIBRARIES","/apps/contrib/NCEP/libs/hpc-stack/intel-2018.4/zlib/1.2.11/lib")
setenv("ZLIB_VERSION","1.2.11")
whatis("Name: zlib")
whatis("Version: 1.2.11")
whatis("Category: library")
whatis("Description: Zlib library")
Please let me know the correct path. Thanks!
@kgerheiser The UPP is looking for variable "ZLIB_LIB" for GNU makefile building. It is defined on WCOSS2 but not on Orion/Hera.
@kgerheiser Similar to Wen's comment, I need JASPER_LIB and PNG_LIB on Orion (and likely Hera). There may be more, I'm still working through builds on Orion. We need the same module variables as the WCOSS2 stack versions.
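As an interim workaround until the module variables are aligned, a build script could derive the WCOSS2-style names from the variables the hpc-stack modules already set; a sketch, assuming the png and jasper modules set PNG_ROOT and JASPER_ROOT analogously to ZLIB_ROOT in the output above and that static archives are wanted:
# Sketch only -- not the final fix, just a shim mapping hpc-stack variables to WCOSS2 names
export ZLIB_LIB=${ZLIB_LIB:-${ZLIB_ROOT}/lib/libz.a}
export PNG_LIB=${PNG_LIB:-${PNG_ROOT}/lib/libpng.a}
export JASPER_LIB=${JASPER_LIB:-${JASPER_ROOT}/lib/libjasper.a}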
It looks like the external packages have not been installed using hpc-stack but using spack
The UPP also got building failure in netcdf as:
/scratch2/NCEPDEV/nwprod/hpc-stack/libs/hpc-stack/intel-18.0.5.274/impi-2018.0.4/netcdf/4.7.4/lib/libnetcdf.a(libnchdf5_la-hdf5file.o): In function `nc4_enddef_netcdf4_file':
hdf5file.c:(.text+0x3ad): undefined reference to `H5Fflush'
/scratch2/NCEPDEV/nwprod/hpc-stack/libs/hpc-stack/intel-18.0.5.274/impi-2018.0.4/netcdf/4.7.4/lib/libnetcdf.a(libnchdf5_la-hdf5file.o): In function `NC4_sync':
hdf5file.c:(.text+0x73b): undefined reference to `H5Fflush'
/scratch2/NCEPDEV/nwprod/hpc-stack/libs/hpc-stack/intel-18.0.5.274/impi-2018.0.4/netcdf/4.7.4/lib/libnetcdf.a(libnchdf5_la-hdf5file.o): In function `nc4_close_hdf5_file':
hdf5file.c:(.text+0xaa4): undefined reference to `H5Fflush'
/scratch2/NCEPDEV/nwprod/hpc-stack/libs/hpc-stack/intel-18.0.5.274/impi-2018.0.4/netcdf/4.7.4/lib/libnetcdf.a(libnchdf5_la-hdf5file.o): In function `nc4_close_netcdf4_file':
My working version on Hera is at /scratch1/NCEPDEV/stmp2/Wen.Meng/post_gfsv16_hpc/UPP/sorc. The GNU makefile is /scratch1/NCEPDEV/stmp2/Wen.Meng/post_gfsv16_hpc/UPP/sorc/ncep_post.fd/makefile_module_hpc. The build script is /scratch1/NCEPDEV/stmp2/Wen.Meng/post_gfsv16_hpc/UPP/sorc/build_ncep_post.sh. Please advise the fix. Thanks!
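For reference, an undefined reference to H5Fflush when linking against the static libnetcdf.a usually means the HDF5 libraries are missing from, or listed before netCDF on, the link line. A sketch of link flags that typically resolves it (the POST_LIBS name is hypothetical; NETCDF, HDF5_ROOT, and ZLIB_ROOT are assumed to be set by the hpc-stack modules):
# Sketch only: with static archives the order matters -- netCDF before HDF5, HDF5 before zlib
export POST_LIBS="-L${NETCDF}/lib -lnetcdff -lnetcdf \
  -L${HDF5_ROOT}/lib -lhdf5_hl -lhdf5 \
  -L${ZLIB_ROOT}/lib -lz -ldl -lm"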
@WenMeng-NOAA , @KateFriedman-NOAA solved this problem, reach out to her.
@WenMeng-NOAA Not sure if it's related but I had to make the following change to the gaussian_sfcanl makefile to get it building on Orion with the static libraries there: https://github.com/KateFriedman-NOAA/global-workflow/commit/de03082baea517dcb7759b4d24d94e5bb865a55d
@KateFriedman-NOAA our solution works for UPP. Thanks!
@kgerheiser Is it possible to get these two modules available on Hera and Orion too?
bufr_dump/2.0.0
util_shared/1.4.0
Given the timeline for submitting a new tag with changes to use NCEPLIBS from the hpc-stack on Hera/Orion, I would like to know the status of updating the hpc-stack on Hera/Orion with:
Thanks!
On Hera and Orion:
1) How do we rename png to libpng, as on WCOSS2?
2) g2/3.4.5 is not installed; can g2/3.4.3 be used instead?
3) w3emc/2.9.2 is not installed; can w3emc/2.9.1 be used instead?
On Hera, netcdf/4.7.4 is not installed; the installed 4.7.4 version is netcdf_parallel, not netcdf.
@YaliMao-NOAA are you using hpc-stack? netcdf_parallel isn't one of our libraries, and netcdf/4.7.4 is definitely installed.
@KateFriedman-NOAA I'm working on getting the libraries/modules ready, and I think we can have them installed within the next day or so.
@kgerheiser Is it possible to get these two modules available on Hera and Orion too?
bufr_dump/2.0.0 util_shared/1.4.0
We don't have packages for these. If they are something you need, we could probably add them. Where do they come from?
@KateFriedman-NOAA I'm having issues building NCO 4.7.9 (#390, #391). I'm actually surprised that NCO was able to build NCO (heh) with these version combinations.
Would it be possible to use a different version? I think I can hack around it if necessary, but it'll take me a little while.
I'm having issues building NCO 4.7.9....Would it be possible to use a different version?
I'll check on this and let you know, thanks!
bufr_dump/2.0.0 util_shared/1.4.0 We don't have packages for these. If they are something you need, we could probably add them. Where do they come from?
Will check on these too and get back to you...caught up in some WCOSS2 issues today so may take a day or so to check.
I'm having issues building NCO 4.7.9 (#390, #391). I'm actually surprised that NCO was able to build NCO (heh) with these version combinations. Would it be possible to use a different version? I think I can hack around it if necessary, but it'll take me a little while.
So we are using NCO/4.7.0 in ops on WCOSS-Dell right now...is that version easier to build?
@JessicaMeixner-NOAA The waveprep jobs load the nco module...does the module version matter? I see nco/4.8.1 and nco/4.9.3 available on Orion and we're currently using nco/4.8.1 in feature/ops-orion so my guess is that version might be ok to use for now to support the dev_v16 branch (?). Let me know if there is someone else I should refer these nco module questions to? Thanks!
what is util-shared ? is that the old library ?
bufr_dump/2.0.0 util_shared/1.4.0 We don't have packages for these. If they are something you need, we could probably add them. Where do they come from?
@BoiVuong-NOAA @EdwardSafford-NOAA @YaliMao-NOAA The jgfs_wave_prdgen_bulls job, wafs_blending jobs, and GSI monitoring jobs load the util_shared module on WCOSS2. Do you know where that came from? Thanks!
@YaliMao-NOAA The wafs_gcip job loads the bufr_dump module on WCOSS2...do you know where that module came from and if it's required to support WAFS elsewhere? Thanks!
what is util-shared ? is that the old library ?
@arunchawla-NOAA On WCOSS2 it sets the following:
> module show util_shared/1.4.0
-----------------------------------------------------------------------------------------------------------------------------------------------------------------------
/apps/ops/prod/nco/modulefiles/compiler/intel/19.1.3.304/util_shared/1.4.0:
-----------------------------------------------------------------------------------------------------------------------------------------------------------------------
whatis("This module sets the environment variables for non-nco managed production utilities: ")
conflict("util_shared")
setenv("UTILSHAREDROOT","/apps/ops/prod/nco/intel/19.1.3.304/util_shared.v1.4.0")
setenv("USHshared","/apps/ops/prod/nco/intel/19.1.3.304/util_shared.v1.4.0/ush")
setenv("FIXshared","/apps/ops/prod/nco/intel/19.1.3.304/util_shared.v1.4.0/fix")
setenv("PARMshared","/apps/ops/prod/nco/intel/19.1.3.304/util_shared.v1.4.0/parm")
setenv("ANOMGB","/apps/ops/prod/nco/intel/19.1.3.304/util_shared.v1.4.0/bin/anomgb")
setenv("BULL2SEQ","/apps/ops/prod/nco/intel/19.1.3.304/util_shared.v1.4.0/bin/bull2seq")
setenv("BULLPROI","/apps/ops/prod/nco/intel/19.1.3.304/util_shared.v1.4.0/bin/bullproi")
setenv("SUPVIT","/apps/ops/prod/nco/intel/19.1.3.304/util_shared.v1.4.0/bin/supvit")
setenv("MKGRB25","/apps/ops/prod/nco/intel/19.1.3.304/util_shared.v1.4.0/bin/mkgrb25")
setenv("ASCII2SHP","/apps/ops/prod/nco/intel/19.1.3.304/util_shared.v1.4.0/bin/ascii2shp")
setenv("OVERGRIDID","/apps/ops/prod/nco/intel/19.1.3.304/util_shared.v1.4.0/bin/overgridid")
setenv("OVERDATEGRIB","/apps/ops/prod/nco/intel/19.1.3.304/util_shared.v1.4.0/bin/overdate.grib")
setenv("DEBUFR","/apps/ops/prod/nco/intel/19.1.3.304/util_shared.v1.4.0/bin/debufr")
setenv("CWORDSH","/apps/ops/prod/nco/intel/19.1.3.304/util_shared.v1.4.0/bin/cwordsh")
setenv("UKAUXFLD","/apps/ops/prod/nco/intel/19.1.3.304/util_shared.v1.4.0/bin/ukauxfld")
setenv("UNPMGRB1","/apps/ops/prod/nco/intel/19.1.3.304/util_shared.v1.4.0/bin/unpmgrb1")
setenv("MK125FLS","/apps/ops/prod/nco/intel/19.1.3.304/util_shared.v1.4.0/bin/mk125fls")
setenv("MK125FLW","/apps/ops/prod/nco/intel/19.1.3.304/util_shared.v1.4.0/bin/mk125flw")
setenv("WEBTITLE","/apps/ops/prod/nco/intel/19.1.3.304/util_shared.v1.4.0/bin/webtitle")
prepend_path("PATH","/apps/ops/prod/nco/intel/19.1.3.304/util_shared.v1.4.0/ush")
help([[Set environment veriables for production utilities
]])
I'm having issues building NCO 4.7.9 (#390, #391). I'm actually surprised that NCO was able to build NCO (heh) with these version combinations. Would it be possible to use a different version? I think I can hack around it if necessary, but it'll take me a little while.
So we are using NCO/4.7.0 in ops on WCOSS-Dell right now...is that version easier to build?
@JessicaMeixner-NOAA The waveprep jobs load the nco module...does the module version matter? I see nco/4.8.1 and nco/4.9.3 available on Orion and we're currently using nco/4.8.1 in feature/ops-orion so my guess is that version might be ok to use for now to support the dev_v16 branch (?). Let me know if there is someone else I should refer these nco module questions to? Thanks!
I don't know of a need for a particular version; as long as it runs, I'd say we're good to go. For wave related things I have the most trouble changing versions of jasper and g2, and have not had issues w/ nco versions before (but I guess there's always a first for everything).
Ok so for running experiments do we need anything from util_shared? These seem to be old things that I would rather not carry in hpc-stack.
util_shared is a set of utilities, not a library.
I don't know of a need for a particular version; as long as it runs, I'd say we're good to go. For wave related things I have the most trouble changing versions of jasper and g2, and have not had issues w/ nco versions before (but I guess there's always a first for everything).
Ok, given that, for now I will set my hera.ver/orion.ver files to use existing nco modules (most likely what feature/ops-orion uses). We have the needed jasper version available and @kgerheiser is installing the needed g2 module so we should be ok for those. Thanks @JessicaMeixner-NOAA !
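For example (version purely illustrative, to match what feature/ops-orion currently loads), the orion.ver entry could be as simple as:
export nco_ver=4.8.1   # existing nco module on Orion; hera.ver would point at Hera's installed version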
Ok so for running experiments do we need anything from util_shared? These seem to be old things that I would rather not carry in hpc-stack. util_shared is a set of utilities, not a library.
I agree, don't want to clutter hpc-stack up if it's not needed. If we drop support for WAFS and the bulk of the downstream jobs (e.g. gempak and awips) on the non-WCOSS2 platforms (at least for now) then the need for the bufr_dump and util_shared modules goes away. Do we want to make that decision now?
My only concern about not having bufr_dump would be if obsproc needs it. @ShelleyMelchior-NOAA would we need a bufr_dump module to support obsproc on Hera/Orion? Thanks!
we do not need to do gempak and awips grids on non wcoss platforms. In general I would like us to lose the dependency on util_shared
we do not need to do gempak and awips grids on non wcoss platforms. In general I would like us to lose the dependency on util_shared
Ok, that sorts out the need for util_shared. @kgerheiser ignore the request for util_shared, thanks!
@arunchawla-NOAA How about WAFS? I don't think we need WAFS off WCOSS2 either, although @YaliMao-NOAA has supported it building on Hera and Orion in recent versions. That would drop the need for bufr_dump, assuming obsproc doesn't need it.
@KateFriedman-NOAA Thank you for bringing it up. util_shared is required to be able to use 'make_NTC_file.pl' to send out admin messages for blended WAFS products. bufr_dump is used for GCIP.
bufr_dump is only useful if the dump processing can run on a system. In order for dump processing to work there need to be live (or staged) bufr tanks. If there is a plan for there to be tanks on Hera or Orion, and for dump files to be processed from those tanks, then you need bufr_dump.
FWIW, there are no plans from obsproc pov to do any dump processing on Hera or Orion.
that is correct, I do not think we need these on either HERA or Orion. Thanks @ShelleyMelchior-NOAA !
Does WAFS need to support real-time run on Hera and Orion? When I ran WAFS on Hera or Orion, I used to have a canned data set.
Besides live BUFR data, what about DCOM data? Will DCOM data be saved on Hera or Orion in real time? Two WAFS blending jobs need UK unblended data in DCOM.
Does WAFS need to support real-time run on Hera and Orion? When I ran WAFS on Hera or Orion, I used to have a canned data set.
@YaliMao-NOAA I believe we've decided not to support WAFS in developer runs on non-WCOSS2 platforms, similar to dropping support for other downstream products like gempak and awips that aren't needed outside of ops. @arunchawla-NOAA can further confirm this. Thanks!
bufr_dump is only useful if the dump processing can run on a system.
Got it, we won't be running that off WCOSS2 so we won't need bufr_dump on Hera/Orion. Thanks @ShelleyMelchior-NOAA !
In order to support the new GFSv16.2.0 (WCOSS2 port version) on Hera and Orion we need the same library module versions available. Below I list the versions that are currently being used in the new operational GFSv16.2.0 and which ones are missing on Hera/Orion.
Which software (and version) in the stack would you like installed?
Hera & Orion:
gempak/7.14.1
cmake/3.20.2
Which machines would you like to have the software installed?
Hera, Orion
Additional context
Here are the build.ver module versions for GFSv16.2.0: https://github.com/NOAA-EMC/global-workflow/blob/feature/ops-wcoss2/versions/build.ver
Refs: https://github.com/NOAA-EMC/global-workflow/issues/639