I checked suite_FV3_GFS_v16.xml versus suite_FV3_GFS_v17_p8.xml. In v16, Noah LSM, cires_ugwp, and GFDL microphysics are used. In v17_p8, NoahMP, unified_ugwp, and Thompson microphysics are used. Perhaps the static files, LBCs, or ICs are not staged properly in my v17_p8 experiments? The LBCs/ICs are taken from GFS (gfs.t00z.pgrb2.0p25.f$FH).
Hi UFS/SRW experts,
I adopted the sample YAML (config.grid_RRFS_CONUS_25km_ics_FV3GFS_lbcs_FV3GFS_suite_GFS_v17_p8_plot.yaml) from /ufs-srweather-app/tests/WE2E/test_configs/grids_extrn_mdls_suites_community/. The modifications include 1) skipping the restart file output, 2) changing the date from 2019-07-01 to 2019-06-01 due to file availability, and 3) specifying the user settings (MACHINE, ACCOUNT), as sketched below. My config.yaml can be found on Derecho at /glade/work/clu/ufs/expt_dirs/rrfs_conus_sample.
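For context, the user and date modifications look roughly like this in config.yaml (a sketch assuming the SRW v2.x config layout; values are illustrative and the restart-related change is omitted):

```yaml
user:
  MACHINE: derecho            # platform on which the experiment runs
  ACCOUNT: my_account         # placeholder; replace with an actual allocation account
workflow:
  CCPP_PHYS_SUITE: FV3_GFS_v17_p8
  DATE_FIRST_CYCL: '2019060100'   # illustrative cycle date, moved from the sample's 2019-07-01
  DATE_LAST_CYCL: '2019060100'
```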
I then modified this config to run different physics suites (rap, rrfs_v1beta, and gfs_v16). The results (for 6-hour simulations) show that the runs with rap, rrfs_v1beta, and gfs_v16 are similar, but gfs_v17_p8 is too cold.
@SarahLu-NOAA Thanks for bringing this to our attention. As you've pointed out, something is incorrect with the v17_p8 configuration. I will take a look at this later today and see what I can find.
@SarahLu-NOAA Thank you very much for detailing this issue. EPIC's software integration team is currently looking into this and attempting to re-create the issue for debugging/analysis. We will be in touch with updates!
@SarahLu-NOAA @dustinswales @ligiabernardet We were able to reproduce the temperature discrepancy between the v16 and v17_p8 experiment configurations (i.e., RRFS_CONUS_25km_ics_FV3GFS_lbcs_FV3GFS_suite_GFS_v17_p8_plot and RRFS_CONUS_25km_ics_FV3GFS_lbcs_FV3GFS_suite_GFS_v16_plot) on Hera for the default 20190701 date. The 2-m temperature at f006 is indeed colder for gfs_v17_p8 than for gfs_v16 (plots attached). We are still looking into namelist differences between the global and regional configs for gfs_v17_p8, as well as at various soil fields between v16/v17, to see what may be contributing to this issue.
The runs associated with these plots are located on Hera in the following locations:
gfs_v16: /scratch1/NCEPDEV/stmp2/Cameron.Book/expt_dirs/grid_RRFS_CONUS_25km_ics_FV3GFS_lbcs_FV3GFS_suite_GFS_v16_plot
gfs_v17_p8: /scratch1/NCEPDEV/stmp2/Cameron.Book/expt_dirs/grid_RRFS_CONUS_25km_ics_FV3GFS_lbcs_FV3GFS_suite_GFS_v17_p8_plot
@SarahLu-NOAA We took a look at the srw.t00z.prslev.f000.rrfs_conus_25km.grib2 output from v17_p8 and v16 and compared the two over various fields at f000, after one physics time step. There is apparent divergence between v17_p8 and v16 in surface temperature, 100-200 cm soil temperature and moisture, sensible heat flux, latent heat flux, and temperature at the lowest model level, as can be seen in the attached plots (via @RatkoVasic-NOAA). Note that these experiments used the same initial conditions. We are still looking into the namelist differences between the experiment configurations.
@dustinswales @ligiabernardet Based on these results, would someone be able to look further into the CCPP configuration for gfs_v17_p8?
Again, the runs used to make these plots are located on Hera:
gfs_v16: /scratch1/NCEPDEV/stmp2/Cameron.Book/expt_dirs/grid_RRFS_CONUS_25km_ics_FV3GFS_lbcs_FV3GFS_suite_GFS_v16_plot
gfs_v17_p8: /scratch1/NCEPDEV/stmp2/Cameron.Book/expt_dirs/grid_RRFS_CONUS_25km_ics_FV3GFS_lbcs_FV3GFS_suite_GFS_v17_p8_plot
@dustinswales @ligiabernardet FWIW, the differences between the gfs_v16 and gfs_v17_p8 regional (SRW) namelists are:
INFO - srw_v16.nml
INFO + srw.nml
INFO ---------------------------------------------------------------------
INFO atmos_model_nml: ccpp_suite: - FV3_GFS_v16 + FV3_GFS_v17_p8
INFO fv_core_nml: agrid_vel_rst: - False + True
INFO fv_core_nml: d2_bg_k2: - 0.0 + 0.04
INFO fv_core_nml: dnats: - 1 + 0
INFO fv_core_nml: do_sat_adj: - True + False
INFO fv_core_nml: do_vort_damp: - True + None
INFO fv_core_nml: dz_min: - 6 + 2
INFO fv_core_nml: full_zs_filter: - False + None
INFO fv_core_nml: hord_tr: - 10 + 8
INFO fv_core_nml: make_nh: - False + True
INFO fv_core_nml: n_zs_filter: - 0 + None
INFO fv_core_nml: na_init: - 0 + 1
INFO fv_core_nml: nord: - 3 + 2
INFO fv_core_nml: range_warn: - False + True
INFO gfdl_cloud_microphysics_nml: rthresh: - 1e-05 + 1e-06
INFO gfs_physics_nml: do_tofd: - True + False
INFO gfs_physics_nml: fhzero: - 1.0 + 6
INFO gfs_physics_nml: iaer: - 5111 + 1011
INFO gfs_physics_nml: ialb: - 1 + 2
INFO gfs_physics_nml: iems: - 1 + 2
INFO gfs_physics_nml: imp_physics: - 11 + 8
INFO gfs_physics_nml: iopt_alb: - 2 + 1
INFO gfs_physics_nml: iopt_crs: - 1 + 2
INFO gfs_physics_nml: iopt_dveg: - 1 + 4
INFO gfs_physics_nml: iopt_rad: - 1 + 3
INFO gfs_physics_nml: iopt_sfc: - 1 + 3
INFO gfs_physics_nml: iopt_stc: - 1 + 3
INFO gfs_physics_nml: ldiag_ugwp: - False + None
INFO gfs_physics_nml: lgfdlmprad: - True + False
INFO gfs_physics_nml: lheatstrg: - True + False
INFO gfs_physics_nml: lsm: - 1 + 2
INFO gfs_physics_nml: nsfullradar_diag: - 3600 + None
INFO gfs_physics_nml: sfclay_compute_flux: - False + None
INFO gfs_physics_nml: active_gases: - None + h2o_co2_o3_n2o_ch4_o2
INFO gfs_physics_nml: bl_mynn_edmf: - None + 1
INFO gfs_physics_nml: bl_mynn_edmf_mom: - None + 1
INFO gfs_physics_nml: bl_mynn_tkeadvect: - None + True
INFO gfs_physics_nml: cplchm: - None + False
INFO gfs_physics_nml: decfl: - None + 10
INFO gfs_physics_nml: do_gsl_drag_ls_bl: - None + False
INFO gfs_physics_nml: do_gsl_drag_ss: - None + True
INFO gfs_physics_nml: do_gsl_drag_tofd: - None + False
INFO gfs_physics_nml: do_mynnedmf: - None + False
INFO gfs_physics_nml: do_mynnsfclay: - None + False
INFO gfs_physics_nml: do_rrtmgp: - None + False
INFO gfs_physics_nml: do_ugwp_v0: - None + True
INFO gfs_physics_nml: do_ugwp_v0_nst_only: - None + False
INFO gfs_physics_nml: do_ugwp_v0_orog_only: - None + False
INFO gfs_physics_nml: do_ugwp_v1: - None + False
INFO gfs_physics_nml: do_ugwp_v1_orog_only: - None + False
INFO gfs_physics_nml: dogp_cldoptics_lut: - None + False
INFO gfs_physics_nml: dogp_lwscat: - None + False
INFO gfs_physics_nml: dt_inner: - None + 150
INFO gfs_physics_nml: frac_grid: - None + True
INFO gfs_physics_nml: gwd_opt: - None + 2
INFO gfs_physics_nml: icloud_bl: - None + 1
INFO gfs_physics_nml: lradar: - None + False
INFO gfs_physics_nml: lseaspray: - None + True
INFO gfs_physics_nml: lsoil_lsm: - None + 4
INFO gfs_physics_nml: ltaerosol: - None + False
INFO gfs_physics_nml: lw_file_clouds: - None + rrtmgp-cloud-optics-coeffs-lw.nc
INFO gfs_physics_nml: lw_file_gas: - None + rrtmgp-data-lw-g128-210809.nc
INFO gfs_physics_nml: min_lakeice: - None + 0.15
INFO gfs_physics_nml: min_seaice: - None + 0.15
INFO gfs_physics_nml: qdiag3d: - None + False
INFO gfs_physics_nml: ras: - None + False
INFO gfs_physics_nml: rrtmgp_nbandslw: - None + 16
INFO gfs_physics_nml: rrtmgp_nbandssw: - None + 14
INFO gfs_physics_nml: rrtmgp_ngptslw: - None + 128
INFO gfs_physics_nml: rrtmgp_ngptssw: - None + 112
INFO gfs_physics_nml: sedi_semi: - None + True
INFO gfs_physics_nml: sw_file_clouds: - None + rrtmgp-cloud-optics-coeffs-sw.nc
INFO gfs_physics_nml: sw_file_gas: - None + rrtmgp-data-sw-g112-210809.nc
INFO gfs_physics_nml: ttendlim: - None + -999
INFO namsfc: fsicl: - 99999 + 0
INFO namsfc: fsics: - 99999 + 0
INFO namsfc: landice: - True + False
where the - entry corresponds to v16 and the + entry corresponds to v17_p8. This was generated using the UW config compare tool.
Differences between the global P8 namelist and the regional (SRW) v17_p8 namelist are listed below (- entries correspond to the global P8 namelist, while + entries correspond to the regional namelist):
INFO - global_p8.nml
INFO + srw_v17_p8.nml
INFO ---------------------------------------------------------------------
INFO atmos_model_nml: blocksize: - 32 + 40
INFO diag_manager_nml: max_output_fields: - 300 + 450
INFO fms_nml: domains_stack_size: - 16000000 + 12000000
INFO fv_core_nml: k_split: - 2 + 6
INFO fv_core_nml: do_vort_damp: - True + None
INFO fv_core_nml: bc_update_interval: - None + 3
INFO fv_core_nml: do_schmidt: - None + True
INFO fv_core_nml: nrows_blend: - None + 10
INFO fv_core_nml: regional: - None + True
INFO fv_core_nml: write_restart_with_bcs: - None + False
INFO external_ic_nml: levp: - 128 + 65
INFO gfs_physics_nml: fhcyc: - 24 + 0.0
INFO gfs_physics_nml: progsigma: - False + None
INFO gfs_physics_nml: use_cice_alb: - False + None
INFO gfs_physics_nml: ldiag_ugwp: - False + None
INFO gfs_physics_nml: ngases: - 6 + None
INFO gfs_physics_nml: do_sppt: - False + None
INFO gfs_physics_nml: do_shum: - False + None
INFO gfs_physics_nml: do_skeb: - False + None
INFO gfs_physics_nml: cplflx: - False + None
INFO gfs_physics_nml: cplice: - False + None
INFO gfs_physics_nml: cplwav: - False + None
INFO gfs_physics_nml: cplwav2atm: - False + None
INFO gfs_physics_nml: do_ca: - True + None
INFO gfs_physics_nml: ca_global: - False + None
INFO gfs_physics_nml: ca_sgs: - True + None
INFO gfs_physics_nml: nca: - 1 + None
INFO gfs_physics_nml: ncells: - 5 + None
INFO gfs_physics_nml: nlives: - 12 + None
INFO gfs_physics_nml: nseed: - 1 + None
INFO gfs_physics_nml: nfracseed: - 0.5 + None
INFO gfs_physics_nml: nthresh: - 18 + None
INFO gfs_physics_nml: ca_trigger: - True + None
INFO gfs_physics_nml: nspinup: - 1 + None
INFO gfs_physics_nml: iseed_ca: - 1437671814 + None
INFO gfdl_cloud_microphysics_nml: rthresh: - 1e-05 + 1e-06
INFO namsfc: fnalbc: - /scratch2/BMC/rem/Lisa.Bengtsson/MJO_MC_C384/fix/orog/C384.mx025_frac/fix_sfc/C384.snowfree_albedo.tileX.nc + None
INFO namsfc: fnalbc2: - /scratch2/BMC/rem/Lisa.Bengtsson/MJO_MC_C384/fix/orog/C384.mx025_frac/fix_sfc/C384.facsf.tileX.nc + None
INFO namsfc: fnaisc: - /scratch2/BMC/rem/Lisa.Bengtsson/MJO_MC_C384/fix/am/IMS-NIC.blended.ice.monthly.clim.grb + ../../../../../nems/role.epic/UFS_SRW_data/v2p0/fix/fix_am/CFSR.SEAICE.1982.2012.monthly.clim.grb
INFO namsfc: fntg3c: - /scratch2/BMC/rem/Lisa.Bengtsson/MJO_MC_C384/fix/orog/C384.mx025_frac/fix_sfc/C384.substrate_temperature.tileX.nc + None
INFO namsfc: fnvegc: - /scratch2/BMC/rem/Lisa.Bengtsson/MJO_MC_C384/fix/orog/C384.mx025_frac/fix_sfc/C384.vegetation_greenness.tileX.nc + None
INFO namsfc: fnvetc: - /scratch2/BMC/rem/Lisa.Bengtsson/MJO_MC_C384/fix/orog/C384.mx025_frac/fix_sfc/C384.vegetation_type.tileX.nc + None
INFO namsfc: fnsotc: - /scratch2/BMC/rem/Lisa.Bengtsson/MJO_MC_C384/fix/orog/C384.mx025_frac/fix_sfc/C384.soil_type.tileX.nc + None
INFO namsfc: fnsmcc: - /scratch2/BMC/rem/Lisa.Bengtsson/MJO_MC_C384/fix/am/global_soilmgldas.statsgo.t766.1536.768.grb + ../../../../../nems/role.epic/UFS_SRW_data/v2p0/fix/fix_am/global_soilmgldas.t126.384.190.grb
INFO namsfc: fnmskh: - /scratch2/BMC/rem/Lisa.Bengtsson/MJO_MC_C384/fix/am/global_slmask.t1534.3072.1536.grb + ../../../../../nems/role.epic/UFS_SRW_data/v2p0/fix/fix_am/seaice_newland.grb
INFO namsfc: fntsfa: - +
INFO namsfc: fnvmnc: - /scratch2/BMC/rem/Lisa.Bengtsson/MJO_MC_C384/fix/orog/C384.mx025_frac/fix_sfc/C384.vegetation_greenness.tileX.nc + None
INFO namsfc: fnvmxc: - /scratch2/BMC/rem/Lisa.Bengtsson/MJO_MC_C384/fix/orog/C384.mx025_frac/fix_sfc/C384.vegetation_greenness.tileX.nc + None
INFO namsfc: fnslpc: - /scratch2/BMC/rem/Lisa.Bengtsson/MJO_MC_C384/fix/orog/C384.mx025_frac/fix_sfc/C384.slope_type.tileX.nc + None
INFO namsfc: fnabsc: - /scratch2/BMC/rem/Lisa.Bengtsson/MJO_MC_C384/fix/orog/C384.mx025_frac/fix_sfc/C384.maximum_snow_albedo.tileX.nc + None
INFO namsfc: _start_index: - {'fsmcl': [2]} + None
INFO amip_interp_nml: data_set: - None + reynolds_oi
INFO amip_interp_nml: date_out_of_range: - None + climo
INFO amip_interp_nml: interp_oi_sst: - None + True
INFO amip_interp_nml: no_anom_sst: - None + False
INFO amip_interp_nml: use_ncep_ice: - None + False
INFO amip_interp_nml: use_ncep_sst: - None + True
INFO cires_ugwp_nml: knob_ugwp_azdir: - None + [2, 4, 4, 4]
INFO cires_ugwp_nml: knob_ugwp_doaxyz: - None + 1
INFO cires_ugwp_nml: knob_ugwp_doheat: - None + 1
INFO cires_ugwp_nml: knob_ugwp_dokdis: - None + 1
INFO cires_ugwp_nml: knob_ugwp_effac: - None + [1, 1, 1, 1]
INFO cires_ugwp_nml: knob_ugwp_ndx4lh: - None + 1
INFO cires_ugwp_nml: knob_ugwp_solver: - None + 2
INFO cires_ugwp_nml: knob_ugwp_source: - None + [1, 1, 0, 0]
INFO cires_ugwp_nml: knob_ugwp_stoch: - None + [0, 0, 0, 0]
INFO cires_ugwp_nml: knob_ugwp_version: - None + 0
INFO cires_ugwp_nml: knob_ugwp_wvspec: - None + [1, 25, 25, 25]
INFO cires_ugwp_nml: launch_level: - None + 27
INFO namsfc_dict: fnabsc: - None + ../fix_lam/C403.maximum_snow_albedo.tileX.nc
INFO namsfc_dict: fnalbc: - None + ../fix_lam/C403.snowfree_albedo.tileX.nc
INFO namsfc_dict: fnalbc2: - None + ../fix_lam/C403.facsf.tileX.nc
INFO namsfc_dict: fnslpc: - None + ../fix_lam/C403.slope_type.tileX.nc
INFO namsfc_dict: fnsotc: - None + ../fix_lam/C403.soil_type.tileX.nc
INFO namsfc_dict: fntg3c: - None + ../fix_lam/C403.substrate_temperature.tileX.nc
INFO namsfc_dict: fnvegc: - None + ../fix_lam/C403.vegetation_greenness.tileX.nc
INFO namsfc_dict: fnvetc: - None + ../fix_lam/C403.vegetation_type.tileX.nc
INFO namsfc_dict: fnvmnc: - None + ../fix_lam/C403.vegetation_greenness.tileX.nc
INFO namsfc_dict: fnvmxc: - None + ../fix_lam/C403.vegetation_greenness.tileX.nc
Our DTC CCPP colleague Man Zhang is taking a look at this.
I suggest using npz=65 to run the SRW App by changing npz=64 to npz=65 in ufs-srweather-app/parm/input.nml.FV3. Note that changing npz only will give ptop ~0.02 hPa.
If you want to try the RRFS L65 coordinate (with a lower ptop ~2 hPa), add the following two lines to input.nml.FV3: npz_type = 'input' and fv_eta_file = 'global_hyblev_fcst_rrfsL65.txt'.
The npz=65 change alone is sufficient to get reasonable P8 results (a sketch of the override is below).
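As a concrete illustration, here is a minimal sketch of that change in a YAML namelist-override file in the style of parm/FV3.input.yml (the grouping of keys by namelist section is an assumption about the file layout; only the npz change is shown):

```yaml
fv_core_nml:
  npz: 65   # 65 vertical levels instead of the default 64; with the default eta file this gives ptop ~0.02 hPa
```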
@mzhangw Thank you for your suggestion.
I conducted an experiment where the input YAML is set up as follows:
npz = 65
npz_type = 'input'
fv_eta_file = 'global_hyblev_fcst_rrfsL65.txt'
The run on Derecho failed with an error message like: FATAL from PE 8: FV3 top higher than NCEP/GFS
The config.yaml is modified from ufs/ufs-srweather-app/tests/WE2E/test_configs/grids_extrn_mdls_suites_community/config.grid_RRFS_CONUS_25km_ics_FV3GFS_lbcs_FV3GFS_suite_GFS_v17_p8_plot.yaml
The modifications include 1) not writing out restart files, 2) changing the initial date to 2019-06-15, and 3) adding the following line: FV3_NML_YAML_CONFIG_FN: /glade/work/clu/ufs/ufs-srweather-app/parm/FV3.npz.input.yml (where the npz-related changes are specified)
What other changes are needed to enable the L65 run? Thanks. --Sarah
Try external_eta = .false., so that ak and bk are read from the file. See my run on Hera: /scratch1/BMC/gmtb/Man.Zhang/_SRW2_2_P8/expt_dirs/test_GFS_v17_p8_3/2019061518
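Collecting the L65-related settings from the suggestions above, a hedged sketch of the full override (same illustrative YAML layout as before; whether these keys belong exactly here in the SRW YAML is an assumption):

```yaml
fv_core_nml:
  npz: 65                                        # 65 vertical levels
  npz_type: "input"                              # take the vertical coordinate definition from a file
  external_eta: false                            # maps to external_eta = .false., so ak/bk are read from fv_eta_file
  fv_eta_file: "global_hyblev_fcst_rrfsL65.txt"  # RRFS L65 coordinate with ptop ~2 hPa
```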
@SarahLu-NOAA Just wanted to check in to see how your P8 run with SRW v2.2 is going.
@mzhangw Derecho is down for 3 days. Will check once it's back up. Thanks.
@mzhangw I modified parm/FV3.input.yml to have the following:
npz = 65
external_eta = .false.
npz_type = 'input'
fv_eta_file = 'global_hyblev_fcst_rrfsL65.txt'
The run failed with the error message: FATAL from PE 1: check_nml_error in fms_mod: Unknown namelist, or mistyped namelist variable in namelist fv_core_nml, (IOSTAT = 64)
I compared your input.nml with mine and found some differences, such as agrid_vel_rst, d2_bg_k2, and delt_max. Since we ran the SRW App on different platforms (Hera vs. Derecho), different job-card-related parameters are expected, but it seems there are also some differences in how the experiments are set up. My config.yaml is modified from the sample config config.grid_RRFS_CONUS_25km_ics_FV3GFS_lbcs_FV3GFS_suite_GFS_v17_p8_plot.yaml in /ufs-srweather-app/tests/WE2E/test_configs/grids_extrn_mdls_suites_community/. The modifications include 1) skipping the restart files and 2) changing the date from 2019-07-01 to 2019-06-15. I'm just wondering whether your experiment is based on the same sample config. Thanks.
@SarahLu-NOAA I did not use WE2E to run my experiment. I will try it next week.
@mzhangw Thanks. If you review the earlier posts, the run with the sample v17_p8 config leads to unrealistically cold temperatures within hours. @ulmononian was able to reproduce the temperature discrepancy.
@SarahLu-NOAA My GFS_v17_p8 run using WE2E on Derecho is at /glade/work/manzhang/ufs/expt_dirs/grid_RRFS_CONUS_25km_ics_FV3GFS_lbcs_FV3GFS_suite_GFS_v17_p8_plot. You can find the config.yaml there, which points to /glade/work/manzhang/ufs/ufs-srweather-app/parm/FV3.p8.input.yml for P8-compatible namelist options. The f006 T2m looks reasonable.
Note that I use the default npz=64 instead of RRFS L65. It seems that the difference between FV3.p8.input.yml and FV3.input.yml is the key to reasonable P8 runs.
@mzhangw Thanks, but I can't access your directory/YAML files. Could you please grant me access to them?
@SarahLu-NOAA PR #1055 has just been merged into the develop branch. This PR applies the recommended changes from @mzhangw to the parm/FV3.input.yml file for FV3_GFS_v17_p8. As you can see from the plots in the PR, the cold bias that you reported in this issue has been removed. Please try applying the modifications that @EdwardSnyder-NOAA made in PR #1055 and see if this corrects what you are encountering in the SRW v2.2.0 release. If it does, I will close this issue.
Thank you very much.
@MichaelLueken I adopted the parm/FV3.input.yml modified by @mzhangw, and the unrealistic cooling in v17_p8 is fixed. Many thanks for the PR by @EdwardSnyder-NOAA. You may close the ticket now.
Completed with PR #1055.
Expected behavior
I conducted 4 SRW v2.2 experiments on Derecho. The config is: a 12-km grid over the NE US, LBCs/ICs from GFS, DT_ATMOS set to 60, and FCST_LEN_HRS set to 84.
v16 uses the out-of-the-box GFS_v16 suite
v17 uses the out-of-the-box GFS_v17_p8 suite
v16x uses GFS_v16 with the MERRA2 aerosol climatology in RRTMG (iaer changed from 5111 to 1011)
v17x uses GFS_v17_p8 with dt_inner changed from 150 (the default) to 30 (half of DT_ATMOS); see the override sketch below
The 4 runs are expected to produce similar results for 24-hour simulations.
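As a sketch of the namelist tweaks behind v16x and v17x (shown together for illustration only, since they belong to separate experiments; the placement of these keys under gfs_physics_nml follows the diff listings earlier in this thread, and the YAML override layout is an assumption):

```yaml
gfs_physics_nml:
  iaer: 1011      # v16x: MERRA2 aerosol climatology in RRTMG (the v16 default listed above is 5111)
  dt_inner: 30    # v17x: reduced from the suite default of 150 to 30 (half of DT_ATMOS=60)
```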
Current behavior
The two v17_p8 results are way too cold.
Steps To Reproduce
config.yaml for v17x can be found at /glade/work/clu/ufs/expt_dirs/neus_12km_gfsv17_gfs4lbc_cldmp
Additional Information
While SRW v2.2 with GFS_v17_p8 does not abort, the results are unrealistic. I understand that the public release only supports GFS_v16. I'm just wondering whether the UFS/SRW experts can help with this issue. Thanks.