Closed: jetesdal closed this issue 6 months ago
@Timh37 handled this -- Tim, could you please address?
Also please check to see if underlying files on ESGF have been updated since 9/2020. It's quite possible that the difference you're seeing reflects file changes.
It appears UKESM was not used in the ocean dynamic sea level calculation, because the version of the data available as of our 9/2020 cutoff window was giving weird results on the IBE correction (SSP585, 2081-2100 minus 1986-2005). Here is the IBE correction from the available models:
It is also possible this has not been corrected, but you are not seeing it because you are not applying the IBE correction. Here are the corresponding dedrifted zos fields, prior to IBE correction:
Thank you for the clarification and the insights @bobkopp. I wasn't aware that the IBE correction was already included in the zos dataset. Yes, it looks like the IBE (i.e. zos+psl) explains the issue with the UKESM1-0-LL.
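For reference, the IBE correction subtracts the ocean-area-mean sea-level pressure and converts the local anomaly to an equivalent water-height change (Piecuch & Ponte, 2015). A minimal numpy sketch of the formula, using made-up illustrative numbers rather than real CMIP6 values:

```python
import numpy as np

# Illustrative sea-level pressure values (Pa); real ones come from CMIP6 psl
psl = np.array([101000.0, 101250.0, 101500.0])
weights = np.array([0.5, 0.3, 0.2])  # hypothetical normalized area weights

rho = 1025.0   # sea-water density (kg m-3)
g = 9.80665    # standard gravity (m s-2)

# Inverse-barometer response: high local pressure depresses the sea surface
psl_anom = psl - np.sum(weights * psl)
ibe = -psl_anom / (rho * g)  # metres of equivalent sea level

print(ibe)
```

By construction the area-weighted mean of the correction is zero, so it redistributes sea level rather than changing the global mean.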
I think the very first step in addressing #129 and #171 (and, I presume, also #252?) would be to demonstrate reproducibility of the current module-data version. The current set of maps I gathered are from the FACTS module-data, and I added a set of GFDL models.
I figured out a way to do the dedrifting and regridding for the additional model output, but I would need to also do the IBE corrections. It would be great to have a detailed account on the preprocessing steps applied to the CMIP6 output for the existing module-data version. @Timh37, is that something you could share?
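In case it helps in the meantime, here is a minimal, hypothetical sketch of the de-drifting step as it is commonly done for CMIP6 zos: fit a low-order polynomial to the piControl run at each grid point and subtract it from the scenario run over the corresponding control years. The actual AR6 preprocessing may differ in detail (see the Chapter 9 SM); the function and variable names here are mine:

```python
import numpy as np

def dedrift(zos_scen, zos_picontrol, branch_index, order=1):
    """Remove model drift estimated from the piControl run.

    zos_scen:      (time,) scenario zos series at one grid point
    zos_picontrol: (time,) piControl zos series at the same point
    branch_index:  index in piControl where the scenario branches off
    order:         polynomial order of the drift fit (1 = linear)
    """
    t_ctrl = np.arange(len(zos_picontrol))
    coeffs = np.polyfit(t_ctrl, zos_picontrol, order)
    # Evaluate the fitted drift over the scenario's span of the control run
    t_scen = branch_index + np.arange(len(zos_scen))
    drift = np.polyval(coeffs, t_scen)
    # Subtract the drift (any constant offset is removed later by referencing)
    return zos_scen - drift

# Synthetic example: control run with a spurious 0.002 m/step drift
t = np.arange(500)
ctrl = 0.002 * t + 0.01 * np.sin(t / 20.0)
# Scenario branching at step 100: same drift plus a real 0.005 m/step signal
scen = 0.002 * (100 + np.arange(100)) + 0.005 * np.arange(100)
clean = dedrift(scen, ctrl, branch_index=100)
```

After de-drifting, the remaining trend in `clean` should be close to the real 0.005 m/step signal.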
The preprocessing steps are detailed in the SM of Chapter 9 (including the exclusion of UKESM1-0-LL). I haven't uploaded the preprocessing scripts in a central place yet but the same or at least very similar scripts as used for AR6 are available here and here.
If I'm not mistaken the code for the IBE correction for AR6 was originally written by Greg, see below.
Probably these scripts could be improved and it would be nice to integrate all of these into FACTS at some point, but I don't really have the time for this at the moment.
import numpy as np
import os
import iris
import sys
import argparse
import fnmatch
import re
'''
Apply an inverse-barometer-effect correction to ZOS data.
See the following paper for details:
Piecuch, C. G., and Ponte, R. M. (2015),
Inverted barometer contributions to recent sea level changes along the northeast coast
of North America, Geophys. Res. Lett., 42, 5918–5925, doi:10.1002/2015GL064580.

Note: In order to run this code, activate the "esmpy" conda environment, which
provides the "iris" module:
    conda activate esmpy
'''
def guess_bounds(cube, coords):
    """Guess bounds for the given coordinates of a cube, if not already set."""
    # Check for bounds just in case
    for coord in coords:
        if not cube.coord(coord).has_bounds():
            cube.coord(coord).guess_bounds()
    return cube

def calc_area_weights(cube):
    # Calculate area weights from the provided grid
    cube = guess_bounds(cube, ['longitude', 'latitude'])
    areaweights = iris.analysis.cartography.area_weights(cube)
    # Apply the PSL mask to the area weights
    areaweights = np.ma.masked_where(cube.data.mask, areaweights)
    # Normalize the area weights so they sum to one per time slice
    norm_area = areaweights / np.sum(areaweights, axis=(1, 2))[:, None, None]
    # Return the weights for the first time slice only (the grid is static)
    return norm_area[0, ...][None, ...]

def gravity(latitudes):
    # WGS (World Geodetic System) 84 Ellipsoidal Gravity Formula
    # https://en.wikipedia.org/wiki/Gravity_of_Earth#Latitude_model
    # Define constants
    a = 6378137.0      # Equatorial semi-axis (m)
    b = 6356752.3      # Polar semi-axis (m)
    ge = 9.780327      # Equatorial gravitational acceleration (m s-2)
    gp = 9.8321849378  # Polar gravitational acceleration (m s-2)
    # Derive other constants
    e_squared = 1.0 - (b/a)**2   # Spheroid's eccentricity squared
    k = (b*gp - a*ge) / (a*ge)   # Formula constant
    # Convert latitudes from degrees to radians
    lats = latitudes * np.pi/180.0
    # Calculate the latitude-dependent gravity
    g_num = 1 + k*np.sin(lats)**2
    g_denom = np.sqrt(1 - e_squared * np.sin(lats)**2)
    g_lat = ge * (g_num/g_denom)
    # Return gravity
    return g_lat
def main(zosdir, psldir, outdir):
    # Flag for calculating area weights
    calc_weights_flag = True

    # Loop over the models available in the ZOS directory
    #for model in os.listdir(zosdir):
    #for model in (model for model in os.listdir(zosdir) if model not in ['.DS_Store']):
    for model in (model for model in os.listdir(zosdir) if model == 'UKESM1-0-LL'):

        # Skip if this isn't really a model
        if re.search(r"^\.", model):
            continue

        # Define this model's ZOS and PSL directories
        zos_model_dir = os.path.join(zosdir, model)
        psl_model_dir = os.path.join(psldir, model)

        # DEBUG: Which model are we on?
        print(model)

        # Reset the area weight flag if you want to calculate new area weights for each model
        # Note: Comment this out if you want to assume the same area weights across all models
        calc_weights_flag = True

        # Loop over the files available for this model
        for zos_file in sorted(os.listdir(zos_model_dir)):

            # Extract the pieces of the file name
            pieces = zos_file.split("_")
            this_model = pieces[2]
            this_exp = pieces[3]
            this_variant = pieces[4]
            this_grid = pieces[5]
            this_time = pieces[6]

            # Is there a matching model available in the PSL data?
            try:
                psl_files = os.listdir(psl_model_dir)
            except OSError:
                print("Cannot open PSL model directory: {}. Moving on...".format(psl_model_dir))
                continue

            # Is there a matching PSL file for this model?
            match_psl_file = fnmatch.filter(psl_files, "psl_Amon_{0}_{1}_{2}_*_{3}*.nc".format(this_model, this_exp, this_variant, this_time))
            if not match_psl_file:
                print("Could not find matching PSL file for {}. Moving on...".format(zos_file))
                continue

            # A match exists. Load the cubes.
            cube_zos = iris.load_cube(os.path.join(zos_model_dir, zos_file))
            cube_psl = iris.load_cube(os.path.join(psl_model_dir, match_psl_file[0]))

            # Find the ZOS mask and apply it to the PSL data
            masked_psl = np.ma.masked_where(np.isnan(cube_zos.data), cube_psl.data)
            cube_psl = cube_psl.copy(data=np.float32(masked_psl))

            # Calculate the normalized area weights if needed
            if calc_weights_flag:
                norm_area = calc_area_weights(cube_psl)
                calc_weights_flag = False

            # Remove the area-weighted mean from the PSL data
            weighted_cube = cube_psl * norm_area
            psl_mean_cube = weighted_cube.collapsed(["latitude", "longitude"], iris.analysis.SUM)
            psl_anom_cube = cube_psl - psl_mean_cube

            # Calculate the IBE correction
            rho = 1025.0   # Sea-water density (kg m-3)
            g = 9.80665    # Standard gravity (m s-2)
            #g = gravity(cube_psl.coord("latitude").points)[None,:,None]  # gravity as a function of latitude
            ibe = -1 * psl_anom_cube.data / (rho * g)

            # Apply the correction to the ZOS field
            cube_zos_ibe = cube_zos + ibe

            # Make a copy of the original ZOS cube, update it with the corrected data, and write it to netCDF
            if not os.path.exists(os.path.join(outdir, model)):
                os.mkdir(os.path.join(outdir, model))
            outfile = os.path.join(outdir, model, re.sub(r"\.nc", "_ibe.nc", zos_file))
            iris.save(cube_zos.copy(data=np.float32(cube_zos_ibe.data)), outfile, zlib=True, complevel=4, fill_value=np.nan)

    # Done
    return None
if __name__ == "__main__":

    # Initialize the argument parser
    parser = argparse.ArgumentParser(description="Apply the Inverse Barometer Effect Correction to ZOS data")

    # Add arguments
    parser.add_argument('--zosdir', help="Directory containing the ZOS model-separated data", required=True)
    parser.add_argument('--psldir', help="Directory containing the PSL model-separated data", required=True)
    parser.add_argument('--outdir', help="Output directory", required=True)

    # Parse the arguments
    args = parser.parse_args()

    # Run the code
    main(args.zosdir, args.psldir, args.outdir)
@jetesdal I've put the code from @Timh37 here: https://github.com/radical-collaboration/facts/tree/development/modules/tlm/sterodynamics/2lmfit
I think we've solved the original question, given the known issue with the UKESM IBE correction, but let me know if there are other bits and pieces you need that aren't in the repo now.
Yes, I've got all the info I was looking for. Thank you very much for your help @bobkopp and @Timh37 for the clarifications and pointing me to all the sources.
I went ahead and applied the IBE correction for a subset of models, and it looks like UKESM1-0-LL now gives a realistic IBE with the latest psl dataset downloaded from the CMIP6 archive.
I came across some zos fields in tlm_sterodynamics_preprocess_data.tgz that do not match the versions I downloaded from the CMIP6 archive. I am not sure whether the zos field was updated in the archive or whether the preprocessing steps (e.g., de-drifting or re-gridding to 1x1) created the mismatch. Below is an example showing the linear trend of zos for the period 1993 to 2020, using the historical and ssp585 zos files found in tlm_sterodynamics_preprocess_data.tgz. You can see that the trend for UKESM1-0-LL looks unrealistic. In this case, the underlying reason is that the values in the historical and ssp585 datasets do not align well, as seen from the time series of the global mean of zos:
These artifacts are not present in the recently downloaded CMIP6 zos output, which suggests they might have been introduced during the preprocessing stage. Are there any scripts and/or documentation available for the CMIP6 processing involved in creating the input data in tlm_sterodynamics_cmip6_data.tgz? I think having the scripts and tools would also help address issue #129 and would allow a user to update the input data with more recent CMIP6 output or to include post-CMIP6 model output.