ECCC-CCCS / CMIP6-CanDCS-Quality-Control

Code and script repository for QAQC work on downscaled CMIP6

Bias correcting precip extremes #34

Open laura-vanvliet opened 2 years ago

laura-vanvliet commented 2 years ago

Something Emilia stumbled across for precipitation extremes: bias correction is showing problems outside of the training period. For many locations, it seems to adjust over the reference period only, or values in the reference period are being clipped to a set value. For example, plots for rx1day (annual maximum total 1-day precipitation) are shown below, with values in the training period never exceeding 97.65 mm/day. From what we can tell, this is also present in CMIP5 BCCAQ, but clipping at the 10th-90th percentiles removes the outliers from the plots on climatedata.ca, although they are still likely influencing the overall trend. @cpomer10 @ssobie @JeremyFyke

aulemahal commented 2 years ago

Hi!

I have not looked at BCCAQ data, but I did find a similar problem in some work I did with CMIP6 data. In my case I am adjusting with a detrended quantile mapping algorithm (as xclim implements it), with ERA5-Land as the reference. Looking at the same variable (RX1day) and the same location, I saw the same kind of problem:

image

Well, in my case it is even worse, I agree. And I found the reason for it. In xclim's version of the quantile mapping algorithms, the default quantile range is made of equally spaced nodes (bin centers) PLUS two endpoints at 1e-6 and 1 - 1e-6. With the grouping I use in adjustment, those two endpoints are essentially the minimum and the maximum, and this overfits by clipping the adjusted scenario to the range of the reference within the historical period. It only takes a few future extremes larger than the maximum simulated value in the historical period to give this impression of a clash between the two periods in the scenario.

Removing the endpoints from the quantile range resolved my problem and the timeseries are now much more pleasant visually.
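To make the mechanism concrete, here is a minimal pure-Python sketch (not xclim itself — the data, node choices, and extrapolation rule are all illustrative) of how including quantile nodes at ~0 and ~1 makes the top correction a fit to the single largest historical value, which then drags down every future extreme:

```python
import bisect

def emp_quantile(sorted_xs, q):
    """Linearly interpolated empirical quantile of a pre-sorted list."""
    pos = q * (len(sorted_xs) - 1)
    lo = int(pos)
    hi = min(lo + 1, len(sorted_xs) - 1)
    return sorted_xs[lo] + (pos - lo) * (sorted_xs[hi] - sorted_xs[lo])

def make_qm(ref, hist, nodes):
    """Quantile mapping trained at the given quantile nodes, with constant
    (additive) extrapolation of the correction beyond the outermost nodes."""
    ref_s, hist_s = sorted(ref), sorted(hist)
    hq = [emp_quantile(hist_s, q) for q in nodes]
    rq = [emp_quantile(ref_s, q) for q in nodes]

    def qm(x):
        if x <= hq[0]:
            return x + (rq[0] - hq[0])
        if x >= hq[-1]:
            return x + (rq[-1] - hq[-1])
        i = bisect.bisect_left(hq, x)
        w = (x - hq[i - 1]) / (hq[i] - hq[i - 1])
        return rq[i - 1] + w * (rq[i] - rq[i - 1])

    return qm

ref = list(range(1, 101))           # pseudo-observations, maximum 100
hist = list(range(1, 100)) + [200]  # model history with one extreme outlier

centers = [(i + 0.5) / 20 for i in range(20)]  # equally spaced bin centers
with_ends = [1e-6] + centers + [1 - 1e-6]      # plus near-0/1 endpoints

qm_ends = make_qm(ref, hist, with_ends)
qm_noends = make_qm(ref, hist, centers)

# A future extreme of 250 mm/day:
print(qm_ends(250))    # pulled down ~100 mm by the single-outlier correction
print(qm_noends(250))  # left essentially unchanged
```

With the endpoints, the correction applied to everything above the historical maximum is (ref max − hist max), a fit to one data point; with bin centers only, extremes get the (much more robust) correction at the outermost interior quantile.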

I have no idea whether my reasoning is applicable to BCCAQ, but I thought it might help you in your quest.

laura-vanvliet commented 2 years ago

Thanks @aulemahal! This is helpful

laura-vanvliet commented 2 years ago

We've had some insights from the FWI project about the overfitting issue. Even using endpoints of 0.01 and 0.99, we still saw some overfitting in the training period (see the first example figure below). At the suggestion of Alex Cannon, we tried using different ensemble members for calibration versus adjustment, which fixed the issue. Using the same ensemble member can artificially suppress internal variability (see Cannon et al. 2021 and references therein).

Training and adjustment applied to same ensemble member: image

Training and adjustment using separate ensemble members: image @aulemahal @ssobie @JeremyFyke @cpomer10
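A minimal pure-Python sketch (not the BCCAQ or xclim code; synthetic data and names are illustrative) of why training and adjusting on the same member suppresses internal variability: empirical quantile mapping applied back to its own calibration series reproduces the reference distribution exactly, while an independent member keeps its own sampling variability.

```python
import bisect
import random
import statistics

def make_empirical_qm(ref, hist):
    """Empirical quantile mapping: map hist's quantiles onto ref's,
    with constant additive extrapolation outside hist's range."""
    hq, rq = sorted(hist), sorted(ref)

    def qm(x):
        i = bisect.bisect_left(hq, x)
        if i == 0:
            return x + (rq[0] - hq[0])
        if i >= len(hq):
            return x + (rq[-1] - hq[-1])
        w = (x - hq[i - 1]) / (hq[i] - hq[i - 1])
        return rq[i - 1] + w * (rq[i] - rq[i - 1])

    return qm

rng = random.Random(0)
ref = [rng.gauss(10, 3) for _ in range(500)]  # pseudo-observations
r1 = [rng.gauss(12, 2) for _ in range(500)]   # member used for calibration
r2 = [rng.gauss(12, 2) for _ in range(500)]   # independent member

qm = make_empirical_qm(ref, r1)
adj_r1 = [qm(x) for x in r1]
adj_r2 = [qm(x) for x in r2]

# Adjusting the calibration member itself collapses its distribution onto
# the reference exactly (internal variability suppressed) ...
print(round(statistics.stdev(adj_r1), 3), round(statistics.stdev(ref), 3))
# ... while an independent member retains sampling variability of its own.
print(round(statistics.stdev(adj_r2), 3))
```

In the training period, every value of the calibration member sits exactly on a quantile node of its own empirical CDF, so the adjusted series is just the reference values reordered — which is the variability-suppression artifact visible in the first figure above.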

ssobie commented 2 years ago

Thanks Laura, this is interesting! Unfortunately, we only have one realization for 25 of the 26 downscaled GCMs at present. It is something we could potentially consider for the 10 realizations of CanESM5.

JeremyFyke commented 2 years ago

@tlogan2000 @juliettelavoie @aulemahal @laura-vanvliet - I wonder: do we need to avoid any/all BCCAQv2 CMIP6 delta calculations for indices that exhibit this QDM-based suppression of variability over the historical period? Reason: the delta will contain a potentially significant artifact related to the suppression. Imagine, for example, a future-minus-historical delta of RX1day based on @aulemahal's example above. The vast majority of that delta would actually be due to the suppression issue, rather than the true climate delta. Thoughts welcome.

juliettelavoie commented 2 years ago

Good question @JeremyFyke! I think this suppression is more obvious for pr, but it probably also exists for temperature... How do we choose which indices to remove? It would be worth discussing at the next data working group.