xarray-contrib / xskillscore

Metrics for verifying forecasts
https://xskillscore.readthedocs.io/en/stable/
Apache License 2.0

Feature request: Skill Scores #49

Open ahuang11 opened 4 years ago

ahuang11 commented 4 years ago

https://confluence.ecmwf.int/display/FUG/12.B+Statistical+Concepts+-+Probabilistic+Data I think it's just: [image: BSS = 1 - BS_forecast / BS_reference]

def brier_skill_score(obs, fct, ref, threshold):
    bscore_fct = xs.brier_score(obs > threshold, (fct > threshold).mean('member'))
    bscore_ref = xs.brier_score(obs > threshold, (ref > threshold).mean('member'))
    bskill = 1 - (bscore_fct / bscore_ref)
    return bskill

where ref can be persistence or climatology (where each year of the climatology is a member)
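For illustration, the same computation can be sketched with plain NumPy instead of xskillscore, using a climatological base rate as the reference forecast (all data here is made up):

```python
import numpy as np

def brier_score(obs_binary, prob_forecast):
    # Mean squared difference between forecast probability and binary outcome
    return np.mean((prob_forecast - obs_binary) ** 2)

# Made-up data: binary event observations and forecast probabilities
obs = np.array([1, 0, 1, 1, 0])
fct_prob = np.array([0.8, 0.2, 0.6, 0.9, 0.1])
# Climatological reference: always forecast the observed base rate
ref_prob = np.full(obs.shape, obs.mean())

bs_fct = brier_score(obs, fct_prob)  # 0.052
bs_ref = brier_score(obs, ref_prob)  # 0.24
bss = 1 - bs_fct / bs_ref            # positive => skill over climatology
```

A BSS of 1 means a perfect forecast, 0 means no improvement over the reference, and negative values mean the forecast is worse than the reference.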

raybellwaves commented 4 years ago

You can see how @matteodefelice did it in his notebook (https://colab.research.google.com/drive/1wWHz_SMCHNuos5fxWRUJTcB6wqkTJQCR)

brier_score = xs.brier_score(obs > obs.quantile(2/3, dim='time'),
                             (fct > fct.quantile(2/3, dim='time')).mean('member'))
baseline = xs.brier_score(obs > obs.quantile(2/3, dim='time'), 2/3)
bss = 1 - (brier_score / baseline)

ahuang11 commented 4 years ago

Yeah, I think both ways are valid, just with a different baseline; maybe his approach can be used as the default if ref=None

def brier_skill_score(obs, fct, threshold, ref=None):
    bscore_fct = xs.brier_score(obs > threshold, (fct > threshold).mean('member'))
    if ref is None:
        bscore_ref = xs.brier_score(obs > obs.quantile(2/3, dim='time'), 2/3)
    else:
        bscore_ref = xs.brier_score(obs > threshold, (ref > threshold).mean('member'))
    bskill = 1 - (bscore_fct / bscore_ref)
    return bskill

aaronspring commented 4 years ago

I think we should somehow come up with a pattern/wrapper that translates every skill into a skill score.

The formula would be from https://www.cawcr.gov.au/projects/verification/

[image: generic skill score formula, skill_score = (score - score_reference) / (score_perfect - score_reference)]

Any ideas how to do this with a few lines of code?
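One possible shape for such a wrapper, sketched with plain NumPy rather than the xskillscore API (the function and argument names here are hypothetical): the generic formula only needs the metric callable, a reference forecast, and the metric's perfect score.

```python
import numpy as np

def skill_score(metric, obs, fct, ref, perfect=0.0):
    # Generic skill score:
    # SS = (score - score_ref) / (score_perfect - score_ref)
    s_fct = metric(obs, fct)
    s_ref = metric(obs, ref)
    return (s_fct - s_ref) / (perfect - s_ref)

def mse(obs, fct):
    return np.mean((fct - obs) ** 2)

obs = np.array([0.0, 1.0, 2.0, 3.0])
fct = np.array([0.1, 1.1, 1.9, 3.2])
ref = np.full(obs.shape, obs.mean())  # climatological-mean reference

ss = skill_score(mse, obs, fct, ref, perfect=0.0)
```

For metrics whose perfect score is 0 (MSE, Brier score, RPS), this reduces to the familiar 1 - score/score_ref.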

ahuang11 commented 4 years ago

Is it applicable across all scores, though? I am only familiar with the Brier skill score (first time I've heard of an RMSE skill score)

aaronspring commented 4 years ago

I think so. In climpred we have a metrics class with attributes min, max, and perfect. Maybe it would be nice to move this to xskillscore
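A minimal sketch of what such a metadata class could look like (the class and attribute names here are assumptions for illustration, not climpred's actual definitions):

```python
from dataclasses import dataclass

@dataclass
class Metric:
    name: str
    minimum: float  # lower bound of the metric's range
    maximum: float  # upper bound of the metric's range
    perfect: float  # value a perfect forecast attains

# Illustrative entries
mse_meta = Metric("mse", minimum=0.0, maximum=float("inf"), perfect=0.0)
pearson_meta = Metric("pearson_r", minimum=-1.0, maximum=1.0, perfect=1.0)
```

A generic skill-score wrapper could then look up `perfect` from this metadata instead of requiring the caller to pass it.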

bradyrx commented 4 years ago

I think this sort of thing should be at climpred. a and b in xskillscore are two time series, for instance. Say a is the forecast and b the verification product. To get BSS, you'd evaluate a against b, and then a persistence derivation of b against b. Since we manage reference forecasts at climpred, I'd think over there we would just call the xskillscore Brier score twice.

bradyrx commented 4 years ago

I guess with the addition of the sign test, this could still fit here. If you hand xskillscore two different forecasts (one being the dynamical and the other the reference) you can compute skill scores. climpred would then wrap that to do nice looping through all the leads.

aaronspring commented 4 years ago

this could be a template: https://github.com/bradyrx/climpred/blob/8a3fc954df2043f998987a1964059a9dc0d2e11c/climpred/metrics.py#L128-L140 where we would need the perfect attribute.

But maybe we shouldn't overdo things. xs is perfect for doing easy functions xs.metric(a, b, dim). Skill scores can actually be calculated from this manually quite easily. I don't see a large computational benefit of implementing this compared to doing it sequentially: manual_scoring_function(xs.metric(a, b, dim), reference(*args, **kwargs)).

We at least shouldn't change the existing API when implementing this.

raybellwaves commented 3 years ago

See https://github.com/xarray-contrib/xskillscore/issues/284. Leaving open as good discussion here.

matteodefelice commented 3 years ago

By the way, it seems that the line:

baseline = xs.brier_score(obs_final >  obs_final.quantile(2/3, dim = 'time'), 2/3)

doesn't work any more, I get this error on my Colab:

AttributeError                            Traceback (most recent call last)

<ipython-input-68-ad9dcdfd6ce1> in <module>()
----> 1 baseline = xs.brier_score(obs_final >  obs_final.quantile(2/3, dim = 'time'), 2/3)

/usr/local/lib/python3.8/site-packages/xskillscore/core/probabilistic.py in brier_score(observations, forecasts, member_dim, fair, dim, weights, keep_attrs)
    348                 res = (e / M - o) ** 2 - e * (M - e) / (M ** 2 * (M - 1))
    349         else:
--> 350             if member_dim in forecasts.dims:
    351                 forecasts = forecasts.mean(member_dim)
    352             res = xr.apply_ufunc(

AttributeError: 'float' object has no attribute 'dims'

EDIT: yes, now it requires an xarray object and it doesn't work any more with numbers.

aaronspring commented 3 years ago

Thanks for reporting, @matteodefelice. Can you please open an issue for this? Conversion from a number to an xr.DataArray can easily be added then

matteodefelice commented 3 years ago

Does anyone have an example of how to calculate the skill score for the RPS or CRPS? I still haven't found a way to do it...

aaronspring commented 3 years ago

We have crpss in climpred.

aaronspring commented 3 years ago

For crpss look into the proper scoring example

raybellwaves commented 3 years ago

@matteodefelice I believe rpss is (kind of) in climpred as well https://github.com/pangeo-data/climpred/blob/main/climpred/tests/test_probabilistic.py#L252. crpss is here: https://github.com/pangeo-data/climpred/blob/main/climpred/metrics.py#L2176 Porting here would be appreciated.

aaronspring commented 3 years ago

I do RPSS from RPS here: https://renkulab.io/gitlab/aaron.spring/s2s-ai-competition-bootstrap/-/blob/master/notebooks/verification_RPSS.ipynb

matteodefelice commented 3 years ago

Thanks @aaronspring, but the real difficulty for me is calculating the RPS of the climatology. I will find a way, though. Thanks to everyone for sharing.

aaronspring commented 3 years ago

generalized for perfect-models and hindcasts.

https://github.com/pangeo-data/climpred/blob/14f7458c1f2e944990d7008001adab49480fe07d/climpred/reference.py#L106

I create a fake 1-member forecast from groupby(dayofyear).

matteodefelice commented 3 years ago

that's a good idea. I did it, and the results seem reasonable. Given that I am working with seasonal averages (one point per year), this is what I have done:

cat_edges = obs.quantile(q = [1/3, 2/3], dim = 'year').rename({'quantile':'category_edge'})
rps_clim = xskillscore.rps(obs, obs.mean(dim = 'year').expand_dims({'member':1}), cat_edges, dim = 'year', member_dim='member')

The RPSS is comparable with the BSS, so I think this should work.
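For reference, the RPS of a tercile climatology can also be worked out by hand; here is a plain-NumPy sketch for a single made-up observation, independent of the xskillscore call above:

```python
import numpy as np

def rps(obs_cdf, fct_cdf):
    # Ranked probability score: squared differences of cumulative
    # category probabilities, summed over categories
    return np.sum((fct_cdf - obs_cdf) ** 2)

# Tercile climatology: probability 1/3 per category, cumulative [1/3, 2/3, 1]
clim_cdf = np.array([1/3, 2/3, 1.0])
# Observation fell in the middle category: cumulative indicator [0, 1, 1]
obs_cdf = np.array([0.0, 1.0, 1.0])

rps_clim = rps(obs_cdf, clim_cdf)  # (1/3)**2 + (1/3)**2 + 0 = 2/9
```

Averaging this over all verification times gives the climatological RPS used as the denominator in the RPSS.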

aschl commented 11 months ago

Just came across this package. Nice work. What is the relationship between climpred and xskillscore? Some metrics are only available in climpred and are missing in xskillscore. A function to calculate the skill score would actually be really nice (even if it was just the basic function that @aaronspring mentioned here).

aaronspring commented 11 months ago

Relationship: climpred imports (nearly?) all xskillscore metrics by wrapping xskillscore. climpred might have some metrics also defined only in climpred, such as skillscores for some metrics, i.e. normalised mse as mse/std or msess = 1 - nmse
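As a sanity check on that MSESS definition, here is a plain-NumPy sketch with made-up data. Note the comment above writes mse/std, but against a climatological-mean reference the reference MSE works out to the variance of the observations, so msess = 1 - nmse becomes:

```python
import numpy as np

def msess(obs, fct):
    # MSE skill score vs. a climatological-mean reference:
    # the reference MSE equals var(obs), so MSESS = 1 - mse / var(obs)
    mse = np.mean((fct - obs) ** 2)
    return 1 - mse / np.var(obs)

obs = np.array([0.0, 1.0, 2.0, 3.0])

perfect = msess(obs, obs)                             # 1.0: perfect forecast
no_skill = msess(obs, np.full(obs.shape, obs.mean())) # 0.0: no skill over climatology
```

The two endpoint cases (1 for a perfect forecast, 0 for the climatological mean itself) make it easy to verify any ported implementation against climpred.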

aschl commented 11 months ago

Any objections in including these missing metrics in xskillscore? Seems to me the better place than having two packages with mixed functionalities. All benchmarking related scoring functions should be available in xskillscore IMO (and not climpred). Re skill score (the topic of this issue): I was a bit surprised that the package (which is called xskillscore) does not include a function to calculate skillscores. It's a standard benchmarking metric that seems to me very suitable for this package?!

raybellwaves commented 11 months ago

> Any objections in including these missing metrics in xskillscore? Seems to me the better place than having two packages with mixed functionalities. All benchmarking related scoring functions should be available in xskillscore IMO (and not climpred). Re skill score (the topic of this issue): I was a bit surprised that the package (which is called xskillscore) does not include a function to calculate skillscores. It's a standard benchmarking metric that seems to me very suitable for this package?!

No objections to additional metrics here that are missing and in climpred. Just make sure they give the same result

aaronspring commented 11 months ago

I think we wanted to keep the API consistent for all metrics: metric(forecast, observation). For a skill score you'd need to pass a reference skill somehow. I'm for that and suggest having this API discussion first.