iluise / atmorep_analysis

MIT License

This project collects examples of analysis scripts for each application supported by AtmoRep. Please refer to the model page for instructions on how to run the model.

File format

The output of the evaluation step is a set of .zarr files.

AtmoRep data interface

The read_atmorep_data.py module contains all the functionality needed to read AtmoRep data. The output of read_data is a set of xarray DataArrays.

from utils.read_atmorep_data import HandleAtmoRepData

ar_data    = HandleAtmoRepData(model_id, input_dir)
da_target  = ar_data.read_data(field, "target", ml = levels)
da_pred    = ar_data.read_data(field, "pred"  , ml = levels)

The output is interfaced with the metrics.py package, which provides a set of common metrics for evaluating model performance.

from utils.metrics import Scores, calc_scores_item
metrics   = ["rmse", "acc", "l1", "bias"]
atmorep_scores = calc_scores_item(da_pred, da_target, metrics, avg = ["lat", "lon"])

In this case the scores are averaged over latitude and longitude. Use avg = None to skip averaging, or avg = all to average over all dimensions.
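To illustrate what the averaging option does (a sketch with synthetic data, not the actual Scores implementation), an RMSE averaged over lat/lon can be computed directly with xarray:

```python
import numpy as np
import xarray as xr

rng = np.random.default_rng(0)
dims = ("time", "lat", "lon")
pred = xr.DataArray(rng.random((4, 8, 16)), dims=dims)
target = xr.DataArray(rng.random((4, 8, 16)), dims=dims)

# avg over ["lat", "lon"]: one RMSE value per remaining dimension (here: time).
rmse_latlon = np.sqrt(((pred - target) ** 2).mean(dim=["lat", "lon"]))
print(rmse_latlon.dims)  # ('time',)

# avg over all dimensions: a single scalar score.
rmse_all = np.sqrt(((pred - target) ** 2).mean())
print(float(rmse_all))
```

With avg = None, no mean would be taken at all and the score keeps the full (time, lat, lon) shape.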

Applications

Below is a brief description of the examples available for each application. This is a work in progress and new use cases will be added shortly.

Training protocol

This package, contained in the trainings folder, provides a plotting routine to inspect your trainings. It supports the computation of all the metrics defined in metrics.py.

Weather forecasting

The forecasting analysis is in the forecasting folder. It is intended as an example of how to compare AtmoRep data (computed with the global_forecasting option in evaluate.py) against, e.g., PanguWeather data obtained with the ai-models interface provided by ECMWF.

Attention

The attention folder contains the code to inspect the attention scores. The values can be obtained by setting attention = True at the evaluation stage in evaluate.py. The script plot_attention.py plots the attention maps from the *_attention.zarr file.
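A minimal sketch of what plotting such a map involves (using synthetic scores, since the exact layout of the *_attention.zarr store is not shown here):

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # headless backend for scripted use
import matplotlib.pyplot as plt

# Synthetic attention scores for one head; real values would be read
# from the *_attention.zarr store produced by evaluate.py.
rng = np.random.default_rng(0)
scores = rng.random((12, 12))
# Row-wise softmax: each query's weights over the keys sum to 1.
weights = np.exp(scores) / np.exp(scores).sum(axis=-1, keepdims=True)

fig, ax = plt.subplots()
im = ax.imshow(weights, cmap="viridis")
ax.set_xlabel("key token")
ax.set_ylabel("query token")
fig.colorbar(im, ax=ax, label="attention weight")
fig.savefig("attention_map.png")
```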