bopen / c3s-eqc-toolbox-template

CADS Toolbox template application
Apache License 2.0

Jupyter Notebook to assess the spatiotemporal variability of rain-on-snow events over Svalbard #124

Closed: FabioMangini closed this issue 6 months ago

FabioMangini commented 9 months ago

Notebook description

I have started creating a Jupyter Notebook to analyze the spatiotemporal evolution of rain-on-snow events over Svalbard during the period covered by the C3S Arctic Regional Reanalysis (CARRA) dataset (available at https://cds.climate.copernicus.eu/cdsapp#!/dataset/reanalysis-carra-single-levels?tab=overview).

The Jupyter Notebook will be partly based on daily values of 2m temperature and total precipitation in winter (December-January-February) over Svalbard from the:

For the time being, I have downloaded:

I also tried to download the total precipitation from CARRA west over the same period as for CARRA east, but I ran into an issue. It seems that I can only download data up to December 1997: when I tried to download data for the following year, the Jupyter Notebook seemed to run indefinitely. I waited for many hours, but I did not manage to download the total precipitation from CARRA west for 1998. I was wondering whether you could help me solve this issue.

Notebook link or upload

rain_on_snow_climate_change_to_be_checked.ipynb.zip

Anything else we need to know?

No response

Environment

name: wp5 channels: - conda-forge dependencies: - _libgcc_mutex=0.1=conda_forge - _openmp_mutex=4.5=2_gnu - affine=2.4.0=pyhd8ed1ab_0 - aiohttp=3.9.1=py311h459d7ec_0 - aiosignal=1.3.1=pyhd8ed1ab_0 - alsa-lib=1.2.10=hd590300_0 - annotated-types=0.6.0=pyhd8ed1ab_0 - ansiwrap=0.8.4=py_0 - antlr-python-runtime=4.11.1=pyhd8ed1ab_0 - anyio=4.2.0=pyhd8ed1ab_0 - argon2-cffi=23.1.0=pyhd8ed1ab_0 - argon2-cffi-bindings=21.2.0=py311h459d7ec_4 - arrow=1.3.0=pyhd8ed1ab_0 - asciitree=0.3.3=py_2 - asttokens=2.4.1=pyhd8ed1ab_0 - async-lru=2.0.4=pyhd8ed1ab_0 - attr=2.5.1=h166bdaf_1 - attrs=23.1.0=pyh71513ae_1 - aws-c-auth=0.7.8=h538f98c_2 - aws-c-cal=0.6.9=h5d48c4d_2 - aws-c-common=0.9.10=hd590300_0 - aws-c-compression=0.2.17=h7f92143_7 - aws-c-event-stream=0.3.2=h0bcb0bb_8 - aws-c-http=0.7.14=hd268abd_3 - aws-c-io=0.13.36=he0cd244_2 - aws-c-mqtt=0.10.0=h35285c7_0 - aws-c-s3=0.4.5=h0448019_0 - aws-c-sdkutils=0.1.13=h7f92143_0 - aws-checksums=0.1.17=h7f92143_6 - aws-crt-cpp=0.25.0=h1bbe558_2 - aws-sdk-cpp=1.11.210=h0853bfa_5 - azure-core-cpp=1.10.3=h91d86a7_0 - azure-storage-blobs-cpp=12.10.0=h00ab1b0_0 - azure-storage-common-cpp=12.5.0=hb858b4b_2 - babel=2.14.0=pyhd8ed1ab_0 - beautifulsoup4=4.12.2=pyha770c72_0 - black=23.11.0=py311h38be061_0 - bleach=6.1.0=pyhd8ed1ab_0 - blosc=1.21.5=h0f2a231_0 - bokeh=3.3.2=pyhd8ed1ab_0 - bottleneck=1.3.7=py311h1f0f07a_1 - branca=0.7.0=pyhd8ed1ab_1 - brotli=1.1.0=hd590300_1 - brotli-bin=1.1.0=hd590300_1 - brotli-python=1.1.0=py311hb755f60_1 - bzip2=1.0.8=hd590300_5 - c-ares=1.24.0=hd590300_0 - ca-certificates=2023.11.17=hbcca054_0 - cached-property=1.5.2=hd8ed1ab_1 - cached_property=1.5.2=pyha770c72_1 - cairo=1.18.0=h3faef2a_0 - cartopy=0.22.0=py311h320fe9a_1 - cdsapi=0.6.1=pyhd8ed1ab_0 - certifi=2023.11.17=pyhd8ed1ab_0 - cf-units=3.2.0=py311h1f0f07a_4 - cf_xarray=0.8.7=pyhd8ed1ab_0 - cffi=1.16.0=py311hb3a22ac_0 - cfgrib=0.9.10.4=pyhd8ed1ab_0 - cfitsio=4.3.1=hbdc6101_0 - cftime=1.6.3=py311h1f0f07a_0 - charset-normalizer=3.3.2=pyhd8ed1ab_0 - click=8.1.7=unix_pyh707e725_0 - click-plugins=1.1.1=py_0 - cligj=0.7.2=pyhd8ed1ab_1 - cloudpickle=3.0.0=pyhd8ed1ab_0 - colorama=0.4.6=pyhd8ed1ab_0 - comm=0.1.4=pyhd8ed1ab_0 - contourpy=1.2.0=py311h9547e67_0 - cycler=0.12.1=pyhd8ed1ab_0 - cytoolz=0.12.2=py311h459d7ec_1 - dask=2023.12.1=pyhd8ed1ab_0 - dask-core=2023.12.1=pyhd8ed1ab_0 - dbus=1.13.6=h5008d03_3 - debugpy=1.8.0=py311hb755f60_1 - decorator=5.1.1=pyhd8ed1ab_0 - defusedxml=0.7.1=pyhd8ed1ab_0 - distributed=2023.12.1=pyhd8ed1ab_0 - eccodes=2.33.0=he84ddb8_0 - entrypoints=0.4=pyhd8ed1ab_0 - esmf=8.4.2=nompi_h9e768e6_3 - esmpy=8.4.2=pyhc1e730c_4 - exceptiongroup=1.2.0=pyhd8ed1ab_0 - executing=2.0.1=pyhd8ed1ab_0 - expat=2.5.0=hcb278e6_1 - fasteners=0.17.3=pyhd8ed1ab_0 - findlibs=0.0.5=pyhd8ed1ab_0 - fiona=1.9.5=py311hf8e0aa6_2 - flox=0.8.5=pyhd8ed1ab_0 - folium=0.15.1=pyhd8ed1ab_0 - font-ttf-dejavu-sans-mono=2.37=hab24e00_0 - font-ttf-inconsolata=3.000=h77eed37_0 - font-ttf-source-code-pro=2.038=h77eed37_0 - font-ttf-ubuntu=0.83=h77eed37_1 - fontconfig=2.14.2=h14ed4e7_0 - fonts-conda-ecosystem=1=0 - fonts-conda-forge=1=0 - fonttools=4.47.0=py311h459d7ec_0 - fqdn=1.5.1=pyhd8ed1ab_0 - freeglut=3.2.2=hac7e632_2 - freetype=2.12.1=h267a509_2 - freexl=2.0.0=h743c826_0 - frozenlist=1.4.1=py311h459d7ec_0 - fsspec=2023.12.2=pyhca7485f_0 - gdal=3.8.1=py311h39b4e0e_4 - geopandas=0.14.1=pyhd8ed1ab_0 - geopandas-base=0.14.1=pyha770c72_0 - geos=3.12.1=h59595ed_0 - geotiff=1.7.1=h6b2125f_15 - gettext=0.21.1=h27087fc_0 - gflags=2.2.2=he1b5a44_1004 - giflib=5.2.1=h0b41bf4_3 - glib=2.78.3=hfc55251_0 - 
glib-tools=2.78.3=hfc55251_0 - glog=0.6.0=h6f12383_0 - gmp=6.3.0=h59595ed_0 - graphite2=1.3.13=h58526e2_1001 - greenlet=3.0.2=py311hb755f60_0 - gst-plugins-base=1.22.8=h8e1006c_0 - gstreamer=1.22.8=h98fc4e7_0 - harfbuzz=8.3.0=h3d44ed6_0 - hdf4=4.2.15=h2a13503_7 - hdf5=1.14.3=nompi_h4f84152_100 - icu=73.2=h59595ed_0 - idna=3.6=pyhd8ed1ab_0 - importlib-metadata=7.0.0=pyha770c72_0 - importlib_metadata=7.0.0=hd8ed1ab_0 - importlib_resources=6.1.1=pyhd8ed1ab_0 - ipykernel=6.26.0=pyhf8b6a83_0 - ipython=8.18.1=pyh707e725_3 - isoduration=20.11.0=pyhd8ed1ab_0 - jasper=4.1.0=he6dfbbe_0 - jedi=0.19.1=pyhd8ed1ab_0 - jinja2=3.1.2=pyhd8ed1ab_1 - joblib=1.3.2=pyhd8ed1ab_0 - json-c=0.17=h7ab15ed_0 - json5=0.9.14=pyhd8ed1ab_0 - jsonpointer=2.4=py311h38be061_3 - jsonschema=4.20.0=pyhd8ed1ab_0 - jsonschema-specifications=2023.11.2=pyhd8ed1ab_0 - jsonschema-with-format-nongpl=4.20.0=pyhd8ed1ab_0 - jupyter-lsp=2.2.1=pyhd8ed1ab_0 - jupyter_client=8.6.0=pyhd8ed1ab_0 - jupyter_core=5.5.1=py311h38be061_0 - jupyter_events=0.9.0=pyhd8ed1ab_0 - jupyter_server=2.12.1=pyhd8ed1ab_0 - jupyter_server_terminals=0.5.0=pyhd8ed1ab_0 - jupyterlab=4.0.9=pyhd8ed1ab_0 - jupyterlab_pygments=0.3.0=pyhd8ed1ab_0 - jupyterlab_server=2.25.2=pyhd8ed1ab_0 - kealib=1.5.2=hcd42e92_1 - keyutils=1.6.1=h166bdaf_0 - kiwisolver=1.4.5=py311h9547e67_1 - krb5=1.21.2=h659d440_0 - lame=3.100=h166bdaf_1003 - lcms2=2.16=hb7c19ff_0 - ld_impl_linux-64=2.40=h41732ed_0 - lerc=4.0.0=h27087fc_0 - libabseil=20230802.1=cxx17_h59595ed_0 - libaec=1.1.2=h59595ed_1 - libarchive=3.7.2=h2aa1ff5_1 - libarrow=14.0.2=hfb4d3a9_0_cpu - libarrow-acero=14.0.2=h59595ed_0_cpu - libarrow-dataset=14.0.2=h59595ed_0_cpu - libarrow-flight=14.0.2=h120cb0d_0_cpu - libarrow-flight-sql=14.0.2=h61ff412_0_cpu - libarrow-gandiva=14.0.2=hacb8726_0_cpu - libarrow-substrait=14.0.2=h61ff412_0_cpu - libblas=3.9.0=20_linux64_openblas - libboost-headers=1.84.0=ha770c72_0 - libbrotlicommon=1.1.0=hd590300_1 - libbrotlidec=1.1.0=hd590300_1 - libbrotlienc=1.1.0=hd590300_1 - libcap=2.69=h0f662aa_0 - libcblas=3.9.0=20_linux64_openblas - libclang=15.0.7=default_hb11cfb5_4 - libclang13=15.0.7=default_ha2b6cf4_4 - libcrc32c=1.1.2=h9c3ff4c_0 - libcups=2.3.3=h4637d8d_4 - libcurl=8.5.0=hca28451_0 - libdeflate=1.19=hd590300_0 - libedit=3.1.20191231=he28a2e2_2 - libev=4.33=hd590300_2 - libevent=2.1.12=hf998b51_1 - libexpat=2.5.0=hcb278e6_1 - libffi=3.4.2=h7f98852_5 - libflac=1.4.3=h59595ed_0 - libgcc-ng=13.2.0=h807b86a_3 - libgcrypt=1.10.3=hd590300_0 - libgdal=3.8.1=hed8bd54_4 - libgfortran-ng=13.2.0=h69a702a_3 - libgfortran5=13.2.0=ha4646dd_3 - libglib=2.78.3=h783c2da_0 - libglu=9.0.0=hac7e632_1003 - libgomp=13.2.0=h807b86a_3 - libgoogle-cloud=2.12.0=h5206363_4 - libgpg-error=1.47=h71f35ed_0 - libgrpc=1.59.3=hd6c4280_0 - libiconv=1.17=hd590300_2 - libjpeg-turbo=3.0.0=hd590300_1 - libkml=1.3.0=h01aab08_1018 - liblapack=3.9.0=20_linux64_openblas - libllvm14=14.0.6=hcd5def8_4 - libllvm15=15.0.7=hb3ce162_4 - libnetcdf=4.9.2=nompi_h9612171_113 - libnghttp2=1.58.0=h47da74e_1 - libnl=3.9.0=hd590300_0 - libnsl=2.0.1=hd590300_0 - libnuma=2.0.16=h0b41bf4_1 - libogg=1.3.4=h7f98852_1 - libopenblas=0.3.25=pthreads_h413a1c8_0 - libopus=1.3.1=h7f98852_1 - libparquet=14.0.2=h352af49_0_cpu - libpng=1.6.39=h753d276_0 - libpq=16.1=h33b98f1_7 - libprotobuf=4.24.4=hf27288f_0 - libre2-11=2023.06.02=h7a70373_0 - librttopo=1.1.0=h8917695_15 - libsndfile=1.2.2=hc60ed4a_1 - libsodium=1.0.18=h36c2ea0_1 - libspatialindex=1.9.3=h9c3ff4c_4 - libspatialite=5.1.0=h7bd4643_4 - libsqlite=3.44.2=h2797004_0 - libssh2=1.11.0=h0841786_0 - 
libstdcxx-ng=13.2.0=h7e041cc_3 - libsystemd0=255=h3516f8a_0 - libthrift=0.19.0=hb90f79a_1 - libtiff=4.6.0=ha9c0a0a_2 - libudunits2=2.2.28=h40f5838_3 - libutf8proc=2.8.0=h166bdaf_0 - libuuid=2.38.1=h0b41bf4_0 - libvorbis=1.3.7=h9c3ff4c_0 - libwebp-base=1.3.2=hd590300_0 - libxcb=1.15=h0b41bf4_0 - libxkbcommon=1.6.0=hd429924_1 - libxml2=2.12.3=h232c23b_0 - libzip=1.10.1=h2629f0a_3 - libzlib=1.2.13=hd590300_5 - llvmlite=0.41.1=py311ha6695c7_0 - locket=1.0.0=pyhd8ed1ab_0 - lz4=4.3.2=py311h38e4bf4_1 - lz4-c=1.9.4=hcb278e6_0 - lzo=2.10=h516909a_1000 - mapclassify=2.6.1=pyhd8ed1ab_0 - markdown-it-py=3.0.0=pyhd8ed1ab_0 - markupsafe=2.1.3=py311h459d7ec_1 - matplotlib=3.8.2=py311h38be061_0 - matplotlib-base=3.8.2=py311h54ef318_0 - matplotlib-inline=0.1.6=pyhd8ed1ab_0 - mdurl=0.1.0=pyhd8ed1ab_0 - minizip=4.0.3=h0ab5242_0 - mistune=3.0.2=pyhd8ed1ab_0 - mpg123=1.32.3=h59595ed_0 - msgpack-python=1.0.7=py311h9547e67_0 - multidict=6.0.4=py311h459d7ec_1 - munkres=1.1.4=pyh9f0ad1d_0 - mypy_extensions=1.0.0=pyha770c72_0 - mysql-common=8.0.33=hf1915f5_6 - mysql-libs=8.0.33=hca2cd23_6 - nbclient=0.8.0=pyhd8ed1ab_0 - nbconvert=7.13.0=pyhd8ed1ab_0 - nbconvert-core=7.13.0=pyhd8ed1ab_0 - nbconvert-pandoc=7.13.0=pyhd8ed1ab_0 - nbformat=5.9.2=pyhd8ed1ab_0 - nc-time-axis=1.4.1=pyhd8ed1ab_0 - ncurses=6.4=h59595ed_2 - nest-asyncio=1.5.8=pyhd8ed1ab_0 - netcdf-fortran=4.6.1=nompi_hacb5139_103 - netcdf4=1.6.5=nompi_py311he8ad708_100 - networkx=3.2.1=pyhd8ed1ab_0 - notebook-shim=0.2.3=pyhd8ed1ab_0 - nspr=4.35=h27087fc_0 - nss=3.96=h1d7d5a4_0 - numba=0.58.1=py311h96b013e_0 - numcodecs=0.12.1=py311hb755f60_0 - numpy=1.26.2=py311h64a7726_0 - numpy_groupies=0.10.2=pyhd8ed1ab_0 - openjpeg=2.5.0=h488ebb8_3 - openssl=3.2.0=hd590300_1 - orc=1.9.2=h4b38347_0 - overrides=7.4.0=pyhd8ed1ab_0 - packaging=23.2=pyhd8ed1ab_0 - pandas=2.1.4=py311h320fe9a_0 - pandoc=3.1.3=h32600fe_0 - pandocfilters=1.5.0=pyhd8ed1ab_0 - papermill=2.4.0=pyhd8ed1ab_0 - parso=0.8.3=pyhd8ed1ab_0 - partd=1.4.1=pyhd8ed1ab_0 - pathspec=0.12.1=pyhd8ed1ab_0 - patsy=0.5.4=pyhd8ed1ab_0 - pcre2=10.42=hcad00b1_0 - pexpect=4.8.0=pyh1a96a4e_2 - pickleshare=0.7.5=py_1003 - pillow=10.1.0=py311ha6c5da5_0 - pip=23.3.2=pyhd8ed1ab_0 - pixman=0.42.2=h59595ed_0 - pkgutil-resolve-name=1.3.10=pyhd8ed1ab_1 - platformdirs=4.1.0=pyhd8ed1ab_0 - plotly=5.18.0=pyhd8ed1ab_0 - ply=3.11=py_1 - pooch=1.8.0=pyhd8ed1ab_0 - poppler=23.12.0=h590f24d_0 - poppler-data=0.4.12=hd8ed1ab_0 - postgresql=16.1=h7387d8b_7 - proj=9.3.1=h1d62c97_0 - prometheus_client=0.19.0=pyhd8ed1ab_0 - prompt-toolkit=3.0.42=pyha770c72_0 - properscoring=0.1=py_0 - psutil=5.9.7=py311h459d7ec_0 - pthread-stubs=0.4=h36c2ea0_1001 - ptyprocess=0.7.0=pyhd3deb0d_0 - pulseaudio-client=16.1=hb77b528_5 - pure_eval=0.2.2=pyhd8ed1ab_0 - pwlf=2.2.1=py311h38be061_3 - pyarrow=14.0.2=py311h39c9aba_0_cpu - pyarrow-hotfix=0.6=pyhd8ed1ab_0 - pycparser=2.21=pyhd8ed1ab_0 - pydantic=2.5.2=pyhd8ed1ab_0 - pydantic-core=2.14.5=py311h46250e7_0 - pydantic-settings=2.1.0=pyhd8ed1ab_1 - pydoe=0.3.8=py_1 - pygments=2.17.2=pyhd8ed1ab_0 - pyparsing=3.1.1=pyhd8ed1ab_0 - pyproj=3.6.1=py311hca0b8b9_5 - pyqt=5.15.9=py311hf0fb5b6_5 - pyqt5-sip=12.12.2=py311hb755f60_5 - pyshp=2.3.1=pyhd8ed1ab_0 - pysocks=1.7.1=pyha2e5f31_6 - python=3.11.7=hab00c5b_0_cpython - python-dateutil=2.8.2=pyhd8ed1ab_0 - python-dotenv=1.0.0=pyhd8ed1ab_1 - python-eccodes=1.6.1=py311h1f0f07a_1 - python-fastjsonschema=2.19.0=pyhd8ed1ab_0 - python-json-logger=2.0.7=pyhd8ed1ab_0 - python-tzdata=2023.3=pyhd8ed1ab_0 - python_abi=3.11=4_cp311 - pytz=2023.3.post1=pyhd8ed1ab_0 - 
pyyaml=6.0.1=py311h459d7ec_1 - pyzmq=25.1.2=py311h34ded2d_0 - qt-main=5.15.8=h450f30e_18 - rasterio=1.3.9=py311ha38370a_2 - rdma-core=49.0=hd3aeb46_2 - re2=2023.06.02=h2873b5e_0 - readline=8.2=h8228510_1 - referencing=0.32.0=pyhd8ed1ab_0 - regionmask=0.11.0=pyhd8ed1ab_0 - requests=2.31.0=pyhd8ed1ab_0 - rfc3339-validator=0.1.4=pyhd8ed1ab_0 - rfc3986-validator=0.1.1=pyh9f0ad1d_0 - rich=13.7.0=pyhd8ed1ab_0 - rioxarray=0.15.0=pyhd8ed1ab_0 - rpds-py=0.15.2=py311h46250e7_0 - rtree=1.1.0=py311h3bb2b0f_0 - s2n=1.4.0=h06160fa_0 - scikit-learn=1.3.2=py311hc009520_2 - scipy=1.11.4=py311h64a7726_0 - seaborn=0.13.0=hd8ed1ab_0 - seaborn-base=0.13.0=pyhd8ed1ab_0 - send2trash=1.8.2=pyh41d4057_0 - setuptools=68.2.2=pyhd8ed1ab_0 - shapely=2.0.2=py311h2032efe_1 - shellingham=1.5.4=pyhd8ed1ab_0 - sip=6.7.12=py311hb755f60_0 - six=1.16.0=pyh6c4a22f_0 - snappy=1.1.10=h9fff704_0 - sniffio=1.3.0=pyhd8ed1ab_0 - snuggs=1.4.7=py_0 - sortedcontainers=2.4.0=pyhd8ed1ab_0 - soupsieve=2.5=pyhd8ed1ab_1 - sparse=0.14.0=pyhd8ed1ab_0 - sqlalchemy=2.0.23=py311h459d7ec_0 - sqlite=3.44.2=h2c6b66d_0 - stack_data=0.6.2=pyhd8ed1ab_0 - statsmodels=0.14.1=py311h1f0f07a_0 - structlog=23.2.0=pyhd8ed1ab_0 - tblib=3.0.0=pyhd8ed1ab_0 - tenacity=8.2.3=pyhd8ed1ab_0 - terminado=0.18.0=pyh0d859eb_0 - textwrap3=0.9.2=py_0 - threadpoolctl=3.2.0=pyha21a80b_0 - tiledb=2.18.3=hc1131af_1 - tinycss2=1.2.1=pyhd8ed1ab_0 - tk=8.6.13=noxft_h4845f30_101 - toml=0.10.2=pyhd8ed1ab_0 - tomli=2.0.1=pyhd8ed1ab_0 - toolz=0.12.0=pyhd8ed1ab_0 - tornado=6.3.3=py311h459d7ec_1 - tqdm=4.66.1=pyhd8ed1ab_0 - traitlets=5.14.0=pyhd8ed1ab_0 - typer=0.9.0=pyhd8ed1ab_0 - types-python-dateutil=2.8.19.14=pyhd8ed1ab_0 - typing-extensions=4.9.0=hd8ed1ab_0 - typing_extensions=4.9.0=pyha770c72_0 - typing_utils=0.1.0=pyhd8ed1ab_0 - tzcode=2023c=h0b41bf4_0 - tzdata=2023c=h71feb2d_0 - ucx=1.15.0=h75e419f_2 - udunits2=2.2.28=h40f5838_3 - uri-template=1.3.0=pyhd8ed1ab_0 - uriparser=0.9.7=hcb278e6_1 - urllib3=2.1.0=pyhd8ed1ab_0 - wcwidth=0.2.12=pyhd8ed1ab_0 - webcolors=1.13=pyhd8ed1ab_0 - webencodings=0.5.1=pyhd8ed1ab_2 - websocket-client=1.7.0=pyhd8ed1ab_0 - wheel=0.42.0=pyhd8ed1ab_0 - xarray=2023.12.0=pyhd8ed1ab_0 - xarraymannkendall=1.4.5=pyhd8ed1ab_0 - xcb-util=0.4.0=hd590300_1 - xcb-util-image=0.4.0=h8ee46fc_1 - xcb-util-keysyms=0.4.0=h8ee46fc_1 - xcb-util-renderutil=0.3.9=hd590300_1 - xcb-util-wm=0.4.1=h8ee46fc_1 - xerces-c=3.2.4=hac6953d_3 - xesmf=0.8.2=pyhd8ed1ab_0 - xhistogram=0.3.2=pyhd8ed1ab_0 - xkeyboard-config=2.40=hd590300_0 - xorg-fixesproto=5.0=h7f98852_1002 - xorg-inputproto=2.3.2=h7f98852_1002 - xorg-kbproto=1.0.7=h7f98852_1002 - xorg-libice=1.1.1=hd590300_0 - xorg-libsm=1.2.4=h7391055_0 - xorg-libx11=1.8.7=h8ee46fc_0 - xorg-libxau=1.0.11=hd590300_0 - xorg-libxdmcp=1.1.3=h7f98852_0 - xorg-libxext=1.3.4=h0b41bf4_2 - xorg-libxfixes=5.0.3=h7f98852_1004 - xorg-libxi=1.7.10=h7f98852_0 - xorg-libxrender=0.9.11=hd590300_0 - xorg-renderproto=0.11.1=h7f98852_1002 - xorg-xextproto=7.3.0=h0b41bf4_1003 - xorg-xf86vidmodeproto=2.3.1=h7f98852_1002 - xorg-xproto=7.0.31=h7f98852_1007 - xskillscore=0.0.24=pyhd8ed1ab_0 - xyzservices=2023.10.1=pyhd8ed1ab_0 - xz=5.2.6=h166bdaf_0 - yaml=0.2.5=h7f98852_2 - yarl=1.9.3=py311h459d7ec_0 - zarr=2.16.1=pyhd8ed1ab_0 - zeromq=4.3.5=h59595ed_0 - zict=3.0.0=pyhd8ed1ab_0 - zipp=3.17.0=pyhd8ed1ab_0 - zlib=1.2.13=hd590300_5 - zstd=1.5.5=hfc55251_0 - pip: - c3s-eqc-automatic-quality-control==0.1.2.dev97+g355c9b6 - cacholote==0.5.3 - cads-toolbox==0.0.2b0 - cgul==0.0.4 - coucal==0.0.1b3 - emohawk==0.0.4b0 - kaleido==0.2.1 - pymannkendall==1.4.3 prefix: 
/data/common/miniforge3/envs/wp5
malmans2 commented 9 months ago

The log in your notebook shows that the request was successfully queued, so I guess the long wait is due to a long CDS queue time. If the CDS is very busy right now, there's not much I can do about it.

I know some tricks to speed up the download though (e.g., concurrent requests). I'll run the scripts overnight and will let you know tomorrow how it goes.
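For reference, here is a minimal sketch of one way to submit concurrent requests with the plain cdsapi client. This is an illustration only, not necessarily the trick used here; the target file names are hypothetical and the request mirrors the CARRA example discussed in this thread.

import concurrent.futures

import cdsapi


def retrieve_year(year: int) -> str:
    # One request per year; each call blocks until the CDS has processed the request
    client = cdsapi.Client()
    target = f"carra_west_tp_{year}.grib"
    client.retrieve(
        "reanalysis-carra-single-levels",
        {
            "domain": "west_domain",
            "level_type": "surface_or_atmosphere",
            "variable": "total_precipitation",
            "product_type": "forecast",
            "time": ["00:00", "12:00"],
            "leadtime_hour": ["6", "12", "18"],
            "year": str(year),
            "month": ["01", "02", "12"],
            "day": [f"{day:02d}" for day in range(1, 32)],
        },
        target,
    )
    return target


# Submit a few yearly requests at once so they queue on the CDS in parallel
with concurrent.futures.ThreadPoolExecutor(max_workers=4) as executor:
    files = list(executor.map(retrieve_year, range(1990, 1999)))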

BTW, if you use your own cdsapirc rather than the default, you can easily check the status of your requests on the CDS website.
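For completeness, a sketch of the standard cdsapi credential setup (the UID and API key values are placeholders):

# The default ~/.cdsapirc contains two lines, e.g.:
#   url: https://cds.climate.copernicus.eu/api/v2
#   key: <UID>:<API-key>
# The same credentials can also be passed explicitly, which makes it clear
# whose account the requests (and their status on the CDS website) belong to:
import cdsapi

client = cdsapi.Client(
    url="https://cds.climate.copernicus.eu/api/v2",
    key="<UID>:<API-key>",  # placeholder: personal credentials from your CDS profile
)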

FabioMangini commented 9 months ago

Ok! Thank you very much.

malmans2 commented 9 months ago

Hi there, sorry but I didn't get to work on this yesterday.

I'm caching the data you need right now, I'll let you know when it's ready. This is the data I'm caching:

from c3s_eqc_automatic_quality_control import download

year_start = 1990
year_stop = 2023

collection_id = "reanalysis-carra-single-levels"
request = {
    "level_type": "surface_or_atmosphere",
    "variable": "total_precipitation",
    "product_type": "forecast",
    "time": ["00:00", "12:00"],
    "year": [str(year) for year in range(year_start, year_stop + 1)],
    "month": ["01", "02", "12"],
    "day": [f"{day:02d}" for day in range(1, 32)],
    "leadtime_hour": ["6", "12", "18"],
}

datasets = {}
for domain in ["east_domain", "west_domain"]:
    datasets[domain] = download.download_and_transform(
        collection_id,
        request | {"domain": domain},
        chunks={"year": 1},
    )
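As a quick sanity check of what the snippet above produces (each entry should be an xarray Dataset):

# Print the dimensions of each cached dataset
for domain, ds in datasets.items():
    print(domain, dict(ds.sizes))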

This is the data you need, right?

FabioMangini commented 9 months ago

No problem, and thank you for your help.

I would like to ask you a question about the request:

    request = {
        "level_type": "surface_or_atmosphere",
        "variable": "total_precipitation",
        "product_type": "forecast",
        "time": ["00:00", "12:00"],
        "year": [str(year) for year in range(year_start, year_stop + 1)],
        "month": ["01", "02", "12"],
        "day": [f"{day:02d}" for day in range(1, 32)],
        "leadtime_hour": ["6", "12", "18"],
    }

This would download the total precipitation for each day of January, February, and December, at time 00:00 and 12:00, and leadtime hours 6, 12, and 18.

Do you know if it is possible to modify the request so that it also downloads the total precipitation for the 30th of November? Or do I need to download the total precipitation for all the days of November, even though I am only interested in the last day of the month?

malmans2 commented 9 months ago

We can add a single day in a separate request:

from c3s_eqc_automatic_quality_control import download

year_start = 1990
year_stop = 2023

collection_id = "reanalysis-carra-single-levels"
request = {
    "level_type": "surface_or_atmosphere",
    "variable": "total_precipitation",
    "product_type": "forecast",
    "time": ["00:00", "12:00"],
    "leadtime_hour": ["6", "12", "18"],
}
time_requests = [
    {
        "month": ["01", "02"],
        "day": [f"{day:02d}" for day in range(1, 32)],
        "year": [str(year) for year in range(year_start + 1, year_stop + 1)],
    },
    {
        "month": "11",
        "day": "30",
        "year": [str(year) for year in range(year_start, year_stop)],
    },
    {
        "month": "12",
        "day": [f"{day:02d}" for day in range(1, 32)],
        "year": [str(year) for year in range(year_start, year_stop)],
    },
]

datasets = {}
for domain in ["east", "west"]:
    print(f"{domain=}")
    datasets[domain] = download.download_and_transform(
        collection_id,
        [
            request | {"domain": f"{domain}_domain"} | time_request
            for time_request in time_requests
        ],
        chunks={"year": 1},
    )
FabioMangini commented 9 months ago

I understand, thank you very much. The request that you wrote in your last message seems fine to me.

malmans2 commented 9 months ago

Unfortunately, I realised that we need to change the request a little to achieve your goal (I updated the snippet above). I have to re-run the caching script; it will probably finish later tonight.

I'll be in touch when it's done.

malmans2 commented 9 months ago

The east domain is ready. It takes a couple of minutes to retrieve and concatenate all cached files, but it should be much faster once we add the transform_func to cache transformed data.
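A minimal sketch of what caching transformed data might look like (it mirrors the regionalise utility used later in this thread; collection_id is assumed to be defined as in the snippets above, request_list is a hypothetical name for the list of request dictionaries, and the crop bounds are placeholders):

from c3s_eqc_automatic_quality_control import download, utils

ds = download.download_and_transform(
    collection_id,
    request_list,
    # The transform runs once per chunk and its output is what gets cached, so later
    # calls read back the small cropped fields instead of the full CARRA domain.
    transform_func=utils.regionalise,
    transform_func_kwargs={
        "lon_slice": slice(8, 40),   # placeholder longitude bounds
        "lat_slice": slice(75, 82),  # placeholder latitude bounds
    },
    chunks={"year": 1},
)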

I'll let you know when the west domain is ready.

malmans2 commented 9 months ago

All done. If you use this code, CARRA is cached.

FabioMangini commented 9 months ago

Thank you very much!

FabioMangini commented 9 months ago

Hi,

I would like the Jupyter Notebook to also include in-situ observations of temperature and precipitation at five weather stations on Svalbard since September 1990.

Met Norway allows users to download these data through an API. This is explained on this webpage: https://frost.met.no/index.html.

As a test, I created a Jupyter Notebook on my laptop to download the in-situ data. However, this required me to register my email address on Met Norway’s website (https://frost.met.no/auth/requestCredentials.html) to request and obtain my client credentials (a “client ID” and a “client secret”). Indeed, the “client ID” is needed to download the data.

I would now like to add this Notebook to the one that downloads the CARRA dataset. However, I would prefer not to use my “client ID”. So, I was wondering about an alternative.

For example, is there an email address related to this project that I could use to get new client credentials? Otherwise, could I download the in-situ data and store them on a server that is accessible to the Jupyter Notebook?

malmans2 commented 9 months ago

Good morning @FabioMangini and happy new year.

I think you should get in touch with Chunxue and discuss how to proceed. So far we only used external data that are publicly available and don't need credentials. I see a couple of problems in both solutions:

  1. Use a shared ID: The notebook is going to be public and potentially used by many people. Sharing an "EQC ID" with external people might violate the data copyrights (there's probably a good reason why Met Norway decided to require a private ID; we would basically be sharing a workaround with many people).
  2. Store data on a server that is accessible to the notebook: Similarly, that might violate the copyrights of the data. If we are allowed to store the data somewhere (a virtual machine, a cloud, ...), you should decide with Chunxue and ECMWF where to store it (at the moment we don't have dedicated storage for that).

We can also just explain in the notebook to the users how to download the data using their own ID, but again this is something that you need to discuss with Chunxue and ECMWF.

FabioMangini commented 9 months ago

Good morning and happy new year to you!

Thank you very much. I understand your points. I will soon contact Chunxue to discuss the problem with her.

FabioMangini commented 9 months ago

Good morning,

I contacted Chunxue. She will work on a solution. In the meantime, she suggested that I should continue working on the project by storing the data on a virtual machine.

I was wondering whether it would be possible for you to set up a virtual machine where I can store the data. If it is possible, do you know how much time it would take for you to create this virtual machine?

malmans2 commented 9 months ago

Hi @FabioMangini,

B-Open does not manage ECMWF resources. In the EQC contract, we are only managing the VM that has been assigned by ECMWF to EQC for notebooks. We cannot create new VMs using ECMWF resources.

If you think that EQC needs a dedicated VM to make external data publicly available, a formal request must be made to ECMWF (i.e., Chunxue should probably make a formal request to ECMWF and, if it is accepted, ECMWF will provide a new VM).

Side note: I'm not sure if ECMWF can share external data like that; usually this is done through the CDS. But again, this is something that needs to be discussed with ECMWF; B-Open cannot make this kind of decision.

FabioMangini commented 9 months ago

Hi @malmans2,

Thank you very much for your reply.

FabioMangini commented 8 months ago

Hi,

The Jupyter Notebook can now download six datasets:

• ERA5’s 2m temperature over Svalbard (region between 75°-82°N and 8°-40°E)
• ERA5’s precipitation over Svalbard
• CARRA’s 2m temperature over CARRA-West domain
• CARRA’s 2m temperature over CARRA-East domain
• CARRA’s total precipitation over CARRA-West domain
• CARRA’s total precipitation over CARRA-East domain

I would like to crop the four CARRA datasets over Svalbard (region between 75°-82°N and 8°-40°E), but the kernel dies when I use the ".where" method. I was wondering if you could help me solve this problem.

I have attached to this message a zipped version of a notebook that downloads all the data and tries to crop the CARRA-West's 2m temperature over Svalbard.

script_crop_svalbard_region.ipynb.zip

malmans2 commented 8 months ago

I'll take a look next week. In the meantime, could you please try our utility function? (It will probably fail the same way, but it's worth trying)

from c3s_eqc_automatic_quality_control import utils

# lon_start, lon_end, lat_start, lat_end: bounds of the region of interest (placeholders)
ds = utils.regionalise(
    ds,
    lon_slice=slice(lon_start, lon_end),
    lat_slice=slice(lat_start, lat_end),
)
FabioMangini commented 8 months ago

Hi,

Thank you very much.

I ran a test, and the kernel also dies when I run the utility function above.

malmans2 commented 8 months ago

Hi @FabioMangini,

Do you need to apply the same mask to both the East and West domains (75°-82°N and 8°-40°E)? Also, do you want to mask ERA5 as well?

FabioMangini commented 8 months ago

Hi @malmans2,

yes, I need the same mask for CARRA-East, CARRA-West, and ERA5.

malmans2 commented 8 months ago

One more question: for ERA5, do you want a single time dimension or two time dimensions? (Two time dimensions means that the leadtime dimension is kept separate.)

malmans2 commented 8 months ago

Here is how I would do it:

from c3s_eqc_automatic_quality_control import download, utils

# Parameters
year_start = 1990
year_stop = 1992
area = [82, 8, 75, 40]

# Request
request_dict = {
    "CARRA_analysis": (
        "reanalysis-carra-single-levels",
        {
            "level_type": "surface_or_atmosphere",
            "variable": "2m_temperature",
            "product_type": "analysis",
            "time": [f"{hour:02d}:00" for hour in range(0, 24, 3)],
        },
    ),
    "CARRA_forecast": (
        "reanalysis-carra-single-levels",
        {
            "level_type": "surface_or_atmosphere",
            "variable": "total_precipitation",
            "product_type": "forecast",
            "time": ["00:00", "12:00"],
            "leadtime_hour": ["6", "12", "18"],
        },
    ),
    "ERA5_t2m": (
        "reanalysis-era5-single-levels",
        {
            "product_type": "reanalysis",
            "variable": "2m_temperature",
            "time": [f"{hour:02d}:00" for hour in range(24)],
            "area": [82, 8, 75, 40],
        },
    ),
    "ERA5_precip": (
        "reanalysis-era5-single-levels",
        {
            "product_type": "reanalysis",
            "variable": "total_precipitation",
            "time": [f"{hour:02d}:00" for hour in range(24)],
            "area": [82, 8, 75, 40],
        },
    ),
}

time_requests_carra = [
    {
        "month": ["01", "02"],
        "day": [f"{day:02d}" for day in range(1, 32)],
        "year": [str(year) for year in range(year_start + 1, year_stop + 1)],
    },
    {
        "month": "11",
        "day": "30",
        "year": [str(year) for year in range(year_start, year_stop)],
    },
    {
        "month": "12",
        "day": [f"{day:02d}" for day in range(1, 32)],
        "year": [str(year) for year in range(year_start, year_stop)],
    },
]

time_requests_era5 = time_requests_carra + [
    {
        "month": "03",
        "day": "01",
        "year": [str(year) for year in range(year_start + 1, year_stop + 1)],
    },
]

# Download and transform
datasets = {}
for product, (collection_id, request) in request_dict.items():
    print(f"{product=}")
    if product.startswith("CARRA"):
        for domain in ["east", "west"]:
            print(f"{domain=}")
            datasets[product + "_" + domain] = download.download_and_transform(
                collection_id,
                [
                    request | {"domain": f"{domain}_domain"} | time_request
                    for time_request in time_requests_carra
                ],
                transform_func=utils.regionalise,
                transform_func_kwargs={
                    "lon_slice": slice(area[1], area[3]),
                    "lat_slice": slice(area[2], area[0]),
                },
                chunks={"year": 1},
            )
    elif product.startswith("ERA5"):
        datasets[product] = download.download_and_transform(
            collection_id,
            [request | time_request for time_request in time_requests_era5],
            chunks={"year": 1},
        )
    else:
        raise ValueError(f"{product=}")

A couple of comments:

  1. All sliced datasets are now cached separately
  2. ERA5 has a single time dimension (see the backend_kwargs argument passed to cfgrib)
  3. I've only cached a couple of years for now (I suggest using just a couple of years while developing; we will then optimize and run the whole time period when the analysis is more mature)
FabioMangini commented 8 months ago

Hi,

    One more question: for ERA5, do you want a single time dimension or two time dimensions? (Two time dimensions means that the leadtime dimension is kept separate.)

Yes, I would keep the two time dimensions separate for ERA5.

malmans2 commented 8 months ago

Got it. I edited the snippet above and restored the leadtime dimension for ERA5 precipitation.

FabioMangini commented 8 months ago

Thank you very much.

FabioMangini commented 7 months ago

Hi @malmans2,

I computed the daily accumulated precipitation using the CARRA-West and CARRA-East reanalysis datasets, but I found that these estimates agree poorly with the weather stations’ observations. This might indicate that the reanalysis data do not perform well over Svalbard, but it might also point to an error in my script. I was wondering whether it would be possible for you and B-Open to check the script I wrote.

What puzzles me is that the comparison between CARRA (both CARRA-West and CARRA-East) and the weather stations returns better results when I compare the daily precipitation from CARRA and the weather stations with a one-day lag. To put it differently, the agreement between the reanalysis and the observations increases when I compare the daily precipitation from CARRA for a given day with the daily precipitation from the weather stations for the day after.

To compute the daily accumulated precipitation from CARRA I followed the instructions in the CARRA documentation. They can be found at this link: The use of precipitation information from the Copernicus Arctic Regional Reanalysis (CARRA) - Copernicus Knowledge Base - ECMWF Confluence Wiki.

The weather stations provide daily precipitation accumulated between 06:00 of a given day and 06:00 of the following day. So, I calculated the precipitation from CARRA-West and CARRA-East between 06:00 of a given day and 06:00 of the following day to allow a comparison with the weather stations.
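As an illustration of that 06:00-to-06:00 alignment, here is a minimal sketch assuming a sub-daily precipitation DataArray tp with a "time" dimension (hypothetical variable names): shifting the timestamps back by six hours makes a plain daily sum cover 06:00 of day D to 06:00 of day D+1, labelled with day D.

import pandas as pd

# tp: sub-daily precipitation amounts with a "time" coordinate (hypothetical name)
shifted = tp.assign_coords(time=tp["time"] - pd.Timedelta("6h"))
# Each daily bin now spans 06:00 of day D to 06:00 of day D+1, labelled with day D
tp_daily_06_to_06 = shifted.resample(time="1D").sum("time")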

I am sharing the Jupyter Notebook that downloads temperature and precipitation from CARRA-West and CARRA-East and computes the daily accumulated precipitation for winter (December-January-February). Please let me know if you would like the script that downloads and reprocesses the weather stations’ observations.

precipitation_CARRA.ipynb.zip

malmans2 commented 7 months ago

Hi @FabioMangini,

Sorry about the delay, I've been working on a couple of notebooks for WP4. I'll take a look later today and will let you know!

malmans2 commented 7 months ago

Hi @FabioMangini,

Could you please check if these results look better?

import pandas as pd

def subtract_and_assign_time(da, hour):
    # Keep only the forecasts initialised at the given hour (00 or 12 UTC)
    da = da.where(da["forecast_reference_time"].dt.hour == hour, drop=True)
    # Accumulation between the 6 h and 18 h leadtimes, i.e. a 12-hour total
    da = da.sel(leadtime="18h") - da.sel(leadtime="6h")
    da = da.rename(forecast_reference_time="time")
    # Label each 12-hour total with 18:00 of the forecast day
    da["time"] = pd.to_datetime(da["time"].dt.strftime("%Y-%m-%d T18"))
    return da

da = datasets["CARRA_forecast_east"]["tp"]
first_12h = subtract_and_assign_time(da, hour=0)  # 06:00-18:00 of the same day
last_12h = subtract_and_assign_time(da, hour=12)  # 18:00-06:00 of the following day

total = first_12h + last_12h  # 06:00-to-06:00 daily accumulation

The code is doing the following:

  1. Separate forecasts starting at midnight and at noon
  2. Subtract the 6 h leadtime values from the 18 h leadtime values (each difference is a 12-hour accumulation: 06:00-18:00 for the midnight run, 18:00-06:00 of the next day for the noon run)
  3. Sum the two results, which gives the 06:00-to-06:00 daily accumulation

Hopefully I interpreted the Confluence page correctly.

I centered the results at T18, as it should be the central time. But we can change it easily.

FabioMangini commented 7 months ago

Hi @malmans2,

Thank you very much for recomputing the total precipitation. I compared the output of my script with that of yours: the results are almost the same, as they differ by less than 1e-4 mm. I am now more confident that I was not wrongly introducing a 1-day shift in the total precipitation from CARRA. Maybe the discrepancy I noticed before is due to the daily total precipitation measured by the weather stations being assigned to the date when the daily recording ends.

FabioMangini commented 7 months ago

I would like to ask one more question. I computed the number of rain-on-snow (ROS) events for each winter season (December-January-February) between December 1990 and February 2017 at each weather station and at each grid point of CARRA-West and CARRA-East. ROS events correspond to days with a mean daily temperature equal to or higher than 0°C and a daily accumulated precipitation of 1 mm or more.
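For illustration, a minimal sketch of that definition in xarray (hypothetical variable names: t2m_daily is the daily-mean 2 m temperature in °C and tp_daily the daily accumulated precipitation in mm, both restricted to DJF days as in the notebook):

# Boolean mask of rain-on-snow days: daily-mean temperature >= 0 °C and precipitation >= 1 mm
ros_days = (t2m_daily >= 0) & (tp_daily >= 1)

# ROS days per winter season; "AS-DEC" groups each December with the following January/February
ros_per_winter = ros_days.resample(time="AS-DEC").sum("time")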

I would like to understand whether the number of ROS events for each winter season has increased over the period under study. I was wondering if you already had a function that can compute linear trends of individual time series. I would like to use this for the number of ROS events provided by each weather station. Additionally, would you have a similar function for maps? I would use that for the number of ROS events provided by CARRA-West and CARRA-East.

malmans2 commented 7 months ago

We do have a function for linear trends:

from c3s_eqc_automatic_quality_control import diagnostics
help(diagnostics.time_weighted_linear_trend)

If the attributes are in good shape, it automatically finds the time dimension and computes trends along time.

It's used in a few notebooks:

Don't request p-value or r2 if you don't need them; they can be computationally expensive.
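As a sketch of how this might be applied to the winter ROS counts (ros_per_winter is a hypothetical input name; the exact signature is best checked with the help() call above):

from c3s_eqc_automatic_quality_control import diagnostics

# For a single-station series with a "time" dimension this returns a scalar trend;
# applied to a (time, y, x) array from CARRA it returns a map of trends.
trend = diagnostics.time_weighted_linear_trend(ros_per_winter)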

FabioMangini commented 7 months ago

Hi @malmans2,

Thank you very much.

malmans2 commented 6 months ago

I'm closing this just because it's easier for us to keep track of tickets. Feel free to re-open if you need further help from us.