DeutscherWetterdienst / downloader

A simple Python module and command line tool to download NWP GRIB2 data from DWD's Open Data file server, https://opendata.dwd.de.
Apache License 2.0

Download of ICON-EPS data for the relhum_2m variable fails #1

Closed pjpetersik closed 3 years ago

pjpetersik commented 4 years ago

I tried to download ICON-EPS data for the 'relhum_2m' field with the following command

docker run --rm \
  --volume $(pwd):/mydata \
  deutscherwetterdienst/downloader downloader \
    --model icon-eps \
    --timestamp '2020-09-29 00:00:00' \
    --single-level-fields relhum_2m \
    --max-time-step 12 \
    --directory /mydata 

The file icon-eps_global_icosahedral_single-level_2020092900_000_relhum_2m.grib2 is downloaded successfully. Afterwards, however, the downloader tries to download icon-eps_global_icosahedral_single-level_2020092900_001_relhum_2m.grib2 (1 hour lead time), even though relhum_2m forecasts are only available at a 6-hour interval on DWD's Open Data server (https://opendata.dwd.de/weather/nwp/icon-eps/grib/00/relhum_2m/):
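For illustration, the files that actually exist for relhum_2m follow a 6-hourly pattern. The helper below is a hypothetical sketch (not part of downloader); only the base URL and file name pattern are taken from the listing above:

```python
BASE = "https://opendata.dwd.de/weather/nwp/icon-eps/grib/00/relhum_2m"

def relhum_2m_urls(run_date="2020092900", max_step=12, interval=6):
    """URLs published on Open Data: one file every `interval` hours."""
    return [
        f"{BASE}/icon-eps_global_icosahedral_single-level_"
        f"{run_date}_{step:03d}_relhum_2m.grib2.bz2"
        for step in range(0, max_step + 1, interval)
    ]

for url in relhum_2m_urls():
    print(url)
```

Requesting step 001, as the downloader does, therefore hits a URL that was never published.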

docker run --rm \
>   --volume $(pwd):/mydata \
>   deutscherwetterdienst/downloader downloader \
>     --model icon-eps \
>     --timestamp '2020-09-29 00:00:00' \
>     --single-level-fields relhum_2m \
>     --max-time-step 12 \
>     --directory /mydata 
[downloader_cli.py:252 -             download() ] 
---------------
Model: icon-eps
Grid: icosahedral
Fields: relhum_2m
Minimum time step: 0
Maximum time step: 12
Timestamp: 2020-09-29
Model run: 00
Destination: /mydata
---------------

[downloader_cli.py:108 - downloadAndExtractBz2FileFromUrl() ] downloading file: 'https://opendata.dwd.de/weather/nwp/icon-eps/grib/00/relhum_2m/icon-eps_global_icosahedral_single-level_2020092900_000_relhum_2m.grib2.bz2'
[downloader_cli.py:121 - downloadAndExtractBz2FileFromUrl() ] saving file as: '/mydata/icon-eps_global_icosahedral_single-level_2020092900_000_relhum_2m.grib2'
[downloader_cli.py:124 - downloadAndExtractBz2FileFromUrl() ] Done.
[downloader_cli.py:108 - downloadAndExtractBz2FileFromUrl() ] downloading file: 'https://opendata.dwd.de/weather/nwp/icon-eps/grib/00/relhum_2m/icon-eps_global_icosahedral_single-level_2020092900_001_relhum_2m.grib2.bz2'
Traceback (most recent call last):
  File "/usr/local/bin/downloader", line 8, in <module>
    sys.exit(download())
  File "/usr/local/lib/python3.7/site-packages/click/core.py", line 829, in __call__
    return self.main(*args, **kwargs)
  File "/usr/local/lib/python3.7/site-packages/click/core.py", line 782, in main
    rv = self.invoke(ctx)
  File "/usr/local/lib/python3.7/site-packages/click/core.py", line 1066, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/usr/local/lib/python3.7/site-packages/click/core.py", line 610, in invoke
    return callback(*args, **kwargs)
  File "/usr/local/lib/python3.7/site-packages/downloader/downloader_cli.py", line 260, in download
    destFilePath=directory )
  File "/usr/local/lib/python3.7/site-packages/downloader/downloader_cli.py", line 163, in downloadGribDataSequence
    downloadGribData(model=model, grid=grid, param=field, timestep=timestep, timestamp=timestamp, destFilePath=destFilePath)
  File "/usr/local/lib/python3.7/site-packages/downloader/downloader_cli.py", line 152, in downloadGribData
    downloadAndExtractBz2FileFromUrl(dataUrl, destFilePath=destFilePath, destFileName=destFileName)
  File "/usr/local/lib/python3.7/site-packages/downloader/downloader_cli.py", line 117, in downloadAndExtractBz2FileFromUrl
    resource = urllib.request.urlopen(url)
  File "/usr/local/lib/python3.7/urllib/request.py", line 222, in urlopen
    return opener.open(url, data, timeout)
  File "/usr/local/lib/python3.7/urllib/request.py", line 531, in open
    response = meth(req, response)
  File "/usr/local/lib/python3.7/urllib/request.py", line 641, in http_response
    'http', request, response, code, msg, hdrs)
  File "/usr/local/lib/python3.7/urllib/request.py", line 569, in error
    return self._call_chain(*args)
  File "/usr/local/lib/python3.7/urllib/request.py", line 503, in _call_chain
    result = func(*args)
  File "/usr/local/lib/python3.7/urllib/request.py", line 649, in http_error_default
    raise HTTPError(req.full_url, code, msg, hdrs, fp)
urllib.error.HTTPError: HTTP Error 404: Not Found
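A possible workaround (a hypothetical sketch, not the project's code) would be to treat the 404 as a missing time step and skip it instead of aborting the whole download sequence:

```python
import bz2
import urllib.error
import urllib.request

def download_and_extract(url, dest):
    """Download a .bz2 file and write the decompressed payload to dest.

    Returns False when the file does not exist on the server (HTTP 404),
    so the caller can skip that time step and continue.
    """
    try:
        with urllib.request.urlopen(url) as resp:
            data = resp.read()
    except urllib.error.HTTPError as e:
        if e.code == 404:
            return False  # file not published for this time step
        raise
    with open(dest, "wb") as f:
        f.write(bz2.decompress(data))
    return True
```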
bjoern-reetz commented 3 years ago

Hello @pjpetersik,

you have perfectly identified the problem: downloader is not aware that the available time steps differ per parameter. Handling this automatically would be a great user experience, but I am sorry to say that we currently lack the staff to improve downloader at that scale. However, I have thought of a quick fix to mitigate the missing feature, and I may be able to integrate it into the source code and container image soon.

bjoern-reetz commented 3 years ago

(sorry, closed issue by mistake)

bjoern-reetz commented 3 years ago

I just added a new optional parameter --time-step-interval and released version 0.2.0. It is not the most user-friendly solution, but now you can finally download a bunch of relhum_2m data!

docker run --rm \
  --volume $(pwd):/mydata \
  deutscherwetterdienst/downloader downloader \
    --model icon-eps \
    --timestamp '2021-07-13 00:00:00' \
    --single-level-fields relhum_2m \
    --max-time-step 12 \
    --time-step-interval 6 \
    --directory /mydata 
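The effect of the new parameter can be sketched as follows (an illustration only; the actual implementation in downloader may differ):

```python
def time_steps(min_step, max_step, interval):
    """Lead times (in hours) the downloader will request."""
    return list(range(min_step, max_step + 1, interval))

# With --max-time-step 12 and --time-step-interval 6,
# only the published 6-hourly files are requested:
print(time_steps(0, 12, 6))  # [0, 6, 12]
```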
pjpetersik commented 3 years ago

Great. Thanks @bjoern-reetz.