eurodatacube / eodash

Software behind the RACE dashboard by ESA and the European Commission (https://race.esa.int), the Green Transition Information Factory - GTIF (https://gtif.esa.int), as well as the Earth Observing Dashboard by NASA, ESA, and JAXA (https://eodashboard.org)

Add BOKU drought vulnerability & crop-growth potential to GTIF AT #2471

bschumac opened 7 months ago

bschumac commented 7 months ago

EODC was tasked with proposing these new datasets as new layers in GTIF:

Drought vulnerability

Crop growth potential

The layer descriptions can be found in the STAC descriptions. Please note that BOKU explicitly requested a contact field. For Intra-field crop growth potential it is Contact: francesco.vuolo@boku.ac.at or clement.atzberger@boku.ac.at, and for Drought vulnerability it is Contact: clement.atzberger@boku.ac.at.

Please also include the BOKU logo at the end of the description page.

The layer stylings:

Drought vulnerability (0-5) image

Intra-field crop growth potential (0-254) image

Patrick1G commented 6 months ago

To be integrated as:

@santilland please make both available as Background Layers for all energy tools.

@bschumac we would need BOKU to fill the data descriptor template for both layers, see attached Product_data_descriptors.docx

santilland commented 6 months ago

Hello @bschumac, thank you for the information and the ticket. It seems that the referenced datasets do not allow cross-origin requests, which would mean we would need to copy the data to a new location. Would it be possible for you to set the headers to allow cross-origin requests?

senmao commented 6 months ago

Hi @santilland, we have enabled CORS for the GTIF website as shown below. Please try again and let us know if it works for you.

access-control-allow-methods: GET
access-control-allow-origin: https://gtif.esa.int
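
As a quick sanity check, a cross-origin request from the GTIF side should now succeed with these headers in place. A minimal sketch (the asset URL is the one used in the curl example further down the thread):

```ts
// Minimal sketch: a cross-origin GET that only works if the CORS headers
// above are set. If CORS were still blocked, fetch() would reject.
const url =
  'https://data.eodc.eu/collections/GTIF/MCD13A2.Y20032018.006.globalV1.1_km_10_days_NDVI.O4.VCI.lethr35_1327_ge09_AT.tif';

fetch(url, { mode: 'cors' })
  .then((response) => {
    console.log('status:', response.status);
    console.log('content-type:', response.headers.get('content-type'));
    console.log('content-length:', response.headers.get('content-length'));
  })
  .catch((err) => console.error('request blocked or failed:', err));
```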

santilland commented 6 months ago

Hello @senmao, thanks for adding the CORS headers for GTIF. CC @bschumac: I tried to integrate the data into the client now and was not getting it to work. After further inspection it seems that the files are not valid Cloud Optimized GeoTIFFs (COG), as they are not tiled and have no overviews. Testing with rio-cogeo validate:

The file is greater than 512xH or 512xW, but is not tiled
eodc/201210421_yield_potential_2017_2020_v8.tif is NOT a valid cloud optimized GeoTIFF
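
As a side note, the tiling/overview situation can also be confirmed from the browser side with geotiff.js (the reader OpenLayers uses under the hood). A rough sketch, with an assumed URL:

```ts
import { fromUrl } from 'geotiff';

// Rough sketch (assumed URL): report image size, tile size and the number of
// internal images. A plain striped GeoTIFF reports its tile width as the full
// image width, and an image count of 1 means there are no overviews.
async function inspectGeoTiff(url: string): Promise<void> {
  const tiff = await fromUrl(url);
  const imageCount = await tiff.getImageCount();
  const image = await tiff.getImage(0);

  console.log('size:', image.getWidth(), 'x', image.getHeight());
  console.log('tile size:', image.getTileWidth(), 'x', image.getTileHeight());
  console.log('internal images (1 = no overviews):', imageCount);
}

inspectGeoTiff('https://example.com/201210421_yield_potential_2017_2020_v8.tif')
  .catch(console.error);
```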
santilland commented 6 months ago

@bschumac @senmao Trying to load the drought-vulnerability-2003-2018 data (which is small enough not to need tiles/overviews), the requests fail. When making the request I get a 416 response error with the message detail: "range exceeded file end".

If I download the file and host it locally, everything works as expected.

Here is a screenshot of the response headers from your server: image. And here are the response headers when serving the same file from a local dev server: image

If I interpret things right, the content length is reported incorrectly by your server: content-length 34 compared with content-length 42407 from the dev server. The file is 42 KB.

Here is an example curl request:

curl -r 0-65536 https://data.eodc.eu/collections/GTIF/MCD13A2.Y20032018.006.globalV1.1_km_10_days_NDVI.O4.VCI.lethr35_1327_ge09_AT.tif --output test

Looking at the output, you can see that the range request exceeds the file end.

Maybe range requests need to be enabled on the server? I can't quite tell how the endpoint is configured.
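
For reference, a browser-side equivalent of the curl call above might look like the sketch below. Note that the custom Range header also makes the browser send a CORS preflight (OPTIONS) first:

```ts
// Sketch: reproduce the over-long range request and inspect the response.
// A server that clamps the range answers 206 with the partial file; here we
// instead observed 416 and a small JSON error body (hence content-length 34).
const url =
  'https://data.eodc.eu/collections/GTIF/MCD13A2.Y20032018.006.globalV1.1_km_10_days_NDVI.O4.VCI.lethr35_1327_ge09_AT.tif';

async function probeRange(): Promise<void> {
  const response = await fetch(url, { headers: { Range: 'bytes=0-65536' } });
  console.log('status:', response.status); // expected 206, observed 416
  console.log('content-type:', response.headers.get('content-type'));
  console.log('content-length:', response.headers.get('content-length'));
}

probeRange().catch(console.error);
```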

senmao commented 6 months ago

@santilland Range requests are already supported by our server. When sending a HEAD request for the data, you get the correct content-length of 42407, as shown below. image

So the requested range should not exceed the content length. I don't know why you got content-length 34; it would be helpful if you could show the request headers, not only the response headers.

senmao commented 6 months ago

> Hello @senmao, thanks for adding the CORS headers for GTIF. CC @bschumac: I tried to integrate the data into the client now and was not getting it to work. After further inspection it seems that the files are not valid Cloud Optimized GeoTIFFs (COG), as they are not tiled and have no overviews. Testing with rio-cogeo validate:
>
> The file is greater than 512xH or 512xW, but is not tiled
> eodc/201210421_yield_potential_2017_2020_v8.tif is NOT a valid cloud optimized GeoTIFF

For the file format, I need to check with my colleague who prepared this data.

santilland commented 6 months ago

thank you for the fast responses!

> So the requested range should not exceed the content length. I don't know why you got content-length 34; it would be helpful if you could show the request headers, not only the response headers.

I just realized it also says application/json in the response headers, so the reported size is already that of the error JSON returned when an out-of-range request is made. So I guess the client for loading GeoTIFFs has a minimum block size it requests, which goes over the limit; some servers "overlook" too-large ranges and just return whatever is there, while others respond with an error, depending on configuration. Could that be it?

senmao commented 6 months ago

@santilland Yes, that could be one possible reason. However, in order to find the root cause it would be good to have a clear understanding of how the frontend communicates with our backend. If you can monitor all the network traffic to our server and note down the request headers sent by the frontend and the response headers returned by the server, we can find out what causes the issue.

In general, I first send a HEAD request to the server to get the header information. Then I know whether it supports range requests and what the content-length is.
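
A minimal sketch of that diagnostic flow (HEAD first, then a ranged GET limited to the reported size), assuming an environment with fetch such as Node 18+ or a browser:

```ts
// Sketch of the suggested order: learn the size and range support via HEAD,
// then request only bytes that actually exist so the range can never exceed
// the file end. The URL is passed in by the caller.
async function probeFile(url: string): Promise<void> {
  const head = await fetch(url, { method: 'HEAD' });
  const size = Number(head.headers.get('content-length'));
  console.log('accept-ranges:', head.headers.get('accept-ranges'));
  console.log('content-length:', size);

  const get = await fetch(url, { headers: { Range: `bytes=0-${size - 1}` } });
  console.log('GET status:', get.status); // 206 Partial Content if ranges are honoured
}
```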

santilland commented 6 months ago

We are using the OpenLayers GeoTIFF source to load and visualize the data. In the docs I see that the default blockSize (65536) is what goes over the current file size: https://openlayers.org/en/latest/apidoc/module-ol_source_GeoTIFF.html#~GeoTIFFSourceOptions

I have been experimenting with a smaller block size, but then it somehow decides to only fetch one part and stops. If I use the same block size as the content size, I get a strange NS_ERROR_NET_PARTIAL_TRANSFER error. Sometimes it sends an OPTIONS request first, but I don't know exactly when; with that one I have CORS issues when testing locally ("CORS Preflight Did Not Succeed"). In any case, it does not seem to send a HEAD request first. Here is how it tries to load the data if I use the expected file size: image

Not sure how to tackle this.
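
For context, the block size is passed to geotiff.js through the source options of the OpenLayers GeoTIFF source, roughly like the sketch below (the smaller blockSize value is just an illustrative assumption, not a recommendation):

```ts
import GeoTIFF from 'ol/source/GeoTIFF';

// Sketch of the experimental setup: `sourceOptions` is forwarded to geotiff.js,
// whose `blockSize` defaults to 65536 bytes, i.e. larger than the ~42 KB
// drought-vulnerability file.
const source = new GeoTIFF({
  sources: [
    {
      url: 'https://data.eodc.eu/collections/GTIF/MCD13A2.Y20032018.006.globalV1.1_km_10_days_NDVI.O4.VCI.lethr35_1327_ge09_AT.tif',
    },
  ],
  sourceOptions: {
    blockSize: 16384, // illustrative smaller value tried while debugging
  },
});
```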

senmao commented 6 months ago

@santilland OK, to make things easy, I now reset the range end to the file size on the server side whenever the requested range end exceeds the file size. So a range request whose range end is larger than the file size now works.

image
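
For readers following along: the behaviour described here amounts to clamping the range end to the last byte of the file instead of rejecting the request. A generic sketch of that idea in a small Node handler (this is not the actual EODC backend, whose implementation is not shown in this thread):

```ts
import { createServer } from 'node:http';
import { createReadStream, statSync } from 'node:fs';

// Generic illustration of the clamping behaviour: if the requested range end
// exceeds the file, serve up to the last byte instead of answering 416.
// (A fully correct server would still return 416 when the range *start* is
// already past the end of the file.)
const FILE = './drought-vulnerability.tif'; // placeholder local file

createServer((req, res) => {
  const size = statSync(FILE).size;
  const match = /^bytes=(\d+)-(\d*)$/.exec(req.headers.range ?? '');

  if (!match) {
    // No range header: return the whole file.
    res.writeHead(200, { 'Content-Length': size, 'Accept-Ranges': 'bytes' });
    createReadStream(FILE).pipe(res);
    return;
  }

  const start = Number(match[1]);
  // Clamp the requested end to the last available byte.
  const end = Math.min(match[2] ? Number(match[2]) : size - 1, size - 1);

  res.writeHead(206, {
    'Content-Range': `bytes ${start}-${end}/${size}`,
    'Content-Length': end - start + 1,
    'Accept-Ranges': 'bytes',
  });
  createReadStream(FILE, { start, end }).pipe(res);
}).listen(8080);
```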

santilland commented 6 months ago

Great! Thank you @senmao, that solves the issue and allows loading the data for the drought vulnerability layer.

image

bschumac commented 5 months ago

Product_data_descriptors_BOKU_crop_growth_potential.docx Product_data_descriptors_BOKU_drought-vulnerability.docx

Here are the two data descriptors for the two requested datasets. Others will follow on 25/03/24.