NASA-IMPACT / VEDA

Overview of the VEDA Open Source project
https://docs.openveda.cloud/
Apache License 2.0

Accessing data from VEDA via WMS #1

Closed · billyz313 closed this issue 8 months ago

billyz313 commented 8 months ago

Hi,

I was wondering if it is possible to access data that is in VEDA via WMS calls. For example, I see https://www.earthdata.nasa.gov/dashboard/data-catalog/global-reanalysis-da in the data catalog. How would I make a GetCapabilities call for this data so I could display it on a Leaflet JS map?

j08lue commented 8 months ago

VEDA data services currently do not provide WMS, but they do provide WMTS and XYZ tiles.

We have a function in our data explorer that tells you the WMTS URL for the layer you are seeing on the map:

(Screenshot: the data explorer's layer options, showing the WMTS URL for the layer on the map)

You should be able to copy-paste the XYZ layer URL, for example, directly into Leaflet: https://leafletjs.com/reference.html#tilelayer

Please let us know whether this works for you.
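
If it helps, here is a minimal sketch using folium (a Python wrapper around Leaflet) rather than Leaflet JS directly; the tile URL below is only a placeholder for the one you copy from the explorer:

import folium

# Placeholder: paste the XYZ layer URL copied from the VEDA data explorer here
xyz_url = "https://example.com/tiles/{z}/{x}/{y}?params=..."

m = folium.Map(location=[0, 0], zoom_start=3)
folium.TileLayer(
    tiles=xyz_url,      # XYZ template with {z}/{x}/{y} placeholders
    attr="NASA VEDA",   # attribution is required for custom tile layers
    name="VEDA layer",
    overlay=True,
).add_to(m)
m  # in a notebook this renders the interactive map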

billyz313 commented 8 months ago

@j08lue Thank you, this is great. I am noticing that all of the time steps in the data have different IDs. Is there a request I can make to get all the IDs available for a dataset? For example, the link you shared was showing Evapotranspiration; I would like to make an AJAX call that gets the IDs and dates for the entire dataset so I can show them in a timeline. Is that possible? I can also use Python to make the request if needed, as my application runs on Django.

PS: XYZ tiles are perfect. I had already tested some from your data catalog and they work perfectly; I was just unsure how to generate the URLs programmatically. (I was manually pulling them from the browser network calls and replacing the x/y/z with template variables...)

j08lue commented 8 months ago

Yeah, unfortunately neither XYZ nor WMTS has time-series capabilities (something we hope to influence during our project here), so you will have to ask our catalog (STAC) for the available time steps and construct the URLs from those. That is what our frontend does.

You can see how to do that in Python (it's actually mostly REST API calls) in the example notebooks in our docs:

  1. This one uses filters and virtual mosaics: https://nasa-impact.github.io/veda-docs/notebooks/quickstarts/hls-visualization.html
  2. A simpler case is perhaps this one, which queries STAC and then queries the /stac/tilejson.json endpoint with the STAC collection and item IDs (f"{RASTER_API_URL}/stac/tilejson.json?collection={items[2]['collection']}&item={items[2]['id']}") to get back the XYZ layer URL (see the sketch after this list): https://nasa-impact.github.io/veda-docs/notebooks/datasets/ocean-npp-timeseries-analysis.html#visualizing-the-raster-imagery
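
In outline, that second approach looks roughly like this (a sketch only: the API hosts are the staging endpoints, the collection name is just an example, and a real client would add rescale and colormap parameters and paging as in the notebook):

import requests

STAC_API_URL = "https://staging-stac.delta-backend.com"
RASTER_API_URL = "https://staging-raster.delta-backend.com"
collection = "lis-global-da-evap"  # example collection

# Ask STAC for the items (time steps) in the collection
search = requests.post(
    f"{STAC_API_URL}/search",
    json={"collections": [collection], "limit": 100},
).json()

# For each item, request a TileJSON document and read the XYZ tile URL from it
for item in search["features"]:
    tilejson = requests.get(
        f"{RASTER_API_URL}/stac/tilejson.json",
        params={
            "collection": item["collection"],
            "item": item["id"],
            "assets": "cog_default",
        },
    ).json()
    print(item["properties"]["datetime"], tilejson["tiles"][0])
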
billyz313 commented 8 months ago

Thank you, that helped a great deal. I created a Python function to generate the base URL and get the beginning and end time for the dataset, which I pass to my client code:

import requests
from datetime import datetime


def generate_tile_urls(dataset):
    STAC_API_URL = "https://staging-stac.delta-backend.com"
    RASTER_API_URL = "https://staging-raster.delta-backend.com"
    collection_name = "lis-global-da-evap"  # hardcoded for now; could be derived from `dataset`

    # Collection metadata, used for the temporal extent (first/last time step)
    collection = requests.get(f"{STAC_API_URL}/collections/{collection_name}").json()
    temporal_interval = collection["extent"]["temporal"]["interval"][0]

    # Fetch the first item of the collection to read its rescale statistics
    response = requests.post(
        f"{STAC_API_URL}/search",
        json={
            "collections": [collection_name],
            "query": {"datetime": {"eq": temporal_interval[0]}},
            "limit": 1,
        },
    ).json()
    item = response["features"][0]
    item_stats = item["assets"]["cog_default"]["raster:bands"][0]["statistics"]
    rescale_values = item_stats["minimum"], item_stats["maximum"]

    # Ask the raster API for the XYZ tile URL template for this item
    tiles = requests.get(
        f"{RASTER_API_URL}/stac/tilejson.json?collection={item['collection']}&item={item['id']}"
        "&assets=cog_default"
        "&color_formula=gamma+r+1.05&colormap_name=rdbu_r"
        f"&rescale={rescale_values[0]},{rescale_values[1]}"
    ).json()

    # Replace the date portion of the item ID in the URL with a {date} placeholder
    # so the client can swap in any time step
    base_url = tiles["tiles"][0].replace(
        temporal_interval[0].split(" ")[0].replace("-", "") + "0000",
        "{date}",
    )

    start_date = datetime.strptime(temporal_interval[0], "%Y-%m-%d %H:%M:%S+%f")
    end_date = datetime.strptime(temporal_interval[1], "%Y-%m-%d %H:%M:%S+%f")

    return {
        "base_url": base_url,
        "start_date": start_date.strftime("%Y%m%d%H%M"),
        "end_date": end_date.strftime("%Y%m%d%H%M"),
    }
j08lue commented 8 months ago

I added the instructions I gave above to our VEDA Docs.

https://nasa-impact.github.io/veda-docs/pr-preview/pr-131/services/apis.html#using-tile-layers-in-external-services

Please close this issue if it is now complete, @billyz313.

(Side note - I assume you had not found our docs before you asked - probably an indication that we need to make them more prominent.)

billyz313 commented 8 months ago

Thank you, I had not found these docs. My next task is to make an API call to get timeseries data from a dataset based on a polygon and date range; the docs you just shared seem like they will be very helpful.
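
Following the pattern in the ocean NPP timeseries notebook linked above, a rough sketch of that next step might look like this (assuming the raster API exposes TiTiler's /cog/statistics endpoint as the notebook uses, and with placeholder collection name, date range, and polygon):

import requests

STAC_API_URL = "https://staging-stac.delta-backend.com"
RASTER_API_URL = "https://staging-raster.delta-backend.com"
collection = "lis-global-da-evap"  # placeholder collection

# Area of interest as a GeoJSON Feature (placeholder coordinates)
aoi = {
    "type": "Feature",
    "properties": {},
    "geometry": {
        "type": "Polygon",
        "coordinates": [[[-90, 30], [-80, 30], [-80, 40], [-90, 40], [-90, 30]]],
    },
}

# Find all items (time steps) within the date range
search = requests.post(
    f"{STAC_API_URL}/search",
    json={
        "collections": [collection],
        "datetime": "2021-01-01T00:00:00Z/2021-03-31T23:59:59Z",
        "limit": 100,
    },
).json()

# For each time step, request zonal statistics over the polygon
timeseries = []
for item in search["features"]:
    stats = requests.post(
        f"{RASTER_API_URL}/cog/statistics",
        params={"url": item["assets"]["cog_default"]["href"]},
        json=aoi,
    ).json()
    timeseries.append((item["properties"]["datetime"], stats))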