Open-EO / openeo-python-client

Python client API for OpenEO
https://open-eo.github.io/openeo-python-client/
Apache License 2.0

load_uploaded_files not implemented? #177

Closed: kempenep closed this issue 1 year ago

kempenep commented 3 years ago

I am looking for the process load_uploaded_files, see specs. Has this been implemented in the openeo-python-client?

I also noted that when using:

polygonal_mean_timeseries('path')

the process graph is calling read_vector. This process_id is not listed in the specs; should this be load_uploaded_files instead?

soxofaan commented 3 years ago

load_uploaded_files is indeed not implemented as a method or function in the Python client yet. You can, however, build a cube using the generic datacube_from_process method. Something like this:

cube = con.datacube_from_process("load_uploaded_files", paths=["/path/to/....", "..."], format="...")
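
For completeness, a fuller sketch of that approach (the connection URL, file paths and format below are placeholders, and the exact format name depends on the back-end):

import openeo

con = openeo.connect("https://openeo.example.com")  # placeholder back-end URL

# datacube_from_process builds a raw process node for any process id,
# including processes the client has no dedicated helper for yet.
cube = con.datacube_from_process(
    "load_uploaded_files",
    paths=["/path/to/file1.tif", "/path/to/file2.tif"],  # placeholder paths
    format="GTiff",  # placeholder format
)

# The result is a regular DataCube, so further processes can be chained
# and the result can be downloaded:
cube.download("result.tiff")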

The read_vector in polygonal_mean_timeseries is a VITO-specific process at the moment. See #104.

kempenep commented 3 years ago

Thanks for the hint. The function polygonal_mean_timeseries('path') would suit my needs, and I can implement a read_vector process in our back-end. However, as long as read_vector is not part of the specs, the graph parser from @lforesta will complain (or I would have to include a proprietary definition in our own specs).

As for building a cube using the generic datacube_from_process, is it possible to load a vector? When I try to load a GeoJSON, I get:

UserWarning: No cube:dimensions metadata
  complain("No cube:dimensions metadata")

How would it be combined with the polygonal_mean_timeseries? Or is there a more generic way to call the aggregate_spatial function?

soxofaan commented 3 years ago

that "no cube:dimension metadata" is just a warning, it shouldn't block you. Can you provide a bit more source code? It's a bit hard to guess what you are trying to do.

kempenep commented 3 years ago

Sure, I want a simple aggregate_spatial from a datacube, e.g., ndvi obtained via:

datacube = connection.load_collection("EarthObservation.Copernicus.S2.scenes.source.L2A")
datacube = datacube.filter_bbox(west=5.251809, south=51.705835, east=5.462144, north=51.838069)
datacube = datacube.filter_temporal(start_date="2020-01-01", end_date="2020-03-01")
datacube = datacube.filter_bands(["B4", "B8"])
ndvi = datacube.ndvi(nir="B8", red="B4")

I have tried this:

ndvi = ndvi.aggregate_spatial('/path/to/vector.json', 'median')

But this calls the read_vector process.

From your hint, I figured the vector could be taken as an uploaded cube (?):

ndvi = ndvi.aggregate_spatial(con.datacube_from_process("load_uploaded_files", paths=["/path/to/....", "..."], format="..."), 'median')

But then the cube is not accepted by this process, as geometries are expected (error message):

OpenEoClientException: Invalid `geometries`: <openeo.rest.datacube.DataCube object at 0x7f4058cd44f0>
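
A possible workaround here (a sketch, assuming aggregate_spatial also accepts inline GeoJSON-style geometries as a plain dict, so the vector file is read client-side instead of on the back-end):

import json

# Read the vector file locally and pass the GeoJSON dict directly,
# instead of a path that the back-end would have to resolve:
with open("/path/to/vector.json") as f:
    geometries = json.load(f)

ndvi_timeseries = ndvi.aggregate_spatial(geometries, "median")
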
lforesta commented 3 years ago

However, as long as read_vector is not a part of the specs, the graph parser from @lforesta will complain (or I would have to include a proprietary definition in our own specs).

Indeed, you can add a custom predefined process to your back-end. Otherwise the validation step will complain, since that process does not appear in your supported processes.
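
For illustration, a minimal, hypothetical definition of such a read_vector process (shown as a Python dict of the JSON document the back-end would expose under GET /processes; the parameter name and schemas are assumptions):

read_vector_spec = {
    "id": "read_vector",
    "summary": "Read vector geometries from a file accessible to the back-end",
    "parameters": [
        {
            "name": "filename",  # hypothetical parameter name
            "description": "Path to a vector file on the back-end",
            "schema": {"type": "string"},
        }
    ],
    "returns": {
        "description": "The parsed geometries",
        "schema": {"type": "object", "subtype": "geojson"},
    },
}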

soxofaan commented 1 year ago

load_uploaded_files support was added in commit b1b54744e2c3511124b2163d543e4161966b9691.

load_geojson and load_url have been added too, under #424 and #457.
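
Roughly, usage of these newer helpers looks like this (a sketch; the connection URL, collection id and coordinates are placeholders, and exact signatures may differ between client versions):

import openeo

con = openeo.connect("https://openeo.example.com")  # placeholder back-end URL
cube = con.load_collection("SENTINEL2_L2A")  # placeholder collection id

# Inline client-side GeoJSON as a vector cube (assumed signature):
polygons = con.load_geojson({
    "type": "Polygon",
    "coordinates": [[[5.25, 51.70], [5.46, 51.70], [5.46, 51.84], [5.25, 51.84], [5.25, 51.70]]],
})

# Or a vector resource fetched from a public URL (assumed signature):
# polygons = con.load_url("https://example.com/vector.geojson", format="GeoJSON")

# Either result can be passed as geometries for spatial aggregation:
timeseries = cube.aggregate_spatial(polygons, "median")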