Closed — automataIA closed this issue 8 months ago
If I remember correctly the author of that notebook is @ValentinaHutter, could you please have a look at this?
Found the solution for part of the problem! The current library lays out its module functions differently from what the notebook shows. This is the new import code:
from openeo_processes_dask.process_implementations import apply, ndvi, multiply, load_stac, core #, save_result
from openeo_processes_dask.process_implementations.core import process
The remaining problem is that it still can't find the `save_result` process.
A former colleague of mine created this example notebook - good to see that you already figured it out!
The `save_result` process is not available in this repository, because the process itself is backend specific: it depends on which formats the backend supports, where the results should be stored (file paths, file names), and what kind of input data the backend provides. (For a specific CRS you might want a specific grid and naming convention.) Therefore, we did not include a general implementation here.
In your notebook, you could implement save_result by using
data.to_netcdf(<filename>)
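The suggestion above can be sketched as a tiny `save_result` wrapper around xarray's `to_netcdf` method. Note this is a hypothetical helper for your own notebook, not an API provided by openeo-processes-dask; the function name and the format dispatch are assumptions:

```python
from pathlib import Path

def save_result(data, filename, format="netCDF"):
    """Hypothetical save_result sketch: write `data` to `filename`.

    `data` is expected to be an xarray DataArray/Dataset, which provides
    the .to_netcdf() method used in the notebook suggestion above.
    """
    path = Path(filename)
    if format.lower() in ("netcdf", "nc"):
        data.to_netcdf(path)  # xarray writes the file to disk
    else:
        # Other formats (GeoTIFF, ...) would need backend-specific code
        raise NotImplementedError(f"format {format!r} is not supported")
    return path
```

A real backend would extend the format dispatch with whatever output formats it advertises to clients.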
Thanks for your answer. I'm actually still trying to understand in general what the parts of the backend are (particularly python) and how they interact with each other. :cry:
This repository and openeo-pg-parser-networkx are the two parts that handle the incoming process graphs, see https://github.com/Open-EO/openeo-pg-parser-networkx/blob/main/README.md. Neither repository includes code to load or save backend-specific data. So, for your backend, you would need another repository where you implement your own versions of load_collection and save_result. As their specifications are available in openeo-processes, you can add them to a process registry. :)
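To make the "add them to a process registry" idea concrete, here is a minimal, generic sketch of a registry mapping openEO process ids to Python callables. The real `ProcessRegistry` in openeo-pg-parser-networkx has its own API; the decorator and the stub `save_result` below are illustrative assumptions only:

```python
# Hypothetical minimal process registry: maps openEO process ids to
# the callables that implement them for a particular backend.
registry = {}

def register(process_id):
    """Decorator that stores a function under the given process id."""
    def wrapper(func):
        registry[process_id] = func
        return func
    return wrapper

@register("save_result")
def save_result(data, format="netCDF"):
    # A real backend would write `data` out here; this stub only
    # reports what it would do.
    return f"would save as {format}"
```

When the process graph parser encounters a `save_result` node, it can then look up `registry["save_result"]` and call your backend-specific implementation.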
Also you can read more about process graphs in the API specification https://api.openeo.org/#section/Processes
For save_result, you could start from my implementation here: https://github.com/SARScripts/openeo_odc_driver/blob/dask_processes/openeo_odc_driver/processing.py
There's also a load_collection implementation using opendatacube, but if you're using load_stac you don't need it.
Using the notebook 01_minibackend_demo.ipynb, I tried to import the modules. It tells me that they aren't there; in fact I can't find them among those available with the dir command. I tried the various installation options, including the simple version, but got the same results. This is the file and directory structure of the openeo-processes-dask library in the directory of my venv:
~/.pyenv/versions/3.9.18/envs/backend/lib/python3.9/site-packages/openeo_processes_dask
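When imports fail like this, it can help to list the submodules the installed package actually ships, rather than guessing from the notebook. The sketch below uses the standard library's `pkgutil`; it is demonstrated on the stdlib `json` package so it is self-contained, but in practice you would pass `"openeo_processes_dask.process_implementations"`:

```python
import importlib
import pkgutil

def list_submodules(package_name):
    """Return the sorted names of the submodules of an installed package."""
    pkg = importlib.import_module(package_name)
    # __path__ exists for packages; iter_modules walks their contents
    return sorted(m.name for m in pkgutil.iter_modules(pkg.__path__))

# Self-contained demo with a stdlib package:
print(list_submodules("json"))
```

Comparing that listing against the imports in the notebook shows immediately which module names have moved or been renamed between library versions.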