Configuration for credentials: https://github.com/Open-EO/openeo-geotrellis-extensions/blob/f0f81fdb504293601b4d4939cc7f1d2c3eb6c1b4/openeo-geotrellis/src/main/scala/org/openeo/geotrellis/CustomizableHttpRangeReaderProvider.scala#L44

Java system property: `http.credentials.file`

File contents:

```json
{
  "services.terrascope.be": {
    "username": "username",
    "password": "plain_text_pw"
  }
}
```
`services.terrascope.be` is the hostname used in asset URLs for which basic authentication is required.
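As a hedged sketch of how this could be wired together: the snippet below writes the credentials file and forwards the `http.credentials.file` system property to the driver JVM via spark-submit. Only the property name and the JSON layout come from the code linked above; the file path, the `spark.driver.extraJavaOptions` route and the job script name are assumptions about the deployment.

```python
# Hedged sketch: write the credentials file and forward the
# "http.credentials.file" system property to the driver JVM.
# Only the property name and the JSON layout come from the linked code;
# the file path, the spark-submit route and the job script are assumptions.
import json
import subprocess

creds_path = "http_credentials.json"  # hypothetical location; any path readable by the JVM works
with open(creds_path, "w") as f:
    json.dump(
        {"services.terrascope.be": {"username": "username", "password": "plain_text_pw"}},
        f,
        indent=2,
    )

subprocess.run(
    [
        "spark-submit",
        # Standard Spark mechanism for setting a JVM system property on the driver.
        "--conf", f"spark.driver.extraJavaOptions=-Dhttp.credentials.file={creds_path}",
        "run_batch_job.py",  # hypothetical batch job entry point
    ],
    check=True,
)
```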
Update:
- Running the docker command executes the process graph in the current directory and writes the results to `./out`. TODO: make the input and output configurable, maybe with a stub Python client local `Connection` object? _EDIT: a path to `processgraph.json` can be passed now._ (See the sketch below for a hypothetical invocation.)
- Running the docker image required `sudo`. TODO: can it run as the local user? Output files should be written with the current user's permissions; they currently appear to be owned by root. _EDIT: no sudo needed anymore._
- Logs are shown in the console while running, and JSON-structured logs are written to `out/openeo.log`.
- The implementation currently consists of `Dockerfile`, `entrypoint.sh` and `test_run_graph.py`. TODO: find a better place to store those files. _EDIT: code now here._
openeo_docker_local.zip
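For reference, a minimal sketch of the docker invocation described in the list above, assuming the image is tagged `openeo-local` and mounts the working directory at `/opt/work`; the actual image name and paths are defined by the attached Dockerfile and entrypoint.sh.

```python
# Hedged sketch of the local batch-job invocation described above.
# The image tag "openeo-local" and the container-side paths are assumptions;
# the real names are defined by the attached Dockerfile and entrypoint.sh.
import os
import pathlib
import subprocess

workdir = pathlib.Path.cwd()
subprocess.run(
    [
        "docker", "run", "--rm",
        # Run as the invoking user so files under ./out are not owned by root.
        "--user", f"{os.getuid()}:{os.getgid()}",
        "-v", f"{workdir}:/opt/work",   # mount the current directory (assumed mount point)
        "openeo-local",                 # hypothetical image tag
        "/opt/work/processgraph.json",  # path to the process graph (now configurable)
    ],
    check=True,
)
print("Results and JSON-structured logs should appear under", workdir / "out")
```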
Nice, this seems to almost work as intended. One of the most important TODOs for the deadline is documentation. The basic page for EOEPCA is here: https://eoepca.readthedocs.io/projects/processing/en/latest/design/processing-engine/openeo/ but it actually links to: https://github.com/Open-EO/openeo-geopyspark-driver/blob/master/README.md

Can you update that markdown to explain the current steps for running a batch job in docker? (Or put it in a separate file if that's too much.)
EPIC: https://github.com/Open-EO/openeo-geopyspark-driver/issues/806