Closed: claxn closed this issue 3 years ago.
The new bug_fixes branch does not work for me. For any graph I want to process, I get an error:
openeo_pg_parser/utils.py", line 266, in set_obj_elem_from_keys
obj[keys[0]] = value
IndexError: list index out of range
Is there something I should change or install with respect to the previous version in the master branch?
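For context, the traceback points at a helper that assigns a value into a nested dictionary by walking a list of keys. The following is a hypothetical reconstruction of that pattern (not the library's actual code): if the key list arrives empty, `keys[0]` raises exactly the `IndexError: list index out of range` shown above.

```python
def set_obj_elem_from_keys(obj, keys, value):
    """Set obj[k0][k1]...[kn] = value by walking the key path recursively."""
    if len(keys) == 1:
        obj[keys[0]] = value
    else:
        # an empty `keys` list falls through to here and keys[0] raises IndexError
        set_obj_elem_from_keys(obj[keys[0]], keys[1:], value)

# Example: set a nested argument inside a process-graph-like dict
graph = {"process_graph": {"dc": {"arguments": {}}}}
set_obj_elem_from_keys(graph, ["process_graph", "dc", "arguments", "id"], "S2")
```

This suggests the bug is upstream of the helper: some caller hands it an empty key path.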
It doesn't work for me either; for every process graph I get many KeyError messages.
For example, the D22 EVI graph fails at the parsing step with KeyError: "'mintime_11_out' is not a valid key."
{
"process_graph": {
"dc": {
"process_id": "load_collection",
"description": "Loading the data; The order of the specified bands is important for the following reduce operation.",
"arguments": {
"id": "openEO_S2_32632_10m_L1C_D22_test",
"spatial_extent": {
"west":11.279182434082033,
"south":46.464349400461145,
"east":11.406898498535158,
"north":46.522729291844286
},
"temporal_extent": [
"2018-06-14",
"2018-06-23"
],
"bands": [
"B08",
"B04",
"B02"
]
}
},
"evi": {
"process_id": "reduce_dimension",
"arguments": {
"data": {
"from_node": "dc"
},
"reducer": {
"process_graph": {
"nir": {
"process_id": "array_element",
"arguments": {
"data": {
"from_parameter": "data"
},
"index": 0
}
},
"sub": {
"process_id": "subtract",
"arguments": {
"x": {
"from_node": "nir"
},
"y": {
"from_node": "red"
}
}
},
"div": {
"process_id": "divide",
"arguments": {
"x": {
"from_node": "sub"
},
"y": {
"from_node": "sum"
}
}
},
"p3": {
"process_id": "multiply",
"arguments": {
"x": 2.5,
"y": {
"from_node": "div"
}
},
"result": true
},
"sum": {
"process_id": "sum",
"arguments": {
"data": [
1,
{
"from_node": "p1"
},
{
"from_node": "p2"
},
{
"from_node": "nir"
}
]
}
},
"red": {
"process_id": "array_element",
"arguments": {
"data": {
"from_parameter": "data"
},
"index": 1
}
},
"p1": {
"process_id": "multiply",
"arguments": {
"x": 6,
"y": {
"from_node": "red"
}
}
},
"blue": {
"process_id": "array_element",
"arguments": {
"data": {
"from_parameter": "data"
},
"index": 2
}
},
"p2": {
"process_id": "multiply",
"arguments": {
"x": -7.5,
"y": {
"from_node": "blue"
}
}
}
}
},
"dimension": "bands"
}
},
"mintime": {
"process_id": "reduce_dimension",
"description": "Compute a minimum time composite by reducing the temporal dimension",
"arguments": {
"data": {
"from_node": "evi"
},
"dimension": "temporal",
"reducer": {
"process_graph": {
"min": {
"process_id": "min",
"arguments": {
"data": {
"from_parameter": "data"
}
},
"result": true
}
}
}
}
},
"save": {
"process_id": "save_result",
"arguments": {
"data": {
"from_node": "mintime"
},
"format": "GTiff"
},
"result": true
}
}
}
@clausmichele interesting, I can parse correctly this EVI PG (which seems to be identical to the one you copy-pasted). I use the following cmd:
from openeo_pg_parser.translate import translate_process_graph
graph = translate_process_graph(process_graph_json, process_defs=process_defs).sort(by='dependency')
For your process graph in my case it returns the same error:
KeyError: "'mintime_11_out' is not a valid key."
Anyway, I've installed this bug_fixes branch in this way:
git clone https://github.com/Open-EO/openeo-pg-parser-python.git --branch bug_fixes
cd openeo-pg-parser-python
python setup.py install
@clausmichele so you don't install the library in a dedicated environment? Maybe you already have a previous version that clashes; could you try with a pipenv/conda environment?
Yes, I'm using conda and I deleted the previous version. Anyway, I've just created a new env with
conda create -n parser_test python=3.6
reinstalled the parser and ran the code:
from openeo_pg_parser.translate import translate_process_graph
graph = translate_process_graph('./process_graphs/EVI_EODC.json').sort(by='dependency')
where EVI_EODC.json is this.
I get the same error. Please note that I can't set process_defs=process_defs
since I do not know where to get this variable.
The description of that var is here, line 269.
Could you try with process_defs='https://openeo.eodc.eu/v1.0/processes'? @claxn what happens when this var is None? Is no validation done to check whether the processes used in the process graph are present?
With var=None it reads the processes in ./processes. I'm pretty sure about this, since I always left it None and added new json processes there (as I did for the last pull request with climatological_normal and anomaly).
Edit: setting process_defs='https://openeo.eodc.eu/v1.0/processes' doesn't change anything; the error persists.
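To make the three options for process_defs discussed above concrete, here is an illustrative sketch (not the parser's actual implementation) of how such an argument is typically resolved: None falls back to a local ./processes folder of JSON files, a URL is fetched from a back-end's /processes endpoint, and anything else is passed through. The function name and fallback directory are assumptions for this example.

```python
import glob
import json
import os
from urllib.request import urlopen

def load_process_defs(process_defs=None, local_dir="./processes"):
    """Resolve process definitions from a local folder, a URL, or pass them through."""
    if process_defs is None:
        # fallback: read every *.json process definition bundled with the package
        defs = []
        for path in glob.glob(os.path.join(local_dir, "*.json")):
            with open(path) as f:
                defs.append(json.load(f))
        return defs
    if isinstance(process_defs, str) and process_defs.startswith("http"):
        # openEO back-ends list their processes under GET /processes
        with urlopen(process_defs) as resp:
            return json.load(resp)["processes"]
    return process_defs
```

With a resolver like this, validation against the known processes is only as good as the definitions found, which would explain why a stale or empty ./processes folder changes the behaviour.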
I have the same error as @clausmichele on the JEODPP platform
When checking the differences between the bug_fixes branch and the master branch, I noticed that bug_fixes does not seem to decode the from_parameter in the reduce_temporal step. Printing out the parsed node, the from_parameter remains data, whereas in master it gets parsed into from_node: evi_2.
parsing with bug_fixes branch:
Node ID: min_13
Node Name: min
{'arguments': {'data': {'from_parameter': 'data'}},
'process_id': 'min',
'result': True}
parsing with master branch:
Node ID: min_13
Node Name: min
{'arguments': {'data': {'from_node': 'evi_2'}},
'process_id': 'min',
'result': True}
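Since the parser deliberately leaves 'from_parameter' references untouched (see the maintainer's reply below regarding issues #24 and #25), a caller who needs master's old behaviour can resolve them against the parent node's arguments. This is a hypothetical helper for illustration, not part of the library:

```python
def resolve_from_parameter(node_args, parent_args):
    """Replace {'from_parameter': name} entries with the parent's matching argument."""
    resolved = {}
    for key, val in node_args.items():
        if isinstance(val, dict) and "from_parameter" in val:
            # substitute the parent's argument of the same name, if present
            resolved[key] = parent_args.get(val["from_parameter"], val)
        else:
            resolved[key] = val
    return resolved

# the 'min' reducer node from the printout above, and its parent 'mintime' node's data argument
min_args = {"data": {"from_parameter": "data"}}
parent_args = {"data": {"from_node": "evi_2"}}
```

Applying the helper turns the bug_fixes-style output into the master-style output shown above.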
Hopefully this helps.
@kempenep @lforesta @clausmichele: First, thanks for testing, and sorry for my late reply! There was a minor bug in the new sorting algorithm. I have fixed it, and the EVI graph from @clausmichele now works for me. Could you please try again? @kempenep: regarding the 'from_parameter' behaviour, everything is correct from my point of view, since in https://github.com/Open-EO/openeo-pg-parser-python/issues/24 and https://github.com/Open-EO/openeo-pg-parser-python/issues/25 we agreed not to define any 'from_parameter' behaviour in the parser. I will extend the documentation today so that you see how the parser can help you find all input data or process nodes.
Now it works for me too; I'll wait for the new documentation to fully understand the changes from the previous version.
I have updated and extended the documentation inside the Jupyter Notebook and HTML file. Please check if everything is understandable and works properly.
@claxn for completeness it would be nice to have an example of the use of has_descendant_process in the notebook.
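Until the notebook example lands, here is a hedged sketch of what a has_descendant_process-style check could look like on a plain dict-based graph (node id mapped to child ids); the library's actual signature and graph classes may differ, so every name here is an assumption.

```python
def has_descendant_process(edges, start, process_ids, target):
    """Return True if any node reachable from `start` runs process `target`."""
    stack, seen = [start], set()
    while stack:
        node = stack.pop()
        if node in seen:
            continue
        seen.add(node)
        if node != start and process_ids.get(node) == target:
            return True
        stack.extend(edges.get(node, []))  # walk downstream consumers
    return False

# toy graph mirroring the EVI example: dc -> evi -> mintime -> save
edges = {"dc": ["evi"], "evi": ["mintime"], "mintime": ["save"]}
pids = {"dc": "load_collection", "evi": "reduce_dimension",
        "mintime": "reduce_dimension", "save": "save_result"}
```

For instance, asking whether anything downstream of dc eventually calls save_result answers True, while nothing downstream of save calls load_collection.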
All working for me now. +1 for a merge into master
@clausmichele are there any other open issues that should be addressed in this PR? Otherwise, @claxn, could you merge?
for me it's also ok!
@clausmichele @lforesta: Please test whether the new version works for you. The behaviour of a few functions has changed and a few new ones have been added, so that you hopefully get all the information you need at a node. Please note that the sorting behaviour is also now a bit different, now actually reflecting the right order of process calls. I will update the README, Jupyter Notebook, CHANGELOG, etc. within the next days.
If you have any ideas or requests for improving or adding missing functionality, please let me know.
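The dependency sort mentioned above (.sort(by='dependency')) amounts to a topological order in which each node comes after every node it consumes via 'from_node'. A minimal sketch of that idea, not the library's implementation, and ignoring list-valued arguments like the data array of the sum node:

```python
def sort_by_dependency(pg):
    """Order process-graph nodes so every 'from_node' dependency comes first."""
    # collect, per node, the set of node names it references via 'from_node'
    deps = {}
    for name, node in pg.items():
        deps[name] = {v["from_node"] for v in node.get("arguments", {}).values()
                      if isinstance(v, dict) and "from_node" in v}
    order, done = [], set()
    while len(order) < len(pg):
        progressed = False
        for name in pg:
            if name not in done and deps[name] <= done:
                order.append(name)
                done.add(name)
                progressed = True
        if not progressed:
            raise ValueError("cycle detected in process graph")
    return order
```

Run on the EVI example's top-level graph, this yields the order of process calls a back-end would actually execute: load the collection first, save the result last.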