jdries opened 5 months ago
Change the computation of the LayoutDefinition in RegridFixed to shift the extent by the required amount, so that the origin ends up at the corner of a window rather than at the corner of the full chunk.
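Roughly what I have in mind, as a sketch only (not the actual RegridFixed code; the helper name and the sign of the shift are assumptions):

```scala
import geotrellis.layer.LayoutDefinition
import geotrellis.vector.Extent

// Hypothetical sketch: translate the layout extent by the pixel buffer so that
// the layout origin lines up with the corner of an unbuffered window instead of
// the corner of the buffered chunk. The direction of the shift depends on which
// corner the buffer was applied to.
def shiftLayoutByBuffer(layout: LayoutDefinition, bufferPixels: Int): LayoutDefinition = {
  val dx = bufferPixels * layout.cellwidth
  val dy = bufferPixels * layout.cellheight
  // Pure translation: the extent keeps its size, only the origin moves.
  val shifted = Extent(
    layout.extent.xmin - dx,
    layout.extent.ymin + dy,
    layout.extent.xmax - dx,
    layout.extent.ymax + dy
  )
  LayoutDefinition(shifted, layout.tileLayout)
}
```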
The problem is: how do we know for sure whether the extent was already buffered in the first place to accommodate this? Also, when we undo the regridding, we'll have to take this into account.
Perhaps we need to store the applied buffer in the datacube metadata somewhere?
Alternatively, already apply the buffering in load_collection. This would be a 'push down' of apply_neighborhood parameters. This solution will not work if other complex processes are in between. It also needs to play well with resample processes: it can only work when the resampling parameters are pushed down as well.
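To illustrate the push-down idea, something like this on the load side (a hypothetical helper, not the existing load_collection code):

```scala
import geotrellis.raster.CellSize
import geotrellis.vector.Extent

// Hypothetical illustration of the push-down: load_collection enlarges the
// requested extent by the apply_neighborhood overlap before reading, so that
// edge windows are filled with real pixels instead of nodata.
def bufferedReadExtent(requested: Extent,
                       cellSize: CellSize,
                       pixelBufferX: Int,
                       pixelBufferY: Int): Extent =
  Extent(
    requested.xmin - pixelBufferX * cellSize.width,
    requested.ymin - pixelBufferY * cellSize.height,
    requested.xmax + pixelBufferX * cellSize.width,
    requested.ymax + pixelBufferY * cellSize.height
  )
```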
apply_neighborhood sets pixel_buffer in load_collection to load extra data at the edges.
However, the retiling code first performs the retiling and only then buffers the tiles, with the consequence that edge tiles still get padded with nodata.
https://github.com/Open-EO/openeo-geotrellis-extensions/blob/292afa02812903dd9c6f2b2ff0520671f6e5a825/openeo-geotrellis/src/main/scala/org/openeo/geotrellis/OpenEOProcesses.scala#L949
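A simplified, hypothetical rendering of that order (not the actual OpenEOProcesses code), using plain GeoTrellis regrid/bufferTiles:

```scala
import geotrellis.layer.SpatialKey
import geotrellis.raster.MultibandTile
import geotrellis.raster.buffer.BufferedTile
import geotrellis.spark._
import org.apache.spark.rdd.RDD

// Hypothetical sketch of the problematic order: the cube is regridded to the
// apply_neighborhood window size first, and the overlap buffer is applied
// afterwards. Tiles at the edge of the layout have no neighbours to borrow
// pixels from, so bufferTiles pads them with nodata.
def retileThenBuffer(cube: MultibandTileLayerRDD[SpatialKey],
                     windowSize: Int,
                     overlap: Int): RDD[(SpatialKey, BufferedTile[MultibandTile])] = {
  val regridded = cube.regrid(windowSize) // 1. retile to windowSize x windowSize chunks
  regridded.bufferTiles(overlap)          // 2. then buffer: edge tiles get nodata padding
}
```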
The effect is also visible in this image, which was produced with a window size of 48 and an 8-pixel overlap: