rhugonnet opened this issue 1 year ago
To clarify on the `Coreg.apply` for chunks: an `apply_matrix_coords` function that can transform the 4x4 matrix into a `Callable` that takes 2D coordinates as input, and outputs the affine transformation at any given pixel (like is currently done in `BiasCorr` classes with `fit_func`). Is that something we can do currently, @erikmannerfelt?

Any thoughts @adehecq @erikmannerfelt? 😄
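A minimal sketch of the idea above, assuming the matrix is a plain 4x4 numpy array and ignoring the rotation-center question for now. Function and variable names here are illustrative, not xdem's API:

```python
import numpy as np

def matrix_to_coord_func(matrix: np.ndarray):
    """Turn a 4x4 affine matrix into a callable evaluated at given coordinates."""
    assert matrix.shape == (4, 4)

    def apply_at(x, y, z):
        # Stack inputs into homogeneous coordinates of shape (4, N)
        pts = np.vstack([np.ravel(x), np.ravel(y), np.ravel(z), np.ones(np.size(x))])
        out = matrix @ pts
        # Reshape transformed coordinates back to the input pixel grid
        return tuple(out[i].reshape(np.shape(x)) for i in range(3))

    return apply_at

# Example: a pure horizontal shift of (+10, -5), leaving z unchanged
shift = np.eye(4)
shift[0, 3], shift[1, 3] = 10.0, -5.0
func = matrix_to_coord_func(shift)
x, y, z = func(np.array([0.0]), np.array([0.0]), np.array([100.0]))
```

Because the returned callable only needs the coordinates of the pixels it is given, it could in principle be evaluated chunk by chunk.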
Wow, I'm a bit too overwhelmed by the list of things left to do to have any useful feedback for the moment. It's great you made an exhaustive list of all the things to be considered for coreg. Now we need two things:

What was your idea regarding this project? Did you want to work on it in the coming days/weeks? We should try to set some blocks of time to work together on this. We can coordinate over Slack or face-to-face.
Yes, it is a bit overwhelming like this, haha, but thankfully some of these points are fairly small! :slightly_smiling_face:

We'll need more detailed lists, but I think those can live within each PR that addresses a new step/feature listed above. The architecture work will have to be one big PR, as everything is inter-dependent through the structure.

With all the discussions and the work already done in #158 trying to make the structure generic and robust, I have a clear vision of the technical steps to go through to reach every point. And I don't want to wait too long and lose it! I'd be happy to write a more detailed plan, discuss it, and combine efforts if you both can block some days for it, but I can otherwise push on my own in the frame of the coreg study! :wink:

My idea was to start on this full-time at the end of September (for the co-registration study to reach a good stage before AGU).
One year later, we've addressed most of the points of this (long) issue, and the core coregistration module is now close to being finalized! :partying_face: :balloon:

Now that the structure, input support and modular arguments are fully consistent across all coregistration methods, we can more easily address the three remaining aspects (which are either about maintenance or about adding new features):

Another point that comes to mind would be to add `CoregFilter` classes, to be used more easily in the `CoregPipeline`; but right now the user can always customize their input `inlier_mask` separately to remedy that.
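As an illustration of customizing an `inlier_mask` by hand, here is a minimal numpy-only sketch combining a stable-terrain mask with an NMAD-based outlier filter. The function name and the 3×NMAD threshold are illustrative choices, not xdem's API:

```python
import numpy as np

def build_inlier_mask(dh: np.ndarray, stable: np.ndarray, nmad_factor: float = 3.0) -> np.ndarray:
    """Keep stable-terrain pixels whose elevation difference lies within
    nmad_factor * NMAD of the median difference on stable terrain."""
    med = np.nanmedian(dh[stable])
    # Normalized median absolute deviation, a robust spread estimate
    nmad = 1.4826 * np.nanmedian(np.abs(dh[stable] - med))
    return stable & (np.abs(dh - med) < nmad_factor * nmad)

# Synthetic example: Gaussian elevation differences with one gross outlier
rng = np.random.default_rng(42)
dh = rng.normal(0, 1, (100, 100))
dh[0, 0] = 100.0  # gross outlier that the filter should reject
stable = np.ones((100, 100), dtype=bool)
mask = build_inlier_mask(dh, stable)
```

A mask built this way can then be passed wherever a boolean `inlier_mask` array is accepted.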
We currently have 25 issues open on co-registration :scream_cat:, and a lot of notes and plans for the way forward dispersed everywhere. I initially tried to organize this in a project: https://github.com/orgs/GlacioHack/projects/3/views/2, but I think that the level of detail of the issues is too inconsistent for this, so I'm writing this summary instead!
Architecture: We already did a lot in https://github.com/GlacioHack/xdem/pull/158 and https://github.com/GlacioHack/xdem/pull/329 to re-organize the coding structure to be more consistent and robust, but some points are left: `fit` and `apply`, and call methods depending on Point-Point, Point-Raster or Raster-Raster support, which would also completely solve #134.

Features: We've been needing for a while to have consistent:
Most of them are not too far off, since we introduced a consistent structure for optimizers and binning in https://github.com/GlacioHack/xdem/pull/158. It makes weights almost directly supported (but we'll need to raise proper warnings for methods that currently ignore them, such as ICP), and will make plotting easier by consistently treating the type of method (binning, fit, or both) and the dimensionality of the variables (1D, 2D, ND), which can be re-used for `Tilt` or `NuthKaab` fits in the affine functions.

Tests: Some tests are still a bit old or slow; several related issues could be solved all at once:
Performance: :warning: We really need to think ahead for a structure that will allow memory-efficient computations using Dask:

- `rioxarray`, and https://github.com/GlacioHack/geoutils/issues/383 and #392, so not too much to think about!
- `Coreg.fit`: as a regression requires all samples at once, it cannot be combined from solutions of different chunks (except in a blockwise way, using `BlockwiseCoreg`). So it's all about the `subsample`, which is fairly easy to deal with (read subsample per chunk + return concatenated vector). There's also the computation of the derivatives needed, which is also straightforward (slope using overlapping chunks, bias_vars using normal chunks); see the thoughts in https://github.com/GlacioHack/xdem/issues/428#issuecomment-1707538808. Most other methods (`residuals`, `plot`) can be based on the same logic, as they use `subsample`.
- `Coreg.apply`, which might not be trivial. For bias corrections, the solution from the optimized function is applicable independently to every pixel given their `bias_vars` (coordinates, angle, etc.), so it is very easy to apply per chunk. However, for affine methods, applying a 3D affine matrix in 4x4 format lazily to independent chunks won't work directly... it would also require a definition of the rotation center of the matrix, and maybe other things... Any thoughts on how to address this @erikmannerfelt? Or maybe @friedrichknuth has some insights?

Bugs: Here there's a lot, but they might solve themselves (or become irrelevant) after the changes to architecture + tests:
#423, #422, #404, #326, #232, #193
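A minimal sketch of the per-chunk `subsample` idea from the Performance point above (read a subsample per chunk, then return one concatenated vector). Plain numpy arrays stand in for Dask chunks here, and all names are illustrative:

```python
import numpy as np

def subsample_chunk(chunk: np.ndarray, frac: float, rng: np.random.Generator) -> np.ndarray:
    """Draw a random subsample of the finite values in one chunk."""
    valid = chunk[np.isfinite(chunk)]
    n = max(1, int(frac * valid.size))
    return rng.choice(valid, size=n, replace=False)

rng = np.random.default_rng(0)
# Four stand-ins for Dask chunks of an elevation-difference array
chunks = [rng.normal(0, 1, 10_000) for _ in range(4)]
# Per-chunk subsample, then concatenation into the single vector the regression needs
sample = np.concatenate([subsample_chunk(c, 0.1, rng) for c in chunks])
```

With real Dask arrays, the same per-chunk function could be mapped over blocks and the results gathered before fitting.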
Idea of plan moving forward:

`apply`, this should be adaptable down the line... Any thoughts @adehecq @erikmannerfelt? :smile:
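As a small illustration of the point above that a bias-correction solution applies independently to every pixel given its `bias_vars`, and therefore maps cleanly onto chunks: a hypothetical polynomial correction applied chunk-by-chunk (the polynomial form and all names are illustrative, not xdem's API):

```python
import numpy as np

def apply_bias_chunk(dem_chunk: np.ndarray, bias_var_chunk: np.ndarray,
                     params: np.ndarray) -> np.ndarray:
    """Apply a fitted 1D polynomial bias correction to one chunk, per pixel."""
    return dem_chunk + np.polyval(params, bias_var_chunk)

# Fitted parameters from a (hypothetical) optimizer: correction = 2*var + 1
params = np.array([0.0, 2.0, 1.0])
# Two stand-ins for Dask chunks of a DEM and of one bias variable (e.g. angle)
chunks = [np.zeros((2, 2)), np.ones((2, 2))]
bias_vars = [np.full((2, 2), 0.5), np.full((2, 2), 1.0)]
corrected = [apply_bias_chunk(c, b, params) for c, b in zip(chunks, bias_vars)]
```

Since each output pixel depends only on the same pixel of the inputs, this is exactly the embarrassingly parallel case, unlike the 4x4 affine matrix question above.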