Closed DamienGilliard closed 2 months ago
Proposed strategy for additive:
thanks for the proposal @DamienGilliard ! 🚀
my questions/doubts:
Thanks for the reply!
I share your doubts about the recursive registration; I will soon know whether it works sufficiently well (in terms of computation time and accuracy).
For the second point, the reason is that planar surfaces in the scan can be the result of different pieces combined. This means that plane segmentation alone is not sufficient to get the point clouds of each piece. Do you agree? I might be missing something ;)
> For the second point, the reason is that planar surfaces in the scan can be the result of different pieces combined. This means that plane segmentation is not sufficient to get the point clouds of each piece. Do you agree
Yes, you are right indeed. I have no solution for that though..
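To make the issue concrete, here is a toy illustration (made-up data, not project code): when two pieces sit flush, their faces can be coplanar, so a single plane fit collects points from both pieces into one segment.

```python
import numpy as np

# top faces of two distinct pieces, flush at z = 0 (toy data)
piece_a = np.array([[x, 0.0, 0.0] for x in np.linspace(0.0, 1.0, 5)])
piece_b = np.array([[x, 0.0, 0.0] for x in np.linspace(2.0, 3.0, 5)])
scan = np.vstack([piece_a, piece_b])

# inliers of the plane z = 0: every point of BOTH pieces qualifies,
# so plane segmentation alone yields one segment for two pieces
inliers = np.abs(scan[:, 2]) < 1e-3
```

This is why the 3D model is needed as extra information to split such a segment back into per-piece point clouds.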
This branch has now been cleaned and contains the implementation of the initial segmentation. Branch(es) of this branch will contain the further implementation of the "semantic" segmentation.
On rougher point clouds (i.e. real-world point clouds), the plane segmentation leads to unsatisfactory results and will require additional steps to reach good results: re-introducing an association distance threshold, so that we associate only the closest points of the segments instead of all points.
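The thresholded association could look something like the following sketch (hypothetical names, brute-force distances for clarity; in practice a KD-tree would be used, and this is not the actual diffCheck implementation):

```python
import numpy as np

def associate_points(scan_points, segment_points, max_dist=0.05):
    """Associate each scan point with its nearest segment point,
    keeping only associations closer than ``max_dist``.
    Returns the indices of the scan points that pass the threshold."""
    # pairwise distances, shape (n_scan, n_segment)
    d = np.linalg.norm(
        scan_points[:, None, :] - segment_points[None, :, :], axis=2
    )
    nearest = d.min(axis=1)  # distance to the closest segment point
    return np.where(nearest <= max_dist)[0]
```

With a threshold like this, far-away scan points are simply never associated, instead of being pulled into the nearest segment regardless of distance.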
@9and3 add:
I just merged main in 58daa2a (this branch was a bit outdated for doing the wraps). I think the easiest would be not to branch out for the wraps. If that's ok for you, I will do it here directly and wait for your PR in #33 .
Hello @DamienGilliard ,
I refactored/modified a bit the segmentation backend (mainly small fixes and decoupled some things from the segmentation function). I added two components: a) the normal estimator + b) the normal segmentation.
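For reference, the classic way such a normal estimator works is PCA on each point's neighborhood: the eigenvector of the covariance matrix with the smallest eigenvalue approximates the surface normal. A minimal sketch (not the actual backend code; function name is made up):

```python
import numpy as np

def estimate_normal(neighbors):
    """Estimate the surface normal at a point from its k nearest
    neighbors via PCA: the direction of least variance of the
    neighborhood is taken as the normal."""
    centered = neighbors - neighbors.mean(axis=0)
    cov = centered.T @ centered
    # eigh returns eigenvalues in ascending order;
    # the first eigenvector spans the least-variance direction
    eigvals, eigvecs = np.linalg.eigh(cov)
    return eigvecs[:, 0]
```

The normal segmentation can then group points whose estimated normals agree within an angular tolerance.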
Pop a message when the scan segmentation is ready for review on the other PR! 👍
And thanks!!
Hello @DamienGilliard ! As discussed, here's the 🥔 .. You can find an example file with internalized data I was testing here.
The metadata and wrap are there; just some work on the C++ side is needed.
As a note:
Let me know if you can make the associator a bit more tolerant to worse normals and less densely populated point clouds, like the one internalized in the gh file.. thanks 🦖
@9and3 I'm on it !
[WIP]:
Next Step:
- find a way to remove unwanted points that are now associated as a result of the new point association technique:
this could perhaps be achieved by euclidean clustering/filtering
Let me know when we can merge (I left a note for the `DFMesh.hh` header to be integrated with the `isPointOnFace()` method ;) )
This PR concerns the segmentation of the scan. We use the 3D model to identify the pieces we need to segment. It introduces both the C++ backend and components for: