YangSiri opened this issue 1 year ago
It just gives a weight to the different planes. We want planes with a large eigenvalue (\lambda_3) to have a smaller weight. It does not matter if you set "coe" to "1"; it will still work.
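As an illustration only (a minimal sketch, not the repository's actual "coe" code; the `plane_weight` helper, the covariance input, and the 1/(1 + \lambda_3) form are all assumptions), a weighting of this kind could look like:

```cpp
#include <Eigen/Dense>

// Hypothetical sketch -- not the repository's actual "coe" weighting.
// Weight a plane less as its out-of-plane eigenvalue (called lambda_3 in the
// discussion above) grows, i.e. thick or noisy planes contribute less.
double plane_weight(const Eigen::Matrix3d& cov, double coe = 1.0)
{
  Eigen::SelfAdjointEigenSolver<Eigen::Matrix3d> es(cov);
  // eigenvalues() are returned in ascending order; the smallest one measures
  // the spread along the plane normal (the "thickness" of the plane).
  const double lambda3 = es.eigenvalues()(0);
  // Larger lambda_3 -> smaller weight. "coe" only rescales the whole term,
  // which is consistent with setting it to 1 and everything still working.
  return coe / (1.0 + lambda3);  // the 1/(1 + lambda_3) form is an assumption
}
```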
Thanks, @Zale-Liu
39300.2    3184.08     585.026    47.096    100.483    259.047
 3184.08  20649.1      204.423   -38.4654   157.158     -3.93334
  585.026    204.423  5030.36    -68.7984   140.476    -68.8715
   47.096    -38.4654   -68.7984    7.60952   -6.97505    2.29398
  100.483    157.158    140.476    -6.97505   32.1939     2.27241
  259.047     -3.93334  -68.8715    2.29398    2.27241   443.363
When I performed the optimization, this was part of the Hessian matrix produced by the code above. The values in the rotation block of one pose are much larger than those in the translation block. Is this normal? I also wonder how you set the information matrix between poses in your hierarchical BA work. Could you please give some examples?
Could this be a sign of an unsatisfactory BA result?
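For what it's worth, one way to look at the imbalance in the pasted block is to compare the two 3x3 diagonal sub-blocks; this is only a sketch, assuming the 6x6 block is ordered [rotation; translation] for a single pose (the `inspect_pose_hessian` helper is made up for illustration, not part of the repository):

```cpp
#include <Eigen/Dense>
#include <iostream>

// Sketch only: assumes the 6x6 per-pose block is ordered [rotation; translation],
// matching the layout of the matrix pasted above.
void inspect_pose_hessian(const Eigen::Matrix<double, 6, 6>& H)
{
  const Eigen::Matrix3d H_rr = H.topLeftCorner<3, 3>();      // rotation-rotation
  const Eigen::Matrix3d H_tt = H.bottomRightCorner<3, 3>();  // translation-translation

  // Rotation entries are "per radian" and translation entries "per metre",
  // and the point Jacobian w.r.t. rotation grows with point range, so a
  // rotation block that is much larger is not by itself a sign of a bug.
  std::cout << "rotation/translation information ratio: "
            << H_rr.trace() / H_tt.trace() << std::endl;
}
```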
https://github.com/hku-mars/BALM/blob/d91afa959f8dfad8d6e354013b9bd9048f41e597/src/benchmark/bavoxel.hpp#L39
Hi, @Zale-Liu
May I ask what the layer in eigen_value_array means? Is it the layer of the octree or the layer of the hierarchical BA?
What is the meaning of the code above?
Looking forward to your help ;)
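Purely as an illustration of one possible reading (assuming "layer" means the octree recursion depth; the `VoxelNode`, `is_plane`, and `recut` names and logic below are made up and are not the linked bavoxel.hpp code), a per-layer eigenvalue threshold is commonly used like this:

```cpp
#include <Eigen/Dense>
#include <vector>

// Illustrative assumption only -- NOT the code behind the link above.
// Common pattern: one planarity threshold per octree recursion depth ("layer"),
// so deeper (smaller) voxels are tested against their own threshold.
struct VoxelNode
{
  std::vector<Eigen::Vector3d> points;
  std::vector<VoxelNode> children;  // up to 8 octants after a cut
};

bool is_plane(const std::vector<Eigen::Vector3d>& pts, double eig_thresh)
{
  if (pts.size() < 5) return false;
  Eigen::Vector3d mean = Eigen::Vector3d::Zero();
  for (const auto& p : pts) mean += p;
  mean /= static_cast<double>(pts.size());

  Eigen::Matrix3d cov = Eigen::Matrix3d::Zero();
  for (const auto& p : pts) cov += (p - mean) * (p - mean).transpose();
  cov /= static_cast<double>(pts.size());

  Eigen::SelfAdjointEigenSolver<Eigen::Matrix3d> es(cov);
  // A small out-of-plane eigenvalue relative to the layer's threshold
  // means the voxel's points are accepted as a single plane feature.
  return es.eigenvalues()(0) < eig_thresh;
}

void recut(VoxelNode& node, int layer, const std::vector<double>& eigen_value_array)
{
  if (layer >= static_cast<int>(eigen_value_array.size())) return;  // max depth
  if (is_plane(node.points, eigen_value_array[layer])) return;      // keep as plane
  // Otherwise split the voxel into octants (subdivision omitted) and recurse
  // one layer deeper, where a different threshold applies.
  for (auto& child : node.children)
    recut(child, layer + 1, eigen_value_array);
}
```

Under this pattern, entry i of eigen_value_array would be the planarity threshold used at octree depth i; whether the repository means this or the hierarchical-BA layer is exactly the question asked above.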