ZikangYuan / liw_oam

[IROS 2023] A LiDAR-inertial-wheel odometry and mapping system based on BA framework.
GNU General Public License v2.0

Using an optical flow sensor instead of a wheel encoder? #18

Open Userpc1010 opened 1 month ago

Userpc1010 commented 1 month ago

I noticed that only the wheel encoder's linear speed along a single axis is used:

https://github.com/ZikangYuan/liw_oam/blob/85fc480c66453228e064c112dd405c07b975d200/src/lioOptimization.cpp#L1323

If I use an optical flow sensor on a rover for odometry (similar to the ADNS3080 mouse-sensor setup shown in this video), I can obtain not only forward motion but displacement along both axes of the plane, plus vertical speed from a variometer (barometer), which gives a full three-dimensional velocity vector. If I add these components to the wheel velocity vector used in the calculation:

https://github.com/ZikangYuan/liw_oam/blob/85fc480c66453228e064c112dd405c07b975d200/src/lioOptimization.cpp#L1324

could that improve the LiDAR localization estimate? And if this three-dimensional velocity vector turns out to be accurate enough, would it be possible to use it for dead reckoning during the moments when the LiDAR sees no usable landmarks, e.g. in a planar (degenerate) scene? A rough sketch of what I mean is below.
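
Here is a minimal sketch of the idea, assuming hypothetical readings `flow_vx`, `flow_vy` (optical flow) and `baro_vz` (variometer); the actual variable names and structures in `lioOptimization.cpp` around L1323-L1324 may of course differ:

```cpp
#include <Eigen/Dense>

// Assemble a full body-frame velocity vector from the optical-flow and
// barometer measurements, instead of a single-axis wheel speed.
// flow_vx, flow_vy : planar velocities from the optical-flow sensor [m/s]
// baro_vz          : vertical speed from the variometer/barometer  [m/s]
Eigen::Vector3d bodyVelocityFromFlow(double flow_vx, double flow_vy, double baro_vz)
{
    return Eigen::Vector3d(flow_vx, flow_vy, baro_vz);
}

// Rough dead-reckoning step for when the LiDAR provides no constraints:
// rotate the body-frame velocity into the world frame using the current
// attitude R_world_body (e.g. from the IMU) and integrate over dt.
Eigen::Vector3d propagatePosition(const Eigen::Vector3d &p_world,
                                  const Eigen::Matrix3d &R_world_body,
                                  const Eigen::Vector3d &v_body,
                                  double dt)
{
    return p_world + R_world_body * v_body * dt;
}
```

This is just to illustrate the question, not a proposed implementation; the real change would have to plug the 3D velocity into the wheel-velocity residual of the optimization.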