introlab / rtabmap

RTAB-Map library and standalone application
https://introlab.github.io/rtabmap

Reduce the used memory for big map #1217

Open hellovuong opened 7 months ago

hellovuong commented 7 months ago

Hello @matlabbe, we created a pretty good map of a large area (multi-session mapping: 5 sessions). It ended up with more than 4 million words in the vocabulary, which makes rtabmap crash within minutes every time it runs in localization mode, or whenever I call the backup service explicitly. I am aware that you already opened issue #1201; however, I am wondering if there is any intermediate step to reduce the number of words. Some parameters that may be related for you to check (a minimal usage sketch follows the list):

Kp/MaxFeatures 1500
Vis/MaxFeatures 1000
FeatureType: GFTT/BRIEF
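
For illustration only, here is a minimal sketch of how such parameters could be passed to the rtabmap C++ API. The parameter names and values come from the list above; the surrounding API usage, the numeric detector value, and the database path are assumptions, not the poster's actual setup:

```cpp
#include <rtabmap/core/Rtabmap.h>
#include <rtabmap/core/Parameters.h>

int main()
{
    // ParametersMap is a std::map<std::string, std::string> of "Group/Name" -> value.
    rtabmap::ParametersMap params;
    params.insert({"Kp/MaxFeatures",  "1500"});         // words extracted per image for the vocabulary
    params.insert({"Vis/MaxFeatures", "1000"});         // features used for visual registration
    params.insert({"Kp/DetectorStrategy", "6"});        // 6 = GFTT/BRIEF (numeric value assumed, check Parameters.h)
    params.insert({"Mem/IncrementalMemory", "false"});  // run in localization mode

    rtabmap::Rtabmap rtabmap;
    rtabmap.init(params, "multi_session_map.db");       // hypothetical path to the multi-session database
    // ... feed sensor data / localize ...
    rtabmap.close();
    return 0;
}
```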

Let me know if I can provide more information to help you with support.

Thank you for your contribution; I hope to receive your reply soon.

alexk1976 commented 7 months ago

Also waiting for this critical feature. We cannot load the full vocabulary because of the huge amount of memory it needs to allocate, and we don't manage to get the same accuracy when activating WM/LTM.

hellovuong commented 7 months ago

@alexk1976 You can try setting Kp/NNStrategy to 0 or 2. With those strategies the binary descriptors don't need to be uncompressed to float to build the vocabulary, which saves about 50% of memory usage, at the cost of reduced nearest-neighbor search performance of course. However, this feature is still needed.
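
A minimal sketch of that tweak, assuming the same ParametersMap-based setup as above. The numeric meanings of Kp/NNStrategy are assumptions to verify in rtabmap/core/Parameters.h:

```cpp
#include <rtabmap/core/Parameters.h>

// Memory-saving tweak described above. Assumed values: 1 (default) = FLANN kd-tree,
// which requires binary descriptors to be converted to float; 0 = FLANN linear search
// and 2 = FLANN LSH, which index binary BRIEF descriptors directly.
void useBinaryVocabularyIndex(rtabmap::ParametersMap & params)
{
    params.insert({"Kp/NNStrategy", "2"}); // or "0"; slower NN search, roughly half the vocabulary memory
}
```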

matlabbe commented 7 months ago

Here are some options:

hellovuong commented 7 months ago

Thank you!

alexk1976 commented 7 months ago

I don't think that is a real solution. With a somewhat bigger area like ours, it is impossible to load the dictionary even when we set MaxFeatures=500. If we don't use the FLANN kd-tree, we have performance issues; a fixed dictionary gives worse accuracy. We need a way to load the full graph but only part of the dictionary.

matlabbe commented 7 months ago

@alexk1976 Agreed, it is more or less covered by that other issue: https://github.com/introlab/rtabmap/issues/1201.