MarineRoboticsGroup / NF-iSAM

The codebase of normalizing flows for incremental smoothing and mapping (NF-iSAM). The datasets used for performance evaluation are provided as well.
MIT License

High memory consumption #3

Open tipf opened 2 years ago

tipf commented 2 years ago

Hi Qiangqiang, I noticed another problem with my range-only SLAM example range_only_incremental.py: solving the 100-pose dataset requires about 12 GB of RAM, and solving a dataset with 3500 poses is not possible on a machine with 32 GB of RAM.

Is the high memory consumption expected, or could something be wrong with my script?

Best Regards Tim

doublestrong commented 2 years ago

Hi Tim,

You can try reducing posterior_sample_num to lower the memory consumption, although the resulting samples will look sparser. Figures will still be saved if you set show_plot to False.
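
Roughly the kind of change I mean, as a sketch (where exactly these parameters are set in range_only_incremental.py may differ):

```python
# Sketch: lower the number of posterior samples and skip interactive plotting.
# The variable names follow the discussion above; their exact location in
# range_only_incremental.py may differ.
posterior_sample_num = 500  # fewer samples -> lower memory, sparser-looking results
show_plot = False           # figures are still saved to disk, just not displayed
```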

I didn't see issues in your script, but 12 GB does sound like a lot for a 100-pose problem in my experience. High memory consumption is expected behavior for higher-dimensional problems since we are drawing samples from the (high-dimensional) joint posterior distribution. There could be more efficient implementation strategies to reduce the memory cost; however, we haven't applied them in the current code.

doublestrong commented 2 years ago

Oh, one thing that might be helpful: you can remove the sampling steps from incremental_inference if you don't want samples for every incremental update: https://github.com/tipf/NF-iSAM/blob/035122e7556197aa3c26cb53d0c0cf47818b2327/src/slam/FactorGraphSolver.py#L388-L392 This will save both runtime and memory and won't affect the results (the learned normalizing flows). You can call sample_posterior to draw samples when needed.
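
As a rough sketch of what I mean (the helper and call signatures below are placeholders, not the exact API of FactorGraphSolver):

```python
# Sketch: keep learning the flows at every step, but only draw posterior
# samples once at the end. add_step_factors and the exact signatures of
# incremental_inference / sample_posterior are placeholders for whatever
# the example script uses.
for n in range(num_steps):
    add_step_factors(solver, n)       # add this step's odometry/range factors (placeholder)
    solver.incremental_inference()    # with the sampling lines (L388-L392) removed

samples = solver.sample_posterior()   # draw samples only when they are actually needed
```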

I guess it is not necessary to run incremental_inference at every time step for your purpose. If you move the following lines into the if n % ... block, the incremental updates will be performed only every 10% of the dataset and at the last time step, which will reduce both the overall runtime and the memory consumption: https://github.com/tipf/NF-iSAM/blob/035122e7556197aa3c26cb53d0c0cf47818b2327/example/slam/MUSE/range_only_incremental.py#L167-L173 If you want the solution for the final time step as soon as possible, you can even remove the n % (NumTime/10) == 0 or condition.
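
A sketch of this batching idea (NumTime, n, solver, and add_step_factors are placeholders taken from the discussion above, not the script's exact names):

```python
# Sketch: run the incremental update only every 10% of the time steps and at
# the final step; drop the first condition to solve only at the end.
for n in range(NumTime):
    add_step_factors(solver, n)                       # always add the new factors
    if n % (NumTime // 10) == 0 or n == NumTime - 1:
        solver.incremental_inference()                # update only at these checkpoints
```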

tipf commented 2 years ago

Thanks for your hints, Qiangqiang! I managed to reduce the memory consumption to 5.6 GB, which is still a lot but seems more reasonable.

Right now, I'm not sure what causes the high memory consumption. Using 500 samples of 100 variables with 3 dimensions each, I would expect only about 1.2 MB (assuming 64-bit double values). I will check this further and report back if I find an answer...
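
For reference, the back-of-the-envelope number above:

```python
# 500 samples x 100 variables x 3 dimensions x 8 bytes per float64
expected_bytes = 500 * 100 * 3 * 8
print(expected_bytes / 1e6, "MB")  # -> 1.2 MB
```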

tipf commented 2 years ago

I ran a memory profiler, and about 5 GB of the memory consumption comes from the line where Graph.fit_tree_density_models is called: range_only_incremental.py#L172
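
For anyone who wants to reproduce this, one way to get such a line-by-line profile is the memory_profiler package (just a sketch; the wrapper function below is a placeholder for the script's main loop):

```python
# Sketch: line-by-line memory profiling with the memory_profiler package.
# Run with:  python -m memory_profiler range_only_incremental.py
from memory_profiler import profile

@profile
def run_example():
    # the incremental SLAM loop from range_only_incremental.py would go here,
    # including the Graph.fit_tree_density_models call around line 172
    ...

if __name__ == "__main__":
    run_example()
```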

Here is the full profiling result: Profile.txt

Right now, I don't have the time to go deeper, but I will come back to this in the future.