yuefanhao / SuperPoint-SuperGlue-TensorRT

SuperPoint and SuperGlue with TensorRT. Deploy with C++.
Apache License 2.0

Memory usage keeps increasing during the inference process #36

Closed vickersmt closed 1 month ago

vickersmt commented 1 month ago

Thank you very much for your work. I tried to deploy this model on an NX with my own dataset. During the run, the memory usage on the NX kept increasing until it hit the upper limit and the program was killed. I also tried running the model with the bundled inference_image.cpp and changed the number of loops to 10000; memory kept growing during that run as well, from roughly 4 GB to 5.3 GB. Can you look into this? It may be that memory is not being released during the inference process.
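
A minimal sketch of the reproduction: run inference in a long loop and watch the process memory grow. Here `run_inference_once` is just a placeholder for one SuperPoint + SuperGlue pass, not the repo's actual API; in practice I looped the bundled inference_image.cpp demo 10000 times.

```cpp
#include <cstddef>

void run_inference_once() {
    // placeholder: one SuperPoint + SuperGlue inference pass on an image pair
}

int main() {
    constexpr std::size_t kIterations = 10000;
    for (std::size_t i = 0; i < kIterations; ++i) {
        run_inference_once();
        // With the leak present, resident memory climbs every iteration
        // (observed: roughly 4 GB rising to 5.3 GB) until the process is killed.
    }
    return 0;
}
```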

vickersmt commented 1 month ago

I have already resolved this issue. The problem lies in the SuperGlue::process_output function in src/superglue.cpp: there is an `auto scores = new float[(scores_map_h + 1) * (scores_map_w + 1)];` allocation. Just delete it or comment it out.
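
For illustration, a hypothetical sketch of that spot; only the quoted allocation comes from the repo, everything else is an assumption. The raw `new[]` has no matching `delete[]`, so each call leaks the buffer. If no scratch buffer is needed, removing the line is enough; if one is needed, an RAII container avoids the leak:

```cpp
#include <cstddef>
#include <vector>

// Hypothetical stand-in for SuperGlue::process_output (src/superglue.cpp).
void process_output_sketch(int scores_map_h, int scores_map_w) {
    // Reported leak: allocated with new[] but never freed, so every inference
    // call leaks (scores_map_h + 1) * (scores_map_w + 1) floats:
    // auto scores = new float[(scores_map_h + 1) * (scores_map_w + 1)];

    // If a scratch buffer is actually required, let RAII manage it so the
    // memory is released automatically when it goes out of scope:
    std::vector<float> scores(static_cast<std::size_t>(scores_map_h + 1) *
                              static_cast<std::size_t>(scores_map_w + 1));
    (void)scores;  // placeholder; real code would fill and consume the buffer
}
```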