Closed — flashcp closed this issue 1 year ago
Hi, I ran your script and also monitored the memory consumption, but the memory usage was constant in my case.
Are you using the latest version of pytlsd? We fixed a memory leak a few weeks ago, so upgrading might solve your problem.
Thanks, you're right. My pytlsd was installed via pip, and its version was 0.0.2. After seeing your answer, I downloaded the pytlsd repository and installed it locally; the pytlsd version was upgraded to 0.0.4, and the memory leak no longer occurs. Now GlueStick can perform batch inference normally. Thanks again.
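For anyone hitting the same problem, a quick way to check which pytlsd version pip actually resolved is the standard-library `importlib.metadata` API. This is a small helper sketch; only the distribution name `pytlsd` and the version numbers come from this thread:

```python
from importlib.metadata import PackageNotFoundError, version


def installed_version(dist_name):
    """Return the installed version string of a distribution, or None if absent."""
    try:
        return version(dist_name)
    except PackageNotFoundError:
        return None


# Example: warn if a pre-fix release is still installed.
# (0.0.2 leaked; the fix landed by 0.0.4, per this thread.)
# if installed_version("pytlsd") == "0.0.2":
#     print("pytlsd 0.0.2 has a known memory leak; upgrade to >= 0.0.4")
```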
https://github.com/cvg/GlueStick/blob/40d71d5f4adc7f4fccae4cd7675a49daee3e873e/gluestick/models/wireframe.py#L119
Hello! Thanks for this impressive work. I would like to use it in my own code to run batch inference on images, but I ran into an issue: in GlueStick/gluestick/models/wireframe.py, line 119, there seems to be a memory leak in the lsd method.
To verify this, I ran a simple experiment: when the lsd method is called inside a for loop, memory usage keeps increasing, as shown in the figure. How can this be fixed, or is there an alternative to lsd that I could use?
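The experiment above can be sketched as a minimal leak probe that records the process's peak RSS after each call. `detect` is a placeholder for the call under test; the commented `pytlsd.lsd` usage is an assumption about its API (grayscale float64 input), not something confirmed in this thread:

```python
import resource  # POSIX-only; exposes per-process memory statistics


def probe_leak(detect, iterations=100):
    """Call `detect` repeatedly and record peak RSS after each call.

    ru_maxrss is reported in KiB on Linux. Because it is a peak value it
    never decreases; a curve that keeps climbing across iterations
    suggests memory is leaking inside `detect`.
    """
    samples = []
    for _ in range(iterations):
        detect()
        samples.append(resource.getrusage(resource.RUSAGE_SELF).ru_maxrss)
    return samples


# Hypothetical usage (assumes pytlsd.lsd accepts a float64 grayscale array):
# import cv2, numpy as np, pytlsd
# gray = cv2.imread("img.png", cv2.IMREAD_GRAYSCALE).astype(np.float64)
# rss = probe_leak(lambda: pytlsd.lsd(gray), iterations=200)
```

Plotting the returned samples reproduces the kind of curve described above: flat for a leak-free call, steadily rising for a leaky one.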
Looking forward to your reply!