atsiflis opened this issue 3 years ago
Hello @atsiflis!
We're happy the library has been useful to you :-)
I agree that it may seem like a memory leak. As far as I know (and remember), the evaluation of a spline should not result in any allocation of memory (that is not also freed before returning), except for the data structure to hold the result. However, the Python interface makes reasoning about this a bit harder, as it uses a garbage collector which may run at (seemingly) random times.
Ideally, the memory usage of this should converge to some constant:
while True:
    y = bspline.eval(x)
I would have to take a more thorough look to determine whether this is actually a memory leak, or whether Python is just playing a trick on us.
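One quick way to rule out garbage-collector timing is to force a collection on every iteration and watch the process's memory. The sketch below assumes that bspline and x are the spline and evaluation point from the example above, and that the resource module is available (Linux/macOS); it is only a rough diagnostic, not part of splinter:

import gc
import resource

for i in range(1000):
    y = bspline.eval(x)   # the call under suspicion
    del y
    gc.collect()          # force a collection so delayed GC cannot explain the growth
    if i % 100 == 0:
        # ru_maxrss is the peak resident set size (kilobytes on Linux, bytes on macOS)
        print(i, resource.getrusage(resource.RUSAGE_SELF).ru_maxrss)

If the printed number keeps growing even with explicit collections, a leak on the C++ side becomes much more likely.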
Thank you for the report and the example, it will be very useful for debugging purposes!
I am running into the same problem. My dataset is large, so the script eventually occupies all of the memory on my computer and fails with an error. I also have a related question I do not know how to solve: I use Python, and if I evaluate the spline point by point in a for loop with splinter, most of the time is spent in the Python loop itself, so it ends up slower than scipy (which evaluates many points at once in vectorized form). How can I solve this problem? Thanks.
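As an illustration of the vectorization point only (the sketch below uses scipy, not splinter, and all names in it are made up for the example), evaluating all points in a single call is typically much faster than looping over them one by one in Python:

import time
import numpy as np
from scipy.interpolate import make_interp_spline

# Build a cubic spline through some sample data (scipy returns a BSpline object)
x = np.linspace(0, 10, 1000)
y = np.sin(x)
spline = make_interp_spline(x, y, k=3)

xs = np.linspace(0, 10, 100000)

t0 = time.perf_counter()
ys_loop = [spline(xi) for xi in xs]   # one Python-level call per point
t1 = time.perf_counter()
ys_vec = spline(xs)                   # one vectorized call for all points
t2 = time.perf_counter()

print("loop:", t1 - t0, "s  vectorized:", t2 - t1, "s")

Whether splinter's eval accepts a whole batch of points in one call would have to be checked against its Python interface.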
I can't solve this problem, but I think I can alleviate it.
The cinterface.h file contains a function SPLINTER_API void splinter_datatable_delete(splinter_obj_ptr datatable_ptr);
However, this function is not used anywhere in the Python bindings, so in datatable.py I added the following method:
def __del__(self):
    if self.__handle is not None:
        splinter._call(splinter._get_handle().splinter_datatable_delete, self.__handle)
        self.__handle = None
After that there is still a memory leak, but it grows much more slowly. Thanks.
Hi, @yj-Roy. Good catch! I have not investigated this problem, but it seems reasonable to implement the __del__ method for all the Python classes (not only DataTable) to prevent memory leaks; a rough sketch of what that could look like for another class follows below. It would be great if you could prepare and submit a pull request so that we can fix the problem for all users.
Bjarne
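For illustration, the same pattern could be applied to the other wrapper classes. The sketch below is for the BSpline wrapper and is not verified against the actual sources: it assumes that cinterface.h also exposes a splinter_bspline_delete function and that the wrapper stores its C handle in self.__handle, so the exact names would need to be checked before putting this into a pull request:

# In bspline.py (hypothetical sketch, names assumed)
def __del__(self):
    if self.__handle is not None:
        # Mirror the DataTable fix: release the C-side object when the
        # Python wrapper is garbage-collected.
        splinter._call(splinter._get_handle().splinter_bspline_delete, self.__handle)
        self.__handle = None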
Hi,
Thank you for making this very useful library available.
I am using it in a Python script that builds a spline once and then evaluates it multiple times. I have noticed that the memory allocated to the Python process increases by a few MB after each evaluation, and I am wondering whether this is due to a memory leak in splinter. To investigate this, I ran the following simplified example (adapted from the basic usage example)
from which I get
Given that the call to bspline.eval() increased the memory allocated to the process by 73728 bytes but the size of the reachable objects increased by only 444 bytes, is it right to conclude that there is a memory leak in splinter?
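The original example and its output are not reproduced above, but a measurement of this kind can be sketched roughly as follows; this assumes bspline and x are set up as in the basic usage example, and it is only an approximation of the approach described (process memory via the resource module on Linux/macOS, reachable objects via the garbage collector), not necessarily the code the reported numbers came from:

import gc
import sys
import resource

def reachable_bytes():
    # Rough total size of the objects the garbage collector tracks
    # (shallow sizes only, so this is an approximation)
    return sum(sys.getsizeof(o) for o in gc.get_objects())

def process_memory():
    # Peak resident set size of the process (kilobytes on Linux, bytes on macOS)
    return resource.getrusage(resource.RUSAGE_SELF).ru_maxrss

proc_before, obj_before = process_memory(), reachable_bytes()
y = bspline.eval(x)
proc_after, obj_after = process_memory(), reachable_bytes()

print("process memory grew by", proc_after - proc_before)
print("reachable objects grew by", obj_after - obj_before, "bytes")

If the process keeps growing while the reachable-object total stays flat, the extra memory is being held outside of what Python can see, which points towards the C++ side of the bindings.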