wannesm / dtaidistance

Time series distances: Dynamic Time Warping (fast DTW implementation in C)

memory double free or corruption (!prev) #172

Closed HuojianPao closed 2 years ago

HuojianPao commented 2 years ago

When I use the package in PySpark, I always get a memory corruption error.

For some reason I cannot take a screenshot; the relevant message is shown below:

```
Error in 'bin/python': double free or corruption (!prev): [0x00 some numbers]
==== Backtrace: ====
/lib64/libc.so.6(+0x7c619) [0x00 some numbers]
/...python paths..../dtaidistance/dtw_cc.cpython-37m-x86_64-linux-gnu.so(warping_path_ndim+0xa0) [0x00 some numbers]
/...python paths..../dtaidistance/dtw_cc.cpython-37m-x86_64-linux-gnu.so(warping_path+0x1a) [0x00 some numbers]
/...python paths..../dtaidistance/dtw_cc.cpython-37m-x86_64-linux-gnu.so(+0x2fc1e) [0x00 some numbers]
```

I've tried to fix it but failed. I really need help, thanks.

wannesm commented 2 years ago

Can you also provide the inputs to warping_path_ndim? With the current information it is difficult to analyse whether there is a bug.

HuojianPao commented 2 years ago

OK, I found the cause.

I used numpy.double to create the array, like np.array([], dtype=np.double). After some calculations, the values became extremely small, e.g. 0.00000e+000, 1.213312312421e-198, 1.2131321321231e-219, and so on.

Python seems to handle these values fine, but when they are passed into the function, it crashes.

I changed the dtype to numpy.float32, and then it works.
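For reference, a minimal NumPy-only sketch (not dtaidistance-specific) showing why the float32 workaround can change the inputs: float64 magnitudes like 1e-198 are far below the float32 range, so casting underflows them to exactly zero. The same effect can be achieved while staying in float64 by flushing values too small to represent in float32. The array values below are illustrative, modeled on the ones reported above.

```python
import numpy as np

# Values similar to those reported: extremely small float64 magnitudes,
# far below float32's smallest positive normal (~1.18e-38).
a = np.array([0.0, 1.213312312421e-198, 1.2131321321231e-219], dtype=np.double)

# Casting to float32 underflows these magnitudes to 0.0.
b = a.astype(np.float32)
print(b)  # all zeros

# Alternative: stay in float64 but flush values that float32 could not
# represent, which keeps full precision for the remaining values.
f32_tiny = np.finfo(np.float32).tiny  # smallest positive normal float32
cleaned = np.where(np.abs(a) < f32_tiny, 0.0, a)
print(cleaned)  # all zeros for this input
```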

wannesm commented 2 years ago

Thanks for looking into this. Such values often indicate that you are using an uninitialised array, which could also trigger a memory error, so I recommend checking for that as well.
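To illustrate the uninitialised-array point: np.empty allocates a buffer without writing to it, so reading it yields whatever bytes happened to be in memory, which reinterpreted as float64 often look exactly like the tiny "garbage" values reported above. A hedged sketch of the difference, and of explicitly initialised alternatives:

```python
import numpy as np

# np.empty does NOT initialise its buffer; the contents are unspecified
# and may appear as arbitrary tiny or huge floats (e.g. 1.2e-198).
garbage = np.empty(5, dtype=np.double)

# Safe alternatives: initialise the array explicitly.
zeros = np.zeros(5, dtype=np.double)          # all elements 0.0
filled = np.full(5, np.nan, dtype=np.double)  # all elements NaN (easy to spot)

print(zeros)
print(filled)
```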