HDFGroup / HDF.PInvoke

Raw HDF5 Power for .NET
http://www.hdfgroup.org/HDF5

Memory leak issue when trying to open using H5O.open #174

Closed · dularish closed this 3 years ago

dularish commented 3 years ago

Hello,

Our application has to read HDF5 files, and we are using HDF.PInvoke to do so. We are observing a memory leak with one of the methods:

```csharp
var hid = H5O.open(loc_id, name); // This line allocates memory for the process
// Doing something with hid
H5O.close(hid); // This line doesn't release the allocated memory
```

Is there any solution available or am I doing something wrong?

As far as I can tell from the documentation, an object opened with H5O.open should be closed with H5O.close, and I expected the allocated memory to be released at that point.

I find this issue highly problematic: to read a 100 MB HDF5 file, the process ends up with around 3.3 GB allocated, and almost 3.2 GB of that allocation happens when H5O.open is called.
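
For reference, a minimal sketch of the pattern described above, assuming the usual HDF.PInvoke signatures (H5F.open, H5O.open with a default link-access property list, H5O.close, H5F.close) and a hypothetical file name and object path:

```csharp
using HDF.PInvoke;

class LeakRepro
{
    static void Main()
    {
        // "test.h5" and "/group/dataset" are placeholders for illustration only.
        var fileId = H5F.open("test.h5", H5F.ACC_RDONLY);
        var hid = H5O.open(fileId, "/group/dataset"); // memory grows here per the report
        try
        {
            // ... read from the object through hid ...
        }
        finally
        {
            H5O.close(hid);   // reported not to release the allocation
            H5F.close(fileId);
        }
    }
}
```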

Apollo3zehn commented 3 years ago

Do you open groups, datasets or other resources with that file id? You need to close them too before closing the file id. Also, I am not sure whether H5O.close works; I use H5F.close instead. You can tell that you have properly closed everything if you are able to delete or rename the file. If the file is still locked, something is wrong.

There are also options to force close the file (https://github.com/HDFGroup/HDF.PInvoke/issues/169).
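
One way to verify that everything has been closed, before resorting to force-closing, is to ask the library how many identifiers are still open against the file. A small sketch, assuming the wrapper exposes H5F.get_obj_count and the H5F.OBJ_* constants (mirroring H5Fget_obj_count in the C API) and that hid_t maps to long as in the 1.10 bindings:

```csharp
using System;
using HDF.PInvoke;

static class HandleCheck
{
    // Reports how many identifiers are still open against fileId.
    // If anything beyond the file handle itself remains, some group,
    // dataset, attribute or datatype was not closed.
    public static void Report(long fileId)
    {
        var openIds      = H5F.get_obj_count(fileId, H5F.OBJ_ALL);
        var openDatasets = H5F.get_obj_count(fileId, H5F.OBJ_DATASET);
        var openGroups   = H5F.get_obj_count(fileId, H5F.OBJ_GROUP);

        Console.WriteLine($"open ids: {openIds}, datasets: {openDatasets}, groups: {openGroups}");
    }
}
```

Calling this just before H5F.close makes it easy to spot which kind of handle is being leaked.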

dularish commented 3 years ago

I have tried closing all opened groups with H5G.close and all opened datasets with H5D.close. Still, at the end of the whole reading process, even after calling H5F.close, my HDF5 file remains locked. I need to investigate further and will try to post an update here soon.

Thank you very much for your reply.

dularish commented 3 years ago

Hello,

As you suggested, the root cause of the memory allocation issue was that datasets and groups were not being closed properly with H5D.close and H5G.close respectively. Once all of them were closed correctly, there was no longer any significant memory leak.

Thank you very much for your hint. It was not a problem with the product, but incorrect usage on my side.
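
For anyone landing here with the same symptom, the fix amounts to pairing every open call with its matching close. A rough sketch of the corrected pattern, using hypothetical group and dataset names and assuming the usual H5G/H5D signatures with default property lists:

```csharp
using HDF.PInvoke;

class ProperCleanup
{
    static void Read(long fileId)
    {
        var groupId = H5G.open(fileId, "/measurements"); // hypothetical group name
        try
        {
            var datasetId = H5D.open(groupId, "signal");  // hypothetical dataset name
            try
            {
                // ... H5D.read(...) into a buffer ...
            }
            finally
            {
                H5D.close(datasetId); // close the dataset before its parent group
            }
        }
        finally
        {
            H5G.close(groupId); // close the group before closing the file
        }
    }
}
```

With every identifier closed this way, H5F.close actually releases the file, and it can be renamed or deleted afterwards, which matches the check suggested earlier in the thread.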