Closed: maurock closed this issue 1 year ago.
Hi @maurock, I think this was reported in #63 and fixed in #67! But I haven't cut a new release with those changes yet. If you install the library from source, it should fix the issue. Let me know if that works!
@maurock this should be fixed in 0.29.6. Is this the case?
@fwilliams - I have 0.29.6 installed via pip and it still has memory errors (maybe new ones?). If I create a mesh from the points/vertices returned by `make_mesh_watertight` and then try to do anything with that new mesh (using another library), I get memory errors (`corrupted size vs. prev_size` or `corrupted double-linked list`).
I also noticed that the output points are of type int64 - is this correct? This seemed strange to me, as it would reduce the precision of the original mesh.
Thanks for a great library - I've only recently started using it, but found it awesome.
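The int64 array is most likely the face list (integer indices into the vertex array), not a reduced-precision copy of the points. A quick sanity check, as a minimal sketch (`bunny.ply` is a placeholder; the calls follow the library's documented usage):

```python
import point_cloud_utils as pcu

v, f = pcu.load_mesh_vf("bunny.ply")  # v: float positions, f: int face indices
vw, fw = pcu.make_mesh_watertight(v, f, resolution=50_000)

# Vertices stay floating point; faces are integer indices, so an int64
# output is the face array rather than lower-precision vertex data.
print(vw.dtype, vw.shape)  # e.g. float64 (n, 3)
print(fw.dtype, fw.shape)  # e.g. int64   (m, 3)
```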
Ignore my comment. I was swapping the outputs (points and verts), and that was causing my issues. It works with the PyPI version as well as when built from source.
@gattia And the memory leak is gone?
If it is, I'll close this issue!
Yeah, it seems to have gone away. I'm not sure exactly why it was happening, but it seems to have been caused by me passing faces/points in the incorrect order to my next set of steps (related to pyvista / `pcu.orient_mesh_faces`).
Thanks for the quick replies.
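For anyone hitting the same symptom, a minimal sketch of the order fix with a pyvista hand-off like the one described (the file name and downstream step are assumptions; the flat face array with a leading vertex count is pyvista's standard input format):

```python
import numpy as np
import point_cloud_utils as pcu
import pyvista as pv

v, f = pcu.load_mesh_vf("mesh.ply")  # placeholder file name

# Correct unpacking order: vertices first, then faces.
vw, fw = pcu.make_mesh_watertight(v, f, resolution=50_000)

# pyvista expects a flat face array where each face is prefixed by its
# vertex count. Swapping vw and fw here would hand integer face indices
# to VTK as point coordinates, which can corrupt memory downstream.
faces_pv = np.hstack([np.full((fw.shape[0], 1), 3, dtype=fw.dtype), fw]).ravel()
mesh = pv.PolyData(vw, faces_pv)
```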
Hi @fwilliams, I have been exploring all the features of this great library and found a possible memory leak in `make_mesh_watertight`. I am converting many shapes from the ABC and ShapeNet datasets to watertight meshes, using a resolution of 50000. After about 200 meshes, the RAM usage is ~20 GB (Python 3.8, macOS M1). Please let me know if there is any additional information I could provide to help pinpoint the problem. I'll look more closely at the source code as soon as I have some time (currently it's rebuttal time). Thank you as always!
EDIT: My current workaround is a bash script that runs the Python code for a while, shuts down the interpreter, and then runs the code again. Maybe this will help people encountering the same issue.
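A minimal sketch of that restart-style workaround, driven from Python rather than bash (`convert_batch.py`, the dataset size, and the batch size are all hypothetical):

```python
import subprocess
import sys

TOTAL, BATCH = 10_000, 100  # assumed dataset and batch sizes

# Each batch runs in a fresh interpreter, so any memory held by the
# extension module is released when the child process exits.
for start in range(0, TOTAL, BATCH):
    subprocess.run(
        [sys.executable, "convert_batch.py", str(start), str(BATCH)],
        check=True,
    )
```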