Closed Cova8bitdots closed 1 year ago
Hi, I think the deleteEntity function is what you need; see the corresponding use case.
Regards, Paul
@prascle Thank you for your quick response. The problem has been solved!!
I don't have to allocate more than 70 GB of memory any more :)
I am working on a program that determines the bounding box a point cloud would have when combined from several files (each dataset is split into dozens of them), without actually merging them. Although I read the files sequentially in a for loop and perform the calculation, memory usage grows each time a file is read. I have tried calling del on the variables loaded with CloudComPy.loadPointCloud() and also calling gc.collect(), but the problem persists. Is there an unload API to release the memory of files loaded through the API?
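For anyone landing on this thread later, the per-file loop can be sketched as below. The accumulation logic is plain Python and runnable; the CloudComPy calls are shown only as comments, since loadPointCloud and deleteEntity come from the discussion above, while the bounding-box accessor names (getOwnBB, minCorner, maxCorner) are assumptions about the API and may differ in your CloudComPy version.

```python
# Sketch: accumulate a combined bounding box file by file, releasing
# each cloud after use so memory does not grow across iterations.

def merge_bbox(current, min_corner, max_corner):
    """Fold one file's (min, max) corners into the running bounding box.

    current is None or a pair (min_xyz, max_xyz); corners are 3-tuples.
    """
    if current is None:
        return (tuple(min_corner), tuple(max_corner))
    cur_min, cur_max = current
    new_min = tuple(min(a, b) for a, b in zip(cur_min, min_corner))
    new_max = tuple(max(a, b) for a, b in zip(cur_max, max_corner))
    return (new_min, new_max)

# Hypothetical CloudComPy loop (accessor names are assumptions):
#
# import cloudComPy as cc
# bbox = None
# for path in files:
#     cloud = cc.loadPointCloud(path)
#     bb = cloud.getOwnBB()                              # assumed accessor
#     bbox = merge_bbox(bbox, bb.minCorner(), bb.maxCorner())
#     cc.deleteEntity(cloud)   # release the C++ object, per the reply above
#     cloud = None

# Demo with plain tuples standing in for per-file corners:
bbox = None
for mn, mx in [((0, 0, 0), (1, 1, 1)), ((-2, 0.5, 0), (0.5, 3, 2))]:
    bbox = merge_bbox(bbox, mn, mx)
print(bbox)  # ((-2, 0, 0), (1, 3, 2))
```

The key point, as noted in the accepted answer, is that del/gc.collect() only release the Python wrapper; the underlying C++ entity needs an explicit deleteEntity call.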