naurril / SUSTechPOINTS

3D Point Cloud Annotation Platform for Autonomous Driving
GNU General Public License v3.0

Out-of-memory error when playing a longer dataset (>400 frames) #110

Open wittmeis opened 1 year ago

wittmeis commented 1 year ago

When I play a larger dataset, the JS heap grows to its ~3 GB limit and the browser throws an out-of-memory error.

Any idea how to fix this? From the heap snapshot it looks to me as if the old worlds are not properly deleted.
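For reference, a quick way to watch the growth during playback without taking full snapshots (Chrome only, since performance.memory is non-standard; the helper name here is just something I made up for the console):

```js
// Minimal heap-growth probe. Run startHeapLog() in the DevTools console
// before hitting "play" and watch the used size climb frame by frame.
function startHeapLog(intervalMs = 2000) {
  return setInterval(() => {
    const m = performance.memory; // non-standard, Chrome/Chromium only
    console.log(
      `used ${(m.usedJSHeapSize / 1e6).toFixed(0)} MB / ` +
      `limit ${(m.jsHeapSizeLimit / 1e6).toFixed(0)} MB`
    );
  }, intervalMs);
}
```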

wittmeis commented 1 year ago

I just had a look at the heap snapshots. After calling world.deleteAll(), Lidar.pcd still exists. This is definitely one of the reasons for the increasing heap size.

I simply set Lidar.pcd = null in Lidar.remove_all_points(). I am not sure whether this is the correct fix, but it reduced the heap size quite a bit.

However, the webglGroup in the world object also seems to be an issue. Hence, I set that to null in world.deleteAll() as well.
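Roughly, the two changes look like this. The class structure below is a sketch reconstructed from the names in this thread, not the actual source; note that since the rendering goes through three.js, nulling the group alone does not free the GPU-side buffers, so I also call dispose() while walking it:

```js
// Sketch only: illustrates the two fixes described above, assuming the
// webglGroup has already been removed from the three.js scene elsewhere.
class Lidar {
  remove_all_points() {
    // Dropping the reference lets the parsed PCD (and its typed arrays)
    // be garbage-collected once nothing else points at it.
    this.pcd = null;
  }
}

class World {
  deleteAll() {
    this.lidar.remove_all_points();
    if (this.webglGroup) {
      // three.js keeps geometry/material buffers alive on the GPU until
      // dispose() is called explicitly.
      this.webglGroup.traverse((obj) => {
        if (obj.geometry) obj.geometry.dispose();
        if (obj.material) {
          (Array.isArray(obj.material) ? obj.material : [obj.material])
            .forEach((m) => m.dispose());
        }
      });
      this.webglGroup = null;
    }
  }
}
```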

Then the heap snapshot looks like this after deleting all worlds by calling editor.data.worldList.forEach((w) => w.deleteAll()):

[screenshot: heap snapshot after deleting all worlds]

It seems that there are further references in the annotation object that prevent garbage collection?
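One way to check this without diffing snapshots, assuming a reasonably recent Chromium: register the heavy payloads with a FinalizationRegistry and see whether the cleanup callbacks ever fire. The property path w.lidar.pcd below is a guess based on the names in this thread:

```js
// Logs when a point cloud is actually reclaimed. If nothing prints after
// deleteAll() plus a manual GC (DevTools > Memory > "collect garbage"),
// something is still retaining the data.
const reaped = new FinalizationRegistry((tag) => console.log('collected:', tag));
editor.data.worldList.forEach((w, i) => {
  if (w.lidar && w.lidar.pcd) reaped.register(w.lidar.pcd, `pcd of world ${i}`);
});
editor.data.worldList.forEach((w) => w.deleteAll());
```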

naurril commented 1 year ago

The main problem could be here; try changing this line to return distant. Or try the fusion branch: I fixed some memory-leak bugs there but haven't had time to merge them into the other branches yet.
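For illustration, the kind of bug being pointed at might look like this; the names below are invented, not the actual source. The pattern is a distance helper that returns the object instead of the scalar, so any caller that stores the result pins the whole world (point cloud included) in memory:

```js
// Hypothetical sketch of the suspected leak pattern, not the real line.
function distanceTo(world, refFrame) {
  const distant = Math.abs(world.frameIndex - refFrame);
  return world;     // leak: a cached "distance" is really the whole world
  // return distant; // the suggested fix: return only the number
}
```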

wittmeis commented 1 year ago

Thanks, I tried the proposed solution with return distant, but unfortunately the leaks are still there.

I will have a look at the fusion branch as well though.

nnop commented 1 month ago

Have you solved the problem with the fusion branch?