Closed ka-petrov closed 6 years ago

I'm getting the following stack trace on an attempt to register 3D point clouds with about 40K points each. This happens with both `rigid_registration` and `affine_registration`. Should this algorithm be so memory-consuming? How can I estimate the memory usage in 3D from the number of points? (I have 16 GB of RAM, if it matters.)
The P matrix, calculated in the E-step, has M × N elements, where M and N are the sizes of the fixed and moving point clouds, respectively.
This means that you are allocating (40K)² ≈ 1.6 × 10⁹ floating-point numbers. In double precision that is about 12.8 GB for the P matrix alone, and the temporary arrays created while computing it push the peak usage even higher, far beyond 16 GB of RAM.
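To make the estimate concrete, here is a minimal sketch of the arithmetic, assuming NumPy's default float64 (8 bytes per value):

```python
import numpy as np

M = N = 40_000  # points in the fixed and moving clouds
itemsize = np.dtype(np.float64).itemsize  # 8 bytes per value

p_bytes = M * N * itemsize  # the dense M x N matrix P
print(f"P alone: {p_bytes / 1e9:.1f} GB")  # -> P alone: 12.8 GB
```

Since the cost grows with M × N, halving the number of points in each cloud cuts the memory to a quarter, which is why subsampling helps so much.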
OK, thanks
I have the same problem. What can I do? Has anyone solved it? @imaginary-unit
@hodaatef What I ended up doing is just using a random sub-sample of my point cloud, which fits in memory. That's the only solution I can think of.
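For anyone else hitting this, a minimal sketch of that random subsampling with NumPy (the 10,000-point budget is an arbitrary illustration, not a value from the library):

```python
import numpy as np

def random_subsample(points: np.ndarray, max_points: int = 10_000) -> np.ndarray:
    """Keep at most max_points rows of an (N, 3) array, chosen uniformly at random."""
    if len(points) <= max_points:
        return points
    idx = np.random.choice(len(points), size=max_points, replace=False)
    return points[idx]
```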
Random subsampling might be sufficient for you. Something better might be spatial subsampling: http://pointclouds.org/documentation/tutorials/voxel_grid.php#voxelgrid
That way you can make sure that the subsampled point cloud has roughly the same spatial distribution as the original.
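In Python, the same idea can be sketched with Open3D's voxel-grid filter (the file name and voxel size below are placeholders; choose the voxel size in the units of your data):

```python
import open3d as o3d

# "cloud.ply" and voxel_size are placeholders; here one point is kept
# per 5 cm voxel cell, preserving the cloud's spatial distribution.
pcd = o3d.io.read_point_cloud("cloud.ply")
down = pcd.voxel_down_sample(voxel_size=0.05)
print(f"{len(pcd.points)} -> {len(down.points)} points")
```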
@siavashk I did voxel downsampling with open3d, but the point cloud is still too large and the memory error still occurs. Any insights on how I should proceed?
Thanks in advance.
@sandwich25 do you have an estimate of how much your point cloud was downsampled? You can compute this as number_of_points_after_downsampling / number_of_points_before_downsampling. I would suggest more aggressive downsampling until the issue goes away.
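A sketch of that procedure, assuming Open3D and a hypothetical max_points budget (pick it so that max_points² × 8 bytes fits comfortably in your RAM):

```python
import open3d as o3d

def downsample_to_budget(pcd: o3d.geometry.PointCloud,
                         max_points: int = 10_000,
                         voxel_size: float = 0.01) -> o3d.geometry.PointCloud:
    """Coarsen the voxel grid until the cloud fits the point budget."""
    down = pcd
    while len(down.points) > max_points:
        down = pcd.voxel_down_sample(voxel_size)
        voxel_size *= 1.5  # bigger voxels -> fewer points
    ratio = len(down.points) / len(pcd.points)
    print(f"kept {len(down.points)} of {len(pcd.points)} points ({ratio:.1%})")
    return down
```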