Hi @facontidavide ,
First, I would like to thank you for this amazing repository! The mapping module is really performant, and I have been testing it for 3D planning with a quadrotor in Gazebo SITL.
While working with the `insertPointCloud` implementation (here), I encountered a performance and stability issue related to `inf` values in `pc.points`. I would like to share the details and potential impact.
I traced the issue back to the `updateFreeCells(from)` call (here). Specifically, this section of code in `updateFreeCells()` seemed to cause the problem:
- Some points in the input point cloud (`pc.points`) had `inf` values.
- These values propagated into `new_point`, making all of its components (`x`, `y`, `z`) `NaN`.
- These invalid `new_point` values were then added to the `_miss_coords` vector.

This caused:
- **Extremely large `coord_end` values:** examples include values like `-2147483648`, which led to expensive computations during ray traversal.
- **Performance bottlenecks:** processing this loop took approximately 9 seconds for a single map update.
- **System instability:** before addressing this issue, my PC would occasionally hang or become unresponsive during this process, making further testing impossible.
- **Invalid map updates:** rays generated with such large values caused invalid updates (e.g., points outside the map bounds), compromising map integrity.
**Suggested fix:** I made a small change to the `insertPointCloud` method in `bonxai_map/include/bonxai_map/probabilistic_map.hpp` to handle this edge case:
```cpp
if (squared_norm >= max_range_sqr) {
  // Clamp the ray endpoint to max_range along the ray direction.
  const Vector3D new_point = from + ((vect / std::sqrt(squared_norm)) * max_range);
  // Skip endpoints whose components became inf/NaN.
  if (std::isfinite(new_point.x()) && std::isfinite(new_point.y()) && std::isfinite(new_point.z())) {
    addMissPoint(new_point);
  }
} else {
  addHitPoint(to);
}
```
This modification ensures that:
- Only finite `new_point` values are processed.
- Points whose `vect` becomes `inf` are ignored.
This small fix significantly reduced processing time and completely stabilized my PC, preventing hangs or crashes.
I would love to hear your thoughts or suggestions on this. If you think there is a better approach or have any feedback, I would greatly appreciate it. I would be happy to create a pull request for this.