Closed severin-lemaignan closed 10 years ago
Is there a particular benefit to merging it in there, rather than releasing it from here?
Well, nothing major, but it would make this layer kind of a 'standard one' in the navigation stack, readily available for people trying to do 2D navigation with sonars/IR sensors.
:+1: But I would integrate it within obstacle_layer, mostly for completeness: if it listens for LaserScans and PointCloud2s, why not for Ranges? That is the behavior I would expect from it, and it's only missing because sensor_msgs/Range came much later, I suppose.
For example, TurtleBot2 publishes its bumper readings as a (somewhat arbitrary) PointCloud instead of as fixed-range Ranges, precisely so that they can be used within the navigation stack.
There's a binary build of the range_sensor_layer making its way through hydro right now.
The reason it is not integrated with obstacle_layer is that they treat data in very different ways. ObstacleLayer treats the data it receives as absolute truth, clearing all points between the robot and the reading and marking the end point as occupied. RangeSensorLayer instead builds a probability map, with fuzzier logic.
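To make the contrast concrete, here is a minimal sketch (hypothetical illustration, not the actual plugin code) of the two update styles on a 1-D strip of cells along a single sensor ray. The cost values, probabilities, and `bayes` helper are assumptions for illustration only:

```python
def obstacle_layer_update(costs, reading_cell):
    """Binary 'absolute truth' update, as obstacle_layer does:
    clear every cell between the sensor and the reading,
    mark the endpoint as a lethal obstacle."""
    for i in range(reading_cell):
        costs[i] = 0           # FREE_SPACE
    costs[reading_cell] = 254  # LETHAL_OBSTACLE
    return costs


def range_layer_update(probs, reading_cell, p_hit=0.75, p_miss=0.35):
    """Fuzzy probabilistic update, in the spirit of range_sensor_layer:
    cells before the reading become *less likely* occupied, the cell
    at the reading becomes *more likely*, via a simple Bayes rule.
    p_hit/p_miss are illustrative sensor-model parameters."""
    def bayes(prior, p):
        # posterior = p*prior / (p*prior + (1-p)*(1-prior))
        num = p * prior
        return num / (num + (1.0 - p) * (1.0 - prior))

    for i in range(reading_cell):
        probs[i] = bayes(probs[i], p_miss)   # evidence of free space
    probs[reading_cell] = bayes(probs[reading_cell], p_hit)
    return probs
```

The practical difference: a single noisy sonar reading flips the binary map between free and lethal, while the probabilistic map only shifts cell likelihoods, so repeated evidence is needed before a cell is treated as occupied.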
The range_sensor_layer is currently in shadow-fixed. Unless there are major objections, I think that having the binary fits the bill of making it available for everyone to use.
:+1:
+1 to keeping this separate from the core nav stack and just releasing binaries.
Fine for me. I close the issue then.
The RangeSensorLayer has been used and tested by at least 3 people, and seems to be useful. I would suggest 'mainlining' it in ros-planning/navigation. To be nicely integrated, I guess ros-planning/navigation#211 needs to be addressed.