Closed · SteveMacenski closed this issue 2 years ago
I would actually suggest not doing this - for two reasons:
My thought process here is that many robots being built today are based on non-safety-rated lidars (Ouster, RPLIDAR, Sense Photonics, etc.) or lasers without these kinds of built-in capabilities, which I think are pretty limited to SICKs. I don't think Hokuyos offer this internally, but I could be mistaken.
So how can we get "most" of the benefits of a safety system to prevent avoidable collisions, in a way that many people can use? I'm not saying it's certified or certifiable, but it's better than nothing or a bespoke solution. Certainly it won't work as intended if you overload your CPU, under-provision your networking, or have other system-level issues. But for the 99% of cases where things are working correctly, it should offer substantial benefit that wasn't previously there. "Better is good." On (1), if people use the navigation2 stack today and run into someone, it carries the same liability, so I'm not sure adding this safety functionality would make things worse.
Putting it in the controller server makes a lot of sense. That's the analog of the place I built it into for my Jetson-series robots. I figured I'd make it a component and then offer it as an option alongside the controller server or as a standalone node.
> I don't think Hokuyos offer this internally, but I could be mistaken.
Hokuyo does offer a safety laser (as do OMRON and Pilz). They don't have as sophisticated a controller to go with it as the SICK FlexiSoft or the Pilz controllers, which limits the number of zones that can be defined, especially with regard to speed-related zones.
Got it. I suppose I just didn't use it then. Not sure if the 20-LXs I used had that. I know they have a few other lines outside of that form factor, though.
A set of rough requirements:
- `base_frame`: the reference frame (e.g. 0,0 will be the origin of the `base_frame` parameter, so all polygons are defined relative to it)
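To make the polygon convention concrete, here is a minimal sketch of testing whether a sensed point falls inside a stop zone defined relative to `base_frame` (0,0 at the `base_frame` origin). The function name, zone shape, and coordinates are illustrative assumptions, not the actual implementation; points and vertices are assumed to already be transformed into `base_frame`.

```python
def point_in_polygon(px, py, polygon):
    """Ray-casting test; polygon is a list of (x, y) vertices in base_frame."""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # Only edges that straddle the horizontal ray at py can cross it
        if (y1 > py) != (y2 > py):
            # x-coordinate where this edge crosses the ray
            x_cross = x1 + (py - y1) * (x2 - x1) / (y2 - y1)
            if px < x_cross:
                inside = not inside
    return inside

# Hypothetical 1 m x 1 m stop zone in front of the robot, in base_frame
stop_zone = [(0.0, -0.5), (1.0, -0.5), (1.0, 0.5), (0.0, 0.5)]
print(point_in_polygon(0.5, 0.0, stop_zone))   # point ahead of robot -> True
print(point_in_polygon(-0.2, 0.0, stop_zone))  # point behind robot -> False
```

Any lidar return landing inside such a polygon would trigger the stop (or slow) behavior for that zone.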
While this is an area I've typically not considered to belong in navigation, it has come up so many times that maybe it's worth just including. The goal is to make a node that takes in raw sensor data from a lidar (2D, 3D, or I suppose a depth camera; just something that gives ranges in the forward motion direction) and either stops or slows the robot to avoid collisions.
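The core logic of such a node can be sketched without any ROS machinery: reduce a set of ranges in the forward cone to a stop / slow / free decision. The thresholds, field of view, and function name below are assumptions for illustration, not the eventual design.

```python
import math

STOP_DIST = 0.4   # m: anything closer -> command zero velocity (assumed value)
SLOW_DIST = 1.0   # m: anything closer -> scale velocity down (assumed value)

def check_ranges(ranges, angle_min, angle_increment, fov=math.pi / 2):
    """Return 'stop', 'slow', or 'free' from ranges within the forward fov."""
    decision = "free"
    for i, r in enumerate(ranges):
        angle = angle_min + i * angle_increment
        # Ignore returns outside the forward cone and invalid rays (inf/nan)
        if abs(angle) > fov / 2 or not math.isfinite(r):
            continue
        if r < STOP_DIST:
            return "stop"  # any single close return is enough to stop
        if r < SLOW_DIST:
            decision = "slow"
    return decision

# Simulated 181-beam scan over [-90°, 90°], one obstacle at 0.8 m dead ahead
ranges = [5.0] * 90 + [0.8] + [5.0] * 90
print(check_ranges(ranges, -math.pi / 2, math.pi / 180))  # -> "slow"
```

In the real node the decision would gate or scale the outgoing `cmd_vel` rather than just returning a string, and the zones would be polygons rather than a fixed forward cone.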
We've added heartbeat server-alive + lifecycle transition support, and this falls into the same category of safety awareness / promises we can round it off with (e.g. we now know within a timeout that everything is running properly and activated; this will now ensure no collisions using raw safety data, which nicely rounds off the minimum needs for a safe robot regardless of the planning and control algorithms or localization quality).
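The "know within a timeout that everything is running" promise amounts to a watchdog. A minimal sketch, assuming a hypothetical `HeartbeatWatchdog` class (the name, timeout value, and clock injection are not from any existing API):

```python
import time

class HeartbeatWatchdog:
    """Trip if no heartbeat arrives within timeout_s; a tripped watchdog
    would cause the node to command zero velocity."""

    def __init__(self, timeout_s=0.5, clock=time.monotonic):
        self.timeout_s = timeout_s
        self.clock = clock          # injectable for testing
        self.last_beat = clock()

    def beat(self):
        """Call on every heartbeat message from the monitored node."""
        self.last_beat = self.clock()

    def ok(self):
        """True while the most recent heartbeat is within the timeout window."""
        return (self.clock() - self.last_beat) <= self.timeout_s
```

Injecting the clock makes the timeout behavior testable without real sleeps, which matters for anything safety-adjacent.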
In most industrial situations, you might use something like a SICK lidar that has this capability built in and is safety rated. For smaller robots falling under less strict functional safety requirements and budgets, a laser like this may be out of scope, and they need a solution based on the ROS sensors they have available. This is not a substitute for safety certification, but it will give you comparable non-certified capability as long as the sensor and ROS driver are publishing information.
There are 2 major options (we should probably do both and give the user the choice of which to use):
This can be trivially made in about a day:
Useful both for autonomous navigation and as an important element of an assisted-teleop system (does this replace the assisted part?)