Publishing Sonar FOV as a Marker Array

Goal/Background

An issue arose while using the sonar_poincloud.py utility to troubleshoot the sonar FOV.
We currently publish "slices" of points corresponding to the sonar FOV, where each point is colored based on the return value from the sonar. Here's the point cloud with 2 slices:
This image also shows the lag in generation: the Oculus is at nearly 0 deg, but the returned cloud is quite far behind.
This results in many, many points, and it can be very difficult to see which points have a positive return value and which are zeroes. I tried simply setting the alpha channel to 0 for the points whose values were 0, but RViz does not respect the published alpha channel (see the pointcloud documentation and the two linked issues). Even if that had worked, it wouldn't solve the problem of the sheer number of points.
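For reference, the attempted alpha workaround amounts to packing an alpha byte into the single packed-float color field that PointCloud2-style color channels use. A minimal sketch (illustrative only, not the utility's actual code):

```python
import struct

def pack_rgba(r, g, b, a):
    """Pack 8-bit RGBA channels into the single float32 value used by
    packed 'rgba' point cloud color fields. Setting a=0 for zero-return
    points was the attempted fix; RViz ignores the alpha byte when
    rendering the cloud, so the points stay opaque."""
    rgba = (a << 24) | (r << 16) | (g << 8) | b
    return struct.unpack('f', struct.pack('I', rgba))[0]
```

Because RViz discards the alpha byte on the rendering side, no value placed there changes how the cloud is drawn.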
During a discussion with @lauralindzey, she suggested splitting the troubleshooting into two options:
1. A mesh showing the sonar's current FOV, with a fixed alpha value and no support for sonar return values.
2. A point cloud showing only positive return values.
This PR is the first half of that implementation.
Implementation
This PR adds a new node, sonar_fov.py, that subscribes to /sonar_image and publishes a MarkerArray.
The node procedurally generates a mesh corresponding to the sonar's FOV under the current configuration (frequency, range). RViz renders that mesh as a triangle list (a list of vertex triplets, each triplet defining one surface triangle) to generate the visuals.
The mesh isn't perfect (I haven't generated the arc points along the elevation), but I think it works well for quick analysis of the FOV.
This solution:
is easier to see through
fully encompasses the FOV, as opposed to user-selected slices of returns
is much faster to generate and render in RViz
pulls parameters directly from the sonar image message rather than relying on user-defined values
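The mesh generation can be sketched roughly as below. Function and parameter names are illustrative, not the node's actual API, and — like the PR — it skips arc points along the elevation, so the top and bottom faces are flat fans:

```python
import math

def sonar_fov_triangles(azimuth_fov, elevation_fov, max_range, n_az=20):
    """Sketch: procedurally build a triangle list approximating the sonar
    FOV wedge, as vertex triplets in the format a Marker TRIANGLE_LIST
    expects. Angles in radians, range in meters."""
    def pt(az, el):
        # Point on the outer spherical face at the given azimuth/elevation.
        return (max_range * math.cos(el) * math.cos(az),
                max_range * math.cos(el) * math.sin(az),
                max_range * math.sin(el))

    apex = (0.0, 0.0, 0.0)
    half_el = elevation_fov / 2.0
    tris = []
    for i in range(n_az):
        a0 = -azimuth_fov / 2.0 + azimuth_fov * i / n_az
        a1 = -azimuth_fov / 2.0 + azimuth_fov * (i + 1) / n_az
        # Top and bottom faces: fan triangles from the sonar origin.
        for el in (-half_el, half_el):
            tris.append((apex, pt(a0, el), pt(a1, el)))
        # Curved outer face: one quad per azimuth step, split into two triangles.
        tris.append((pt(a0, -half_el), pt(a1, -half_el), pt(a1, half_el)))
        tris.append((pt(a0, -half_el), pt(a1, half_el), pt(a0, half_el)))
    return tris
```

In the actual node these triplets would be copied into the `points` array of a TRIANGLE_LIST Marker, with the frequency- and range-dependent angles pulled from the incoming sonar image message.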
Results
Here are some videos showing the implementation. Bag played back at half speed: https://user-images.githubusercontent.com/97049012/202525681-275c750a-1ece-4f88-ad29-f06779c31efa.mp4
Bag played back at full speed: https://user-images.githubusercontent.com/97049012/202525789-c2d4e597-1ad1-42c7-80f9-f9985a6b3746.mp4
At full speed, it's pretty clear how badly the point cloud lags.
Bag with only the mesh: https://user-images.githubusercontent.com/97049012/202554446-85119c67-ffe0-4aea-a4b0-2cacb458e728.mp4
Only the mesh, with the transparency issues fixed: https://user-images.githubusercontent.com/97049012/202739634-f3a761a4-1b09-4eb4-aa64-87808bad7ffe.mp4