Closed: CaptKrasno closed this issue 2 years ago.
@CaptKrasno -- I think we should split out the beam angle nomenclature discussion from the defining coordinate frames discussion, and maybe have a separate coordinate system discussion for each sensor. The comments here are going to get fairly tangled up. (e.g. my response to this would have 4 different threads, given that all but one of these proposed conventions differ from what I'm currently using.)
Are you willing to close this issue and create separate ones to make it easier for people who only care about one sensor to engage with the discussion?
@CaptKrasno - Since it was already fresh in my head, I forked the beam angle discussion to here: https://github.com/apl-ocean-engineering/hydrographic_msgs/issues/9. I'll wait to respond to the rest of this proposal until you say whether you'd be amenable to splitting it up into per-sensor chunks.
> @CaptKrasno -- I think we should split out the beam angle nomenclature discussion from the defining coordinate frames discussion, and maybe have a separate coordinate system discussion for each sensor. The comments here are going to get fairly tangled up. (e.g. my response to this would have 4 different threads, given that all but one of these proposed conventions differ from what I'm currently using.)
> Are you willing to close this issue and create separate ones to make it easier for people who only care about one sensor to engage with the discussion?
I purposely created this as one discussion for all sensor types. My goal was to maximize coordinate frame consistency across message types. (see the General Conventions header)
Bearing that in mind, I am happy to split this issue if you think it would be helpful.
> - Perceptual sensors should have their z-axis oriented in the direction they are "looking"
> - Sensors should have their manufacturer-defined "forward" direction oriented along the x-axis
I'm interested in the standard of +Z always being "downrange" from the sensor towards the perceived scene, in part because it echoes the pinhole camera coordinate system, not that that's necessarily a great prior.
For orienting the X and Y axes, I would consider how the axes affect the signedness vis-à-vis the conventions typically used by the real sensors. [independent of what we actually call the respective angles]
Using a forward-looking imaging sonar as an example (if only because with a forward-facing sensor, "elevation" and "azimuth" more closely map onto their normal usage), then for +X up, +Y starboard:
If instead we use +X starboard, +Y down:
Similarly, for e.g. a downward looking multibeam, I believe positive cross-track beam angles are to vehicle starboard, implying a rearward facing axis?
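To make the sign consequences concrete, here is a small sketch. The convention is illustrative only (an assumption, not a settled proposal): +Z downrange, with the beam direction obtained by rotating the +Z axis about +Y and then about +X, both right-handed.

```python
import math

def beam_direction(rot_about_x, rot_about_y):
    """Unit vector for a beam in a Z-downrange sensor frame.

    Illustrative convention (an assumption, not a settled proposal):
    start from +Z, rotate about +Y by rot_about_y, then about +X by
    rot_about_x, both right-handed rotations.
    """
    cy, sy = math.cos(rot_about_y), math.sin(rot_about_y)
    cx, sx = math.cos(rot_about_x), math.sin(rot_about_x)
    # Ry applied to (0, 0, 1) gives (sy, 0, cy); then apply Rx.
    return (sy, -sx * cy, cx * cy)

# With +X up / +Y starboard: a positive rotation about +Y tips the
# beam toward +X (upward), while a positive rotation about +X tips
# it toward -Y (port) -- this is where the sign disagreements with
# common "azimuth" conventions come from.
```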
Thanks for your feedback @amarburg !
> Using a forward-looking imaging sonar as an example (if only because with a forward-facing sensor, "elevation" and "azimuth" more closely map onto their normal usage), then for +X up, +Y starboard:
I agree the orientation of x and y are the most arbitrary. I was going for the philosophy of x points along the direction of forward motion in the normal configuration for all sensors. With this convention DVL, USBL and Multibeam can all have a similar coordinate frame. I am totally willing to modify this though if we get some support.
A comment from @valschmidt from another thread seems relevant here regarding conventions:
> My preference to your suggestions is Proposal B (x-forward, y-stbd, z-down for a downward mounted system). This is in keeping with the POS/MV convention and that of Kongsberg. I think it also coincides with the SNAME convention for coordinate systems for modeling ships.
> positive "elevation" around Y is upwards in the sonar field of view, which agrees with the general convention
> positive "azimuth" around X is leftward in the sonar field of view, which disagrees with the general convention for azimuth
I think we need to be careful here. The angles reported by multibeam sonars are not defined by rotations about an axis, but rather as an elevation from a plane. See my comment here. This is maybe not the case for other imaging sonars. Maybe someone could comment on that? If so, we might need 2 message types, or at least a flag in the message. (imaging and multibeam)
I know we are trying to stay away from the angle names here but I think it is somewhat relevant to this discussion. As you point out, elevation and azimuth can potentially have connotations that may not be accurate depending on sensor orientation. I think that is worth mentioning.
> Perceptual sensors should have their z-axis oriented in the direction they are "looking"
> I'm interested in the standard of +Z always being "downrange" from the sensor towards the perceived scene, in part because it echoes the pinhole camera coordinate system, not that that's necessarily a great prior.
If we're trying to be consistent with a camera coordinate system, there are (at least) two ways to think of that:
For the imaging sonar, I think that every paper I've seen (e.g. Aykin2017, Westman2019, Guerneve2015, Wang2020) has the sonar fan aligned with the sensor axis's X/Y plane, though there isn't consensus on whether X or Y is forward.
As another example, LaserScan.msg uses the X-forward convention, with beam angles specified as rotations about the Z axis.
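For reference, the LaserScan convention mentioned above can be sketched as a small conversion. The field names follow the standard message (angle_min, angle_increment, ranges), but this is an illustration rather than library code:

```python
import math

def laserscan_to_xy(angle_min, angle_increment, ranges):
    # REP 103 / LaserScan convention: X forward, Y left, and beam
    # angles swept as right-handed rotations about +Z.
    return [(r * math.cos(angle_min + i * angle_increment),
             r * math.sin(angle_min + i * angle_increment))
            for i, r in enumerate(ranges)]

# A beam at angle 0 lies along +X; at +pi/2 it points along +Y (left).
```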
@CaptKrasno -- I think you should update your proposal for base_link to have x forward / y starboard / z down. This would match the SNAME1950 convention (as well as the convention for every AUV I've worked on.)
> - XY-plane is the natural plane to project data onto for visualization. ...
Ah, I like this. It's a better rule of thumb for how we treat sensors that project data onto a 2D plane.
> @CaptKrasno -- I think you should update your proposal for base_link to have x forward / y starboard / z down. This would match the SNAME1950 convention (as well as the convention for every AUV I've worked on.)
I wasn't really proposing a coordinate frame to use in general. I think that may be beyond the scope of this package to define. I just put that in as a reference to the standard ROS coordinate frame for a body: https://www.ros.org/reps/rep-0103.html#axis-orientation
Maybe it would be better to just remove that label altogether from the diagram and keep the conversation focused on sensors?
> * XY-plane is the natural plane to project data onto for visualization.
I'm with @amarburg, this is a good concise explanation. I will update it above.
> * However, for a 2D sonar, this would put XY aligned with the sonar fan. If you're doing that, X might as well be centered in the fan / the direction you're looking.
I actually think this works out well. For a 2D sonar, an XY projection will give you a "waterfall" view over time, which seems like a pretty natural way to view multibeam/sidescan data before projection.
@CaptKrasno -- I just saw that you added my wording as a bullet point, but didn't clean up the other wording that it conflicted with or change the images for multibeam & fls. Is this because you haven't had time (understandable!) or because my wording is perhaps not as clear as we seemed to think it was?
For General Conventions, I would suggest:
Sorry for the delay, I was on field work and this slipped off my radar.
> * Sensors that provide 2D data should have axes defined such that their XY-plane is the natural plane to project data onto for visualization. So, for a camera, this results in Z aligned with the optical axis and XY is the imaging plane. For a multibeam or imaging sonar, the sonar fan is in the XY-plane.
I actually think for multibeam and imaging sonars it makes more sense for the fan to be in the YZ or XZ plane. Most of the time with sidescan and multibeam you are more interested in a "waterfall" type plot rather than the fan itself, so a "natural" projection could be an X-vs-time projection. For a 3D imaging sonar, I think Z downrange actually makes more sense, since the output is analogous to a depth-sensing camera. I am worried it is not as clear as we may have initially thought, and it may be somewhat use-case dependent.
So I think I might have to backpedal a little and support picking an axis as downrange. I think it's just a bit more general. The Z axis feels natural to me, but I could be swayed to pick any axis. I think Z works particularly nicely with DVL and USBL, and is reasonable for sonars.
I think it also is in keeping with the proposed 1D sensor convention
I think that the above discussion and image has converged for everything other than a 2D imaging sonar.
Unfortunately, the papers I'm aware of that use imaging sonar in robotics haven't converged on a single convention. However, what they DO agree on is that the Z axis is perpendicular to the imaging plane. That is, Z is vertical for an imaging sonar mounted in a forward-looking position.
Of course, this is not compatible with the more-established multibeam conventions. If this was the only thing that they differed on, I'd adopt multibeam conventions for the imaging sonar, but I'm less and less convinced that the SonarImage message type should even try to support data from the newer MBES systems that also provide water column data. (#17)
So ... I think next steps could be:
After our discussion today I have updated the graphic to match the newly proposed multibeam convention. I will leave it here for discussion for a bit then make the changes as we work on the REP.
Nice! Always good to see that the follow-up matches my meeting notes =)
I was briefly confused by the large base_link X axis on the multibeam subfigure -- if it's easy to edit that diagram, maybe add a separate column showing base_link and SNAME conventions?
Resolved, and the README documents the conventions that we converged upon.
I will be editing this first comment as we adapt it. To comment on a part please quote it and reply to preserve the context of your comment.
-Kris
General Conventions
These conventions should be mostly extendable to any perceptual sensor as more are added.
Multibeam
Note these two angles each define a cone, not a plane. This is important as it corresponds to the physical multibeam geometry. (TODO: I'm not sure if this is the way that other imaging sonars work; can someone chime in? @lauralindzey @valschmidt)
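To make the cone point concrete, here is a minimal sketch. The convention is my own choice for illustration (Z downrange, each angle measured as an elevation from a coordinate plane); the beam direction comes from intersecting the two cones:

```python
import math

def cone_angles_to_unit_vector(angle_x, angle_y):
    """Beam direction from two multibeam-style "cone" angles.

    Assumed convention (illustrative only): angle_x is the elevation
    of the beam from the YZ-plane toward +X, angle_y the elevation
    from the XZ-plane toward +Y, with +Z downrange. Each fixed angle
    defines a cone about an axis, not a plane of rotation.
    """
    sx, sy = math.sin(angle_x), math.sin(angle_y)
    sz_sq = 1.0 - sx * sx - sy * sy
    if sz_sq < 0.0:
        raise ValueError("cones do not intersect in the +Z hemisphere")
    return (sx, sy, math.sqrt(sz_sq))
```

Note that for a single nonzero angle this reduces to a rotation of +Z, which is why the cone/rotation distinction only bites when both angles are steered at once.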
DVL
USBL
The position of the target in the USBL frame should be reported with:
I stayed away from angles because they can be a little ambiguous at times, not to mention the loss of precision near the poles. I also didn't want to use absolute position, since that means you can't do ray tracing or other sound speed corrections in post.
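As a sketch of the kind of post-processing that Cartesian reporting keeps possible: to first order (straight-ray assumption, no refraction), a sound-speed correction just rescales the reported vector, since range scales with sound speed while direction is unchanged. The function name and signature here are hypothetical:

```python
def correct_sound_speed(xyz, c_assumed, c_measured):
    # Straight-ray assumption: two-way travel time is fixed by the
    # measurement, so range scales linearly with sound speed while
    # the beam direction is unchanged. The reported (x, y, z) can
    # therefore be rescaled in post.
    k = c_measured / c_assumed
    return tuple(k * v for v in xyz)
```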
Proposal: Conversions package
I think it might help adoption and consistency if we provide some code/nodes/libraries for conversions. For example,
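One such helper could project per-beam ranges into the sensor frame. This is a hypothetical function, and the frame choice (Z downrange with the fan in the XZ-plane) is just one of the conventions discussed above:

```python
import math

def multibeam_ranges_to_points(beam_angles, ranges):
    # Hypothetical conversion helper: each beam angle is measured
    # from +Z (downrange) toward +X within the fan plane, so each
    # (angle, range) pair maps to a point in the sensor frame.
    return [(r * math.sin(a), 0.0, r * math.cos(a))
            for a, r in zip(beam_angles, ranges)]
```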