apl-ocean-engineering / sonar_image_proc


How to use Bearing Table? #22

Closed HikariS97 closed 2 years ago

HikariS97 commented 2 years ago

Hi, @amarburg

Thanks for your work on liboculus. I have benefited a lot from this project.

I'm working on sonar-based odometry, and I find that the rotation estimation is noticeably less precise than the translation estimation. I think the error is caused by an incorrect mapping of the image from polar coordinates into Cartesian coordinates.

Do you have any ideas on how to use the bearing table received from the Oculus sonar to correctly generate a Cartesian image?

Thanks a lot, Hikari.

amarburg commented 2 years ago

Hi Hikari, sorry for the delay.

Per the comment in the Oculus.h header file we read in the array of bearings in each sonar packet and convert them to floats, which are then used to create the ROS messages, etc. We assume those bearings are the center of each sonar's beam, and empirical evidence is that the bearings are coarsely correct (e.g. objects to the left of the sonar appear to the left in the sonar image, straight edges in the world appear as straight lines in sonar). But there may still be some unresolved bugs.

Beyond that I'm not quite sure I understand your question?

HikariS97 commented 2 years ago

> Hi Hikari, sorry for the delay.
>
> Per the comment in the Oculus.h header file we read in the array of bearings in each sonar packet and convert them to floats, which are then used to create the ROS messages, etc. We assume those bearings are the center of each sonar's beam, and empirical evidence is that the bearings are coarsely correct (e.g. objects to the left of the sonar appear to the left in the sonar image, straight edges in the world appear as straight lines in sonar). But there may still be some unresolved bugs.
>
> Beyond that I'm not quite sure I understand your question?

Hi, amarburg.

Thanks for your reply. I saw that the bearing array is commented out in the header file. You said "we assume those bearings are the center of each sonar's beam". Could I ask where those bearings are? So far I have assumed that the beams are equally distributed over the FOV.

amarburg commented 2 years ago

In each data packet from the sonar, there are actually three data structures in a row: the OculusSimplePingResult header, the array of bearings, and then the image data.

They are just one after the other in the packet. There is no top-level struct/union supplied by Oculus which contains the bearing array as a member, for example.

What we do in liboculus is map those last two data structures into C++ classes which let us read them. See here.

SimplePingResult._bearings is then an instance of BearingData, which provides array-like access to the data and also does the conversion from Oculus's "100ths of a degree" format to float degrees.

amarburg commented 2 years ago

I think I can see the issue. In sonar_image_proc, when we build the map for the Cartesian-to-polar conversion, we assume equally spaced beams here. This was a quick simplification: we assumed the beams were both equally spaced and evenly distributed across the field of view.

We should use the bearing data to make a more accurate map.

HikariS97 commented 2 years ago

Hi, amarburg. Thanks for your reply.

Yes, you understand my question.

Could you explain more about "also does the conversion from Oculus's '100ths of a degree' format to float degrees"? I have no idea what this means.

Thanks.

amarburg commented 2 years ago

Sure. The bearing data is stored in an array of shorts after the header, so it's effectively like the data from the sonar is:

```c
// Pseudocode only: the variable-length bearings[] member isn't valid C,
// but it shows the on-the-wire layout of one sonar packet.
struct TheWholeSonarPacket {
  OculusSimplePingResult header;
  short bearings[header.nBeams];  // each entry in 100ths of a degree
  void *image_data;               // image bytes follow the bearings
};
```

The data in the bearings[] array is the angle of each beam in 100ths of a degree. Our abstraction just converts that to a floating point value in degrees rather than making the end user remember that it's in 100ths of a degree.

HikariS97 commented 2 years ago

Thank you amarburg!