cra-ros-pkg / robot_localization

robot_localization is a package of nonlinear state estimation nodes. The package was developed by Charles River Analytics, Inc. Please ask questions on answers.ros.org.
http://www.cra.com

[Question] Can a yaw offset exist between the utm and map frames when using GPS data? #727

Closed pepisg closed 2 years ago

pepisg commented 2 years ago

Hi! I'm using robot_localization to implement a GPS waypoint follower in nav2. For that, I use the fromLL service to convert GPS coordinates into poses in the map frame. From what I saw in the code, this service converts the GPS coordinates into the utm frame and later transforms them to the map frame assuming there is no rotation between them.
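For context, this is roughly how I call the service (a minimal sketch in ROS 2 Python, assuming the FromLL service type with a geographic_msgs/GeoPoint request and a geometry_msgs/Point response; the coordinates are placeholders):

```python
import rclpy
from rclpy.node import Node
from robot_localization.srv import FromLL

rclpy.init()
node = Node('fromll_client')
client = node.create_client(FromLL, 'fromLL')
client.wait_for_service()

# Placeholder GPS goal coordinates.
req = FromLL.Request()
req.ll_point.latitude = 40.0
req.ll_point.longitude = -105.0
req.ll_point.altitude = 0.0

future = client.call_async(req)
rclpy.spin_until_future_complete(node, future)
map_point = future.result().map_point  # position in the map frame, no orientation
node.get_logger().info(f'map-frame goal: ({map_point.x:.2f}, {map_point.y:.2f})')
```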

I picked up this work from someone else, who had pointed out that a yaw offset may exist between the utm and map frames. Such an offset would place the returned point in a coordinate system rotated with respect to the actual map frame, making the returned Cartesian point invalid.

However, my understanding is that both of those frames have to follow the ENU convention, meaning that both should have their +X axes pointing east. I think that's also stated here.

I tested this in Gazebo by placing a robot with different initial orientations and running robot_localization with the setup proposed in the tutorial. Regardless of the robot's heading when robot_localization was launched, the map frame always seemed to point east (which seems to confirm that it follows the ENU convention). When running tf2_echo I got the following quaternion in all scenarios:

Rotation: in Quaternion [0.000, 0.000, 0.001, 1.000]
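which, for a rotation purely about Z, corresponds to essentially zero yaw; a quick check (plain math, just for reference):

```python
import math

# For a pure-yaw quaternion (x = y = 0), yaw = 2 * atan2(z, w).
z, w = 0.001, 1.000
print(math.degrees(2.0 * math.atan2(z, w)))  # ~0.11 degrees, i.e. essentially no offset
```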

Can this offset exist? If that is the case, should it be accounted for in the fromLL service?

Thanks!

P.S.: This same question was already posted on answers.ros.org, without success.

P.P.S.: There is an ongoing discussion in the nav2 PR, in case you want to take a look.

SteveMacenski commented 2 years ago

@ayrton04 any thoughts? This is a topic of interest for the Nav2 project, and I'm definitely not an expert on GPS topics.

ayrton04 commented 2 years ago

What do we mean when we say the map frame always pointed east? The map frame is totally independent of the UTM frame. In fact, the map frame has no "knowledge" of east.

Take navsat_transform out of the picture for a moment. Let's say you have a robot and only fuse its wheel encoder odometry. Your EKF output is configured to be in the map frame. When your robot first starts up, it will be at the origin in the map frame, that is, it will be at (0, 0) and pointing along the X axis. But that will be true no matter which way your robot faces on earth. Its map coordinate frame is just defined by which way it's facing at startup.

However, if you are fusing absolute yaw from an IMU (magnetometer) in your map-frame EKF, then obviously the robot's 0 heading will be whatever the IMU thinks is 0 heading.

But all of that is independent of the UTM frame.

Anyway, I am AFK, so I can't get too deep into this until next week at the earliest (but next week is a major trade show, so time will be tight). I need to take the time to understand your issue better.

SteveMacenski commented 2 years ago

In fact, the map frame has no "knowledge" of east.

My comment as well

pepisg commented 2 years ago

I see. I thought that to use robot_localization with GPS data it was necessary to provide absolute yaw measurements.

So, if the orientation of the map frame is independent of the utm frame (which, from the docs, I think does point east), how does robot_localization know the angular offset between the two frames? From what I understand from the docs, this offset is the exposed yaw_offset parameter. However, that is a fixed value, probably hardcoded in a launch file, so I don't see how it could work if the map frame points wherever the robot is facing at startup. And even if yaw_offset could be changed at runtime, it would require knowing the absolute heading at some point, so why not provide it in the first place?

This angle is necessary, IMO, for the /FromLL service to return an accurate value, as the result would be totally different if, say, the map frame were pointing east versus pointing west.

I'm aware things may change when taking the navsat_transform node out of the equation; however, I think it's safe to look at this under the assumption that navsat_transform is being launched and robot_localization is being used with GPS data, since the fromLL service is offered by navsat_transform anyway.

ayrton04 commented 2 years ago

assuming there is no rotation between them

If memory serves, that's not what that code is doing. GPS positions have no orientation component to them. They're just positions. That is still true when we convert the lat/long to UTM coordinates. The line of code above the one you linked applies the transform to the UTM position to get it into the map frame. But because that transform has an orientation to it, the resulting pose will, too. But we only care about the position, so we zero out the orientation. Why I didn't just store the UTM position in a vector is beyond me, but then I am reading the code on my phone and may be forgetting something. I don't spend much (read: any) time looking at this code these days, sadly.
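Roughly, the relevant part of the service does something like this (a simplified 2D sketch, not the actual implementation; the lat/long-to-UTM conversion here uses the third-party utm Python package purely as a stand-in for the conversion the node does internally):

```python
import numpy as np
import utm  # third-party package, stand-in for the node's internal lat/long -> UTM conversion

def from_ll_sketch(lat, lon, T_map_utm):
    """Simplified fromLL: lat/long -> UTM -> map frame, position only.

    T_map_utm is a 3x3 homogeneous 2D transform (map <- utm), standing in for
    the UTM-to-map transform the node maintains.
    """
    easting, northing, _, _ = utm.from_latlon(lat, lon)
    utm_point = np.array([easting, northing, 1.0])
    map_point = T_map_utm @ utm_point
    # The transform carries an orientation, so a full pose would inherit it,
    # but a GPS fix has no orientation; only the position is returned.
    return map_point[0], map_point[1]
```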

In any case, I'll take a look at this early next week and we'll sort out the issue. Thanks for your patience, all.

ayrton04 commented 2 years ago

Sorry, I should clarify. You are likely already aware of much of this, but I want to cover everything so we're on the same page.

The earth-referenced heading is required for navsat_transform_node, but it is not required for the EKF.

The UTM grid is aligned such that +X is east, and +Y is north. The GPS gives us our position within the UTM grid, but not our orientation. To get that, we need an earth-referenced heading. That's why you need an IMU with a magnetometer. But most IMUs report 0 at magnetic north, whereas we need it to read 0 at geographic east. That's what the yaw_offset and magnetic_declination_radians parameters do. They make it so navsat_transform_node is aware of our heading in the UTM frame. All of this logic exists just within navsat_transform_node.
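As a rough illustration of that correction (treat the exact sign conventions as an assumption and check the docs; the numbers are made up):

```python
import math

# Turning a north-referenced, magnetometer-based yaw into an east-referenced
# (UTM-grid) yaw using the two navsat_transform_node parameters.
raw_imu_yaw = math.radians(10.0)                  # what the IMU reports (0 = magnetic north)
yaw_offset = math.radians(90.0)                   # moves the zero point from north to east
magnetic_declination_radians = math.radians(8.0)  # hypothetical local declination

utm_yaw = raw_imu_yaw + yaw_offset + magnetic_declination_radians
print(math.degrees(utm_yaw))  # heading measured from UTM +X (east)
```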

So now we know our full pose in the UTM frame. But our robot is not driving around in the UTM frame. It's operating in its own coordinate frame that is relative to wherever it started. There are exceptions to this, but for now, let's just assume that when you turn your robot on, your map frame EKF is fusing just wheel encoder odometry velocity and angular velocity from an IMU. When the EKF starts, the robot will be at (0, 0) with a 0 heading. In rviz (if viewing in the map frame), it will appear to be facing to the right, because that's a heading of 0 in the map frame.

So now we know our pose in the map frame, and we know our pose in the UTM frame. That means we can compute a transform from one frame to the other. So now we can take a lat/long, convert it to UTM coordinates, and then transform it to the map frame. It's just a position, though. There's obviously no orientation component to a GPS position. We can then publish that map-frame position, and fuse it in the map-frame EKF. That transformation is all navsat_transform_node is doing.
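In (very) simplified 2D terms, that computation looks something like the following sketch (made-up numbers, not the node's actual code):

```python
import numpy as np

def make_tf(x, y, yaw):
    """Homogeneous 2D transform for a pose (x, y, yaw)."""
    c, s = np.cos(yaw), np.sin(yaw)
    return np.array([[c, -s, x],
                     [s,  c, y],
                     [0.0, 0.0, 1.0]])

robot_in_utm = make_tf(500100.0, 4425300.0, np.radians(90.0))  # GPS position + corrected IMU yaw
robot_in_map = make_tf(2.0, 1.0, 0.0)                          # map-frame EKF output

# Transform that maps UTM coordinates into the map frame:
T_map_utm = robot_in_map @ np.linalg.inv(robot_in_utm)

goal_utm = np.array([500120.0, 4425310.0, 1.0])  # a converted lat/long, position only
goal_map = T_map_utm @ goal_utm
print(goal_map[:2])  # the map-frame position that fromLL-style logic would return
```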

Note that if your robot turned on and drove around for a bit, and then you started up navsat_transform_node, this logic would still work. You'd still have your map-frame pose and a UTM-frame pose, and could compute a transform between them. The same goes for the situation where you start indoors and then move outdoors and get your first GPS fix.

So I guess I want to better understand the actual issue. In what way are the values returned by fromLL not desirable?

pepisg commented 2 years ago

Hi! Thanks for the clarification. I think I was confused because I have always been using absolute orientation data following the ENU convention, so I did not think it was possible for the map frame to point differently. However, today I tested an IMU that follows the NWU convention and, as you pointed out, there is now an offset of 90 degrees between map and utm. This, however, was accounted for in the response of the /fromLL service, which in the end always returns the point correctly in the map frame.
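(For reference, a pure 90-degree yaw offset shows up in tf2_echo as roughly [0, 0, 0.707, 0.707]; a quick check:)

```python
import math

# Quaternion for a pure yaw rotation: z = sin(yaw / 2), w = cos(yaw / 2).
yaw = math.radians(90.0)
print(round(math.sin(yaw / 2), 3), round(math.cos(yaw / 2), 3))  # 0.707 0.707
```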

So, to conclude, everything is working correctly; I apologize for the confusion. Thanks a lot for helping me understand this better and sort it out.

ayrton04 commented 2 years ago

No need for apologies! If anything, I should apologize for the ridiculous delays in responding. Finding time is hard these days.