CCNYRoboticsLab / imu_tools

ROS tools for IMU devices

Wrong Initial Orientation #141

Closed jacklu333333 closed 3 years ago

jacklu333333 commented 3 years ago

Hi, below is my output data from imu_filter_madgwick.

This is the command I use; the distro I am running is melodic and the world frame is x-north, y-west, z-up:

rosrun imu_filter_madgwick imu_filter_node _use_mag:=false _publish_tf:=false _world_frame:="nwu" /imu/data_raw:=/imu_base /imu/mag:=/mag_base

As you can see in the output below, the orientation is rotated about the Z-axis by -90 degrees. What I expected is an output close to x=y=z=0, w=1, i.e. the identity orientation at the initial state. Did I misconfigure the parameters or misunderstand something?

orientation: 
  x: -0.00099724882288
  y: -0.00638916875723
  z: -0.705309044765
  w: 0.708870464465
orientation_covariance: [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0]
angular_velocity: 
  x: -0.00128281700022
  y: 0.000733038285838
  z: 0.000818559419185
angular_velocity_covariance: [0.01, 0.0, 0.0, 0.0, 0.01, 0.0, 0.0, 0.0, 0.01]
linear_acceleration: 
  x: 0.0941536899296
  y: 0.0699902193121
  z: 9.87016819049
linear_acceleration_covariance: [0.01, 0.0, 0.0, 0.0, 0.01, 0.0, 0.0, 0.0, 0.01]

Thanks in advance.

mintar commented 3 years ago

If you use a magnetometer, the yaw orientation will reflect the direction to magnetic north.

jacklu333333 commented 3 years ago

Thanks for the fast reply. I haven't dug into the algorithm yet, but is it possible to initialize the orientation to x=y=z=0, w=1 instead of an arbitrary orientation?

I am asking because I use the package to represent the orientation of the robot pose, not its heading in the world.

mintar commented 3 years ago

There was a similar issue a while ago (#120) that was fixed by PR #121 in the ROS2 version of this repo. I have just ported #121 back to noetic: https://github.com/ccny-ros-pkg/imu_tools/commit/011d0009ac804380756ea6fb0694e6cc1b1146ed. You'll have to build from source to get this feature until the next binary release.

With this addition, there is a parameter called yaw_offset that you can tune to shift the yaw to your liking.
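For example, based on the command from your first post (a sketch only; the _yaw_offset value is illustrative, given in radians as described later in this thread):

rosrun imu_filter_madgwick imu_filter_node _use_mag:=false _publish_tf:=false _world_frame:="nwu" _yaw_offset:=1.5708 /imu/data_raw:=/imu_base /imu/mag:=/mag_base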

Note however that you are probably taking a lot of shortcuts here. This is how a (wheeled) robot is usually set up:

  • imu_filter_madgwick fuses gyro, accelerometers, and magnetometer into an orientation in the IMU message.
  • robot_localization fuses the output of the previous step with the wheel odometry and publishes the odom -> base_link TF transform. Usually you'll set up robot_localization to just use the difference in orientation, so the absolute value of the yaw angle doesn't matter.
  • amcl then localizes the robot to the map using a laser scanner and publishes a map -> odom correction TF.

jacklu333333 commented 3 years ago

Thank you for the fast and detailed reply. It helps a lot. The noetic version works perfectly.

jcremona commented 3 years ago

There was a similar issue a while ago (#120) that was fixed by PR #121 in the ROS2 version of this repo. I have just ported #121 back to noetic: 011d000. You'll have to build from source to get this feature until the next binary release.

With this addition, there is a parameter called yaw_offset that you can tune to shift the yaw to your liking.

Note however that you are probably taking a lot of shortcuts here. This is how a (wheeled) robot is usually set up:

  • imu_filter_madgwick fuses gyro, accelerometers, and magnetometer into an orientation in the IMU message.
  • robot_localization fuses the output of the previous step with the wheel odometry and publishes the odom -> base_link TF transform. Usually you'll set up robot_localization to just use the difference in orientation, so the absolute value of the yaw angle doesn't matter.
  • amcl then localizes the robot to the map using a laser scanner and publishes a map -> odom correction TF.

Dear @mintar, thank you for your great work. I have a similar problem. When you say this, do you mean the differential parameter of robot_localization (the one that allows it to compute velocities from poses)? You answered a similar comment here: https://github.com/ccny-ros-pkg/imu_tools/issues/123#issuecomment-686311772. In my particular case, the IMU doesn't have a magnetometer.

jacklu333333 commented 3 years ago

Hi @jcremona, I did not fully understand what your problem is, but I guess you want to set the initial offset of the orientation. I hope the following helps.

First, clone the repo into your workspace (something like ~/catkin_ws/).

Second, go to the folder of the repo (something like ~/catkin_ws/imu_tools) and check out the commit: git checkout 011d0009ac804380756ea6fb0694e6cc1b1146ed

Then go back to your workspace (~/catkin_ws/) and build the package with catkin_make or catkin build.
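Putting the steps together, a minimal shell sketch (assuming a standard catkin workspace with a src/ directory; adjust the paths to your setup):

# clone the repo and check out the commit with the yaw_offset back-port
cd ~/catkin_ws/src
git clone https://github.com/ccny-ros-pkg/imu_tools.git
cd imu_tools
git checkout 011d0009ac804380756ea6fb0694e6cc1b1146ed

# build the workspace
cd ~/catkin_ws
catkin_make        # or: catkin build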

In your launch file, you can specify the yaw_offset parameter to adjust the initial yaw angle in radians, as sketched below. The package might have build errors on a different ROS distro; in my case, I am using the melodic distro.
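A hypothetical launch-file snippet (node name, parameter value, and remappings are only illustrative; use your own topics):

<node pkg="imu_filter_madgwick" type="imu_filter_node" name="imu_filter_madgwick" output="screen">
  <param name="use_mag" value="false"/>
  <param name="publish_tf" value="false"/>
  <param name="world_frame" value="nwu"/>
  <!-- illustrative value: shift the reported yaw by roughly +90 degrees -->
  <param name="yaw_offset" value="1.5708"/>
  <remap from="imu/data_raw" to="/imu_base"/>
  <remap from="imu/mag" to="/mag_base"/>
</node>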

However, as @mintar said, this is a shortcut method. If you have an onboard IMU on the robot base, it is best to fuse it with the wheel encoders. In my case, it improved the linear movement but not the rotation.

Hope this will help you.

Best regards, Jack Lu

jcremona commented 3 years ago

Hi @jacklu333333, thanks for your answer. In my case, I have an IMU without a magnetometer, and I wanted to fuse the IMU with other sensors using robot_localization. For this, I used imu_filter_madgwick, as it gives me an orientation to rotate the acceleration vector. I also wanted to take advantage of that orientation and fuse it with the rest of the sensors. According to @mintar's answer, "Usually you'll set up robot_localization to just use the difference in orientation, so the absolute value of the yaw angle doesn't matter". Is this achieved by using the differential parameter of robot_localization? He answered something similar here: https://github.com/ccny-ros-pkg/imu_tools/issues/123#issuecomment-686311772 Thanks!

mintar commented 3 years ago

No, I don't mean the differential parameter. If you look at the configuration I linked in https://github.com/ccny-ros-pkg/imu_tools/issues/123#issuecomment-687144177, you'll see that the differential parameter is false.

I've just double-checked, and when I said "You should configure your localization package (robot_localization or robot_pose_ekf) to just use the difference in yaw orientation, not the absolute value", that is only true for the (obsolete) robot_pose_ekf.

In my robot_localization config, I do it like this:

# Each sensor reading updates some or all of the filter's state. These options give you greater control over which
# values from each measurement are fed to the filter. For example, if you have an odometry message as input, but only
# want to use its Z position value, then set the entire vector to false, except for the third entry. The order of the
# values is x, y, z, roll, pitch, yaw, vx, vy, vz, vroll, vpitch, vyaw, ax, ay, az. Note that some message types
# do not provide some of the state variables estimated by the filter. For example, a TwistWithCovarianceStamped message
# has no pose information, so the first six values would be meaningless in that case. Each vector defaults to all false
# if unspecified, effectively making this parameter required for each sensor.
# see http://docs.ros.org/en/noetic/api/robot_localization/html/configuring_robot_localization.html
odom0_config: [false, false, false,   # x y z
               false, false, false,   # roll pitch yaw
               true,  true,  false,   # vx vy vz
               false, false, true,    # vroll vpitch vyaw
               false, false, false]   # ax ay az

# [ADVANCED] When the node starts, if this parameter is true, then the first measurement is treated as a "zero point"
# for all future measurements. While you can achieve the same effect with the differential parameter, the key
# difference is that the relative parameter doesn't cause the measurement to be converted to a velocity before
# integrating it. If you simply want your measurements to start at 0 for a given sensor, set this to true.
odom0_relative: false

imu0_config: [false, false, false,   # x y z
              false, false, true,    # roll pitch yaw
              false, false, false,   # vx vy vz
              false, false, true,    # vroll vpitch vyaw
              true,  false, false]    # ax ay az

imu0_relative: true

In other words, I'm using vx, vy and vyaw from the odometry (so only the velocities) and yaw, vyaw and ax from the IMU. So I am actually using the absolute yaw angle from the IMU, but since the parameter imu0_relative is set to true, the first angle reported by the IMU on startup will be the starting point (0) - it doesn't matter what the real absolute angle is.

Keep in mind that this is all in the odometry coordinate frame. So the robot odometry will start at x = 0, y = 0, yaw = 0 when robot_localization is started; you'll additionally need a "real" localization package like AMCL on top that will use laser scans or similar to localize the robot in the map frame.

See http://docs.ros.org/en/noetic/api/robot_localization/html/configuring_robot_localization.html for further info.

jcremona commented 3 years ago

Thank you very much for your kind reply.