CCNYRoboticsLab / imu_tools

ROS tools for IMU devices

How to get x, y, z values from IMU? #174

Closed tsadarsh closed 1 year ago

tsadarsh commented 1 year ago

This is a very basic question. Please bear with me. I tried doing my research but I could not find a satisfactory answer.

I am running ROS2 Humble on Pop!_OS 22.04 LTS. I am trying to get odometry data for my robot. I have an Intel RealSense D435i, which has an integrated IMU. I could visualise the IMU data in RViz by running the imu_filter_madgwick node. Here is a detailed discussion that happened with @MartyG-RealSense.

I am now able to see the roll, pitch, and yaw motion of the camera. Here is a demo video that I made. How do I get the x, y, and z motion of the camera from the IMU data?

ros2 topic echo /imu/data gives me the following data:

header:
  stamp:
    sec: 1671198831
    nanosec: 132886528
  frame_id: camera_imu_optical_frame
orientation:
  x: -0.6028298717308699
  y: -0.5837287604852286
  z: -0.42123859126194174
  w: -0.3441147034972153
orientation_covariance:
- 0.0
- 0.0
- 0.0
- 0.0
- 0.0
- 0.0
- 0.0
- 0.0
- 0.0
angular_velocity:
  x: -0.005235987715423107
  y: 0.0
  z: 0.0
angular_velocity_covariance:
- 0.01
- 0.0
- 0.0
- 0.0
- 0.01
- 0.0
- 0.0
- 0.0
- 0.01
linear_acceleration:
  x: 1.0493115186691284
  y: 8.953471183776855
  z: -4.001112937927246
linear_acceleration_covariance:
- 0.01
- 0.0
- 0.0
- 0.0
- 0.01
- 0.0
- 0.0
- 0.0
- 0.01

Am I supposed to write a custom node that double integrates the linear_acceleration values to get the x, y and z motion? Or is there an existing package/node that I can make use of?

mintar commented 1 year ago

IMU data is mostly useful for orientation. To get the position, you would have to integrate the linear acceleration twice: Once to get the velocity, and once more to get the position. Any small noise in the acceleration leads to an offset in the velocity, so the position will drift away continuously even if the IMU is stationary.
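The drift described above can be made concrete with a small sketch (the bias value and rate are assumptions for illustration, not Realsense specs): even a tiny constant accelerometer bias, integrated twice, grows quadratically in position.

```python
# Sketch: double-integrating a "stationary" IMU with a small constant bias.
# True acceleration is zero; the reading is only the bias. Position error
# grows roughly as 0.5 * bias * t^2.
dt = 0.005    # assumed 200 Hz IMU sample period
bias = 0.01   # assumed accelerometer bias, m/s^2

vel = 0.0
pos = 0.0
for step in range(int(60 / dt)):   # one minute of samples
    accel = bias                   # reading = bias, robot is not moving
    vel += accel * dt              # first integration: velocity
    pos += vel * dt                # second integration: position

print(f"after 60 s: velocity = {vel:.2f} m/s, position = {pos:.1f} m")
# The robot "moved" about 18 m while standing still.
```

This is why raw IMU double integration is unusable for position over more than a second or two, while the orientation estimate (from the filter) stays well-behaved.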

I don't know what robot you are using, but for wheeled robots, the standard procedure is to fuse the IMU data with the wheel odometry. The nice thing about this is that IMU and wheel odometry have complementary strengths and weaknesses: the IMU is good at estimating rotation, and the wheel odometry is good at estimating translation.

The standard package to use for fusing wheel odometry and IMU data is robot_localization.

You should feed two kinds of data into robot_localization:

- your wheel odometry (a nav_msgs/Odometry topic), and
- the fused IMU output from imu_filter_madgwick (the sensor_msgs/Imu topic /imu/data).

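As a starting point, the two inputs might be wired up in an EKF parameter file roughly like this (a sketch only: the topic names are placeholders for your robot, and the parameter names follow the robot_localization documentation). Each `_config` list selects, in order, [x, y, z, roll, pitch, yaw, vx, vy, vz, vroll, vpitch, vyaw, ax, ay, az]:

```yaml
# hypothetical ekf.yaml for robot_localization's ekf_filter_node
ekf_filter_node:
  ros__parameters:
    frequency: 30.0
    two_d_mode: true          # wheeled robot driving on a plane
    odom_frame: odom
    base_link_frame: base_link
    world_frame: odom

    # wheel odometry: use forward/lateral velocity and yaw rate
    odom0: /wheel/odometry    # placeholder topic name
    odom0_config: [false, false, false,
                   false, false, false,
                   true,  true,  false,
                   false, false, true,
                   false, false, false]

    # IMU: use orientation and yaw rate from imu_filter_madgwick
    imu0: /imu/data
    imu0_config: [false, false, false,
                  true,  true,  true,
                  false, false, false,
                  false, false, true,
                  false, false, false]
```

The fused estimate is then published on /odometry/filtered, together with the odom → base_link transform.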
Then once that's working, if your robot has a laser scanner or similar (can also be simulated with the realsense), you can layer a localization method such as AMCL on top.

tsadarsh commented 1 year ago

Thank you @mintar for the detailed response. This clarifies ✅ my query, and I now understand how to get odometry data for my robot: I will fuse the IMU data with the wheel encoder data using the ROS robot_localization package 📦 to get the position and rotation.

Thank you once again 😊

mintar commented 1 year ago

Glad I could help!