While using this driver with our UM7, we noticed that the output data isn't what we expected. I collected a bagfile and plotted the data (using tf.transformations to convert the quaternions to Euler angles). The robot only moves in 2D, and the IMU is mounted parallel to the floor (as in the Hardware Installation section at http://wiki.ros.org/um7). All of the non-noise motion shows up as roll, but with the same magnitudes we expect for yaw, as computed by our mapping framework (which relies primarily on lidar data):
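For reference, this is roughly how the plot was generated from the bag (a minimal sketch; the bag path `imu.bag` and topic `/imu/data` are placeholders for whatever your setup uses):

```python
# Minimal sketch: read orientation quaternions from a bag and plot them
# as Euler angles. Adjust the bag filename and topic name for your setup.
import rosbag
import matplotlib.pyplot as plt
from tf.transformations import euler_from_quaternion

times, rolls, pitches, yaws = [], [], [], []
with rosbag.Bag('imu.bag') as bag:
    for _, msg, t in bag.read_messages(topics=['/imu/data']):
        q = msg.orientation
        roll, pitch, yaw = euler_from_quaternion([q.x, q.y, q.z, q.w])
        times.append(t.to_sec())
        rolls.append(roll)
        pitches.append(pitch)
        yaws.append(yaw)

plt.plot(times, rolls, label='roll')
plt.plot(times, pitches, label='pitch')
plt.plot(times, yaws, label='yaw')
plt.xlabel('time [s]')
plt.ylabel('angle [rad]')
plt.legend()
plt.show()
```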
Has anyone else seen this issue? We're using the standard tf_ned_to_enu parameter. Our sensor is Redshift Labs' RSX-UM7, running firmware version U72B.
I'm going to write a patch for our system that handles the quaternions correctly for our sensor, and I'm happy to upstream it if that would be helpful. I will likely add an option that outputs the IMU data in the sensor frame; we'll then account for the mounting by having the URDF know that the IMU frame is flipped upside down.
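To illustrate the kind of correction I have in mind (this is only a sketch, not the final patch), one way to express "the frame is flipped upside down" on the consumer side is to pre-multiply the driver's orientation by a fixed 180° rotation about X:

```python
# Rough sketch of a post-processing correction, assuming the only problem is
# a fixed 180-degree flip about the sensor X axis. The actual patch for the
# driver may end up handling this differently.
import math
from tf.transformations import quaternion_from_euler, quaternion_multiply

# Quaternion for a 180-degree roll (rotation about X), in [x, y, z, w] order.
FLIP_X = quaternion_from_euler(math.pi, 0.0, 0.0)

def correct_orientation(q_msg):
    """Re-express the driver's orientation quaternion in the upright frame."""
    q = [q_msg.x, q_msg.y, q_msg.z, q_msg.w]
    return quaternion_multiply(FLIP_X, q)
```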