IntelRealSense / realsense-ros

ROS Wrapper for Intel(R) RealSense(TM) Cameras
http://wiki.ros.org/RealSense
Apache License 2.0

D435i IMU uniting modes doesn't publish anything on the /camera/imu channel #598

Closed SteveMacenski closed 5 years ago

SteveMacenski commented 5 years ago

Tried modes linear_interpolation and copy. Tried on firmwares 5.11.1.0, 5.9.13.0, and 5.10.13.0.

With the latest librealsense + realsense2_camera releases (tried both librealsense2 2.17.1 and 2.18.0).

SteveMacenski commented 5 years ago

With warning

 04/02 15:04:52,241 WARNING [140398924134144] (backend-hid.cpp:595) iio_hid_sensor: Frames didn't arrived within 5 seconds

popping up

doronhi commented 5 years ago

Usually the "Frames didn't arrived within 5 seconds" message is related to a firmware problem and requires unplugging and replugging the camera. As I assume you already tried that, is there anything else going on? An especially weak machine? Could you add the command line and the log file? Maybe there are some hints there.

SteveMacenski commented 5 years ago

This is just running rs_camera.launch on an 8th-gen i7 machine with unite_imu_method as the only arg. I also tried it with a bunch of other flags setting the IR and depth rates/resolutions to desired values, with the same issue. The only time I see this error is when unite_imu_method is enabled (in this particular case). I tried with the listed firmwares and versions; there is plenty of CPU and memory to spare.

I also tried unplugging and replugging the camera.

doronhi commented 5 years ago

I updated my comment. Is it possible to post the log file (up to the first appearance of the "5 seconds" warning)?

SteveMacenski commented 5 years ago

Full text:

roslaunch realsense2_camera rs_camera.launch  unite_imu_method:=linear_interpolation
... logging to /home/simbe/.ros/log/9967a054-266c-11e9-b4d8-408d5ca13132/roslaunch-stevenuc-29065.log
Checking log directory for disk usage. This may take awhile.
Press Ctrl-C to interrupt
Done checking log file disk usage. Usage is <1GB.

started roslaunch server http://stevenuc:43615/

SUMMARY
========

PARAMETERS
 * /camera/realsense2_camera/accel_fps: 250
 * /camera/realsense2_camera/accel_optical_frame_id: camera_accel_opti...
 * /camera/realsense2_camera/align_depth: False
 * /camera/realsense2_camera/aligned_depth_to_color_frame_id: camera_aligned_de...
 * /camera/realsense2_camera/aligned_depth_to_fisheye_frame_id: camera_aligned_de...
 * /camera/realsense2_camera/aligned_depth_to_infra1_frame_id: camera_aligned_de...
 * /camera/realsense2_camera/aligned_depth_to_infra2_frame_id: camera_aligned_de...
 * /camera/realsense2_camera/base_frame_id: camera_link
 * /camera/realsense2_camera/clip_distance: -2.0
 * /camera/realsense2_camera/color_fps: 30
 * /camera/realsense2_camera/color_frame_id: camera_color_frame
 * /camera/realsense2_camera/color_height: 480
 * /camera/realsense2_camera/color_optical_frame_id: camera_color_opti...
 * /camera/realsense2_camera/color_width: 640
 * /camera/realsense2_camera/depth_fps: 30
 * /camera/realsense2_camera/depth_frame_id: camera_depth_frame
 * /camera/realsense2_camera/depth_height: 480
 * /camera/realsense2_camera/depth_optical_frame_id: camera_depth_opti...
 * /camera/realsense2_camera/depth_width: 640
 * /camera/realsense2_camera/enable_color: True
 * /camera/realsense2_camera/enable_depth: True
 * /camera/realsense2_camera/enable_fisheye: True
 * /camera/realsense2_camera/enable_imu: True
 * /camera/realsense2_camera/enable_infra1: True
 * /camera/realsense2_camera/enable_infra2: True
 * /camera/realsense2_camera/enable_pointcloud: False
 * /camera/realsense2_camera/enable_sync: False
 * /camera/realsense2_camera/filters: 
 * /camera/realsense2_camera/fisheye_fps: 30
 * /camera/realsense2_camera/fisheye_height: 480
 * /camera/realsense2_camera/fisheye_optical_frame_id: camera_fisheye_op...
 * /camera/realsense2_camera/fisheye_width: 640
 * /camera/realsense2_camera/gyro_fps: 400
 * /camera/realsense2_camera/gyro_optical_frame_id: camera_gyro_optic...
 * /camera/realsense2_camera/infra1_fps: 30
 * /camera/realsense2_camera/infra1_frame_id: camera_infra1_frame
 * /camera/realsense2_camera/infra1_height: 480
 * /camera/realsense2_camera/infra1_optical_frame_id: camera_infra1_opt...
 * /camera/realsense2_camera/infra1_width: 640
 * /camera/realsense2_camera/infra2_fps: 30
 * /camera/realsense2_camera/infra2_frame_id: camera_infra2_frame
 * /camera/realsense2_camera/infra2_height: 480
 * /camera/realsense2_camera/infra2_optical_frame_id: camera_infra2_opt...
 * /camera/realsense2_camera/infra2_width: 640
 * /camera/realsense2_camera/initial_reset: False
 * /camera/realsense2_camera/json_file_path: 
 * /camera/realsense2_camera/linear_accel_cov: 0.01
 * /camera/realsense2_camera/pointcloud_texture_index: 0
 * /camera/realsense2_camera/pointcloud_texture_stream: RS2_STREAM_COLOR
 * /camera/realsense2_camera/rosbag_filename: 
 * /camera/realsense2_camera/serial_no: 
 * /camera/realsense2_camera/unite_imu_method: linear_interpolation
 * /rosdistro: kinetic
 * /rosversion: 1.12.14

NODES
  /camera/
    realsense2_camera (nodelet/nodelet)
    realsense2_camera_manager (nodelet/nodelet)

ROS_MASTER_URI=http://localhost:11311

process[camera/realsense2_camera_manager-1]: started with pid [29082]
process[camera/realsense2_camera-2]: started with pid [29083]
[ INFO] [1549474683.357969059]: Initializing nodelet with 4 worker threads.
[ INFO] [1549474683.547027899]: RealSense ROS v2.1.4
[ INFO] [1549474683.547060098]: Running with LibRealSense v2.18.0
 06/02 09:38:03,783 WARNING [139935124055936] (types.cpp:57) hwmon command 0x4f failed. Error type: No data to return (-21).
[ INFO] [1549474683.819617467]: getParameters...
[ INFO] [1549474683.900642912]: setupDevice...
[ INFO] [1549474683.900680627]: JSON file is not provided
[ INFO] [1549474683.900709369]: ROS Node Namespace: camera
[ INFO] [1549474683.900734479]: Device Name: Intel RealSense D435I
[ INFO] [1549474683.900751289]: Device Serial No: 843112072158
[ INFO] [1549474683.900776746]: Device FW version: 05.10.13.00
255.255.255.255
[ INFO] [1549474683.900798748]: Device Product ID: 0x0B3A
[ INFO] [1549474683.900814328]: Enable PointCloud: Off
[ INFO] [1549474683.900830584]: Align Depth: Off
[ INFO] [1549474683.900842984]: Sync Mode: Off
[ INFO] [1549474683.901119237]: Device Sensors: 
[ INFO] [1549474683.901272403]: Stereo Module was found.
[ INFO] [1549474683.901313225]: RGB Camera was found.
[ INFO] [1549474683.901340435]: Motion Module was found.
[ INFO] [1549474683.901513567]: (Fisheye, 0) sensor isn't supported by current device! -- Skipping...
[ INFO] [1549474683.901743110]: setupPublishers...
[ INFO] [1549474683.905632604]: Expected frequency for depth = 30.00000
[ INFO] [1549474683.946876155]: Expected frequency for infra1 = 30.00000
[ INFO] [1549474683.982754209]: Expected frequency for infra2 = 30.00000
[ INFO] [1549474684.021152959]: Expected frequency for color = 30.00000
[ INFO] [1549474684.064852418]: Start publisher IMU
[ INFO] [1549474684.066568024]: setupStreams...
 06/02 09:38:04,068 WARNING [139935124055936] (sensor.cpp:338) Unregistered Media formats : [ UYVY ]; Supported: [ ]
[ INFO] [1549474684.075337406]: depth stream is enabled - width: 640, height: 480, fps: 30
[ INFO] [1549474684.075737449]: infra1 stream is enabled - width: 640, height: 480, fps: 30
[ INFO] [1549474684.076031362]: infra2 stream is enabled - width: 640, height: 480, fps: 30
[ INFO] [1549474684.084764022]: color stream is enabled - width: 640, height: 480, fps: 30
[ INFO] [1549474684.102436231]: starting imu...
[ INFO] [1549474684.104658283]: Gyro and accelometer are enabled and combined to IMU message at 500 fps by method:LINEAR_INTERPOLATION
 06/02 09:38:04,106 WARNING [139934219003648] (backend-hid.h:89) write_fs_arithmetic Could not change 0 to 1 : path /sys/devices/pci0000:00/0000:00:14.0/usb2/2-1/2-1:1.5/0003:8086:0B3A.0019/HID-SENSOR-200073.1.auto/iio:device0/buffer/enable
 06/02 09:38:04,106 WARNING [139934219003648] (backend-hid.cpp:692) HID set_power 1 failed for /sys/devices/pci0000:00/0000:00:14.0/usb2/2-1/2-1:1.5/0003:8086:0B3A.0019/HID-SENSOR-200073.1.auto/iio:device0/buffer/enable
[ INFO] [1549474684.108500929]: num_filters: 0
[ INFO] [1549474684.109049856]: RealSense Node Is Up!
[ INFO] [1549474684.109075780]: Setting Dynamic reconfig parameters.
[ INFO] [1549474684.174738680]: Done Setting Dynamic reconfig parameters.
 06/02 09:38:09,107 WARNING [139934210610944] (backend-hid.cpp:598) iio_hid_sensor: Frames didn't arrived within 5 seconds
 06/02 09:38:14,109 WARNING [139934210610944] (backend-hid.cpp:598) iio_hid_sensor: Frames didn't arrived within 5 seconds
 06/02 09:38:19,109 WARNING [139934210610944] (backend-hid.cpp:598) iio_hid_sensor: Frames didn't arrived within 5 seconds
doronhi commented 5 years ago

Please check out issue #2965 in the librealsense repository: https://github.com/IntelRealSense/librealsense/issues/2965. At the end there is a PR that tackles this issue. It will be merged into the next librealsense version (2.18.1).

SteveMacenski commented 5 years ago

@doronhi thanks for the follow-up. Any idea of the timeframe in which we should expect to see that released? I'm also curious why it seems to work for some users on the same platform and not for others.

doronhi commented 5 years ago

No problem. Librealsense v2.18.1 was released 2 hours ago.

SteveMacenski commented 5 years ago

Oh, awesome, let me test that and report back on the ticket for closure

SteveMacenski commented 5 years ago

I'm now getting data from it with the v2.18.1 prerelease tag! However, there are some issues on the ROS side:

---
header: 
  seq: 2684
  stamp: 
    secs: 1549571914
    nsecs: 436936140
  frame_id: "camera_accel_frame"
orientation: 
  x: 0.0
  y: 0.0
  z: 0.0
  w: 0.0
orientation_covariance: [-1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0]
angular_velocity: 
  x: 0.0
  y: 0.0069813169539
  z: -0.00872664619237
angular_velocity_covariance: [0.01, 0.0, 0.0, 0.0, 0.01, 0.0, 0.0, 0.0, 0.01]
linear_acceleration: 
  x: 7.78209114075
  y: -1.38978075981
  z: 5.76691913605
linear_acceleration_covariance: [0.009999999776482582, 0.0, 0.0, 0.0, 0.009999999776482582, 0.0, 0.0, 0.0, 0.009999999776482582]
---
header: 
  seq: 2685
  stamp: 
    secs: 1549571914
    nsecs: 441330194
  frame_id: "camera_gyro_frame"
orientation: 
  x: 0.0
  y: 0.0
  z: 0.0
  w: 0.0
orientation_covariance: [-1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0]
angular_velocity: 
  x: -0.00307004665956
  y: 0.0100513640791
  z: -0.00565659906715
angular_velocity_covariance: [0.01, 0.0, 0.0, 0.0, 0.01, 0.0, 0.0, 0.0, 0.01]
linear_acceleration: 
  x: 7.7635178566
  y: -1.31868362427
  z: 5.74834632874
linear_acceleration_covariance: [0.009999999776482582, 0.0, 0.0, 0.0, 0.009999999776482582, 0.0, 0.0, 0.0, 0.009999999776482582]

We have 2 different frames being published with (seemingly) identical values (offset by sampling time). Can we unify this into camera_imu_frame or something similar? Switching frames back and forth would be challenging to work with.

doronhi commented 5 years ago

Hi, the problem is that the sensor produces 2 topics, gyro and accel. They are totally independent and not synchronized. I'm not sure which topics you are listening to, but there are 3 options:

  1. Default: you get 2 topics, /camera/gyro/sample and /camera/accel/sample. They use the IMU message but fill only their part.
  2. You can set unite_imu_method:=copy. That will produce one topic named /camera/imu/sample where either the gyro or the accel field gets updated and the other is a copy of the latest reading.
  3. Setting unite_imu_method:=linear_interpolation will also create a unified IMU topic where, for each original reading, one section is the original and the other an interpolation.

I think that if you need a single topic and you're not going to implement your own logic for combining the 2 sensors, you should try option 3.
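As an illustration of the options above, here is a minimal sketch of what 'copy' mode does (a hypothetical helper, not the wrapper's actual code): each incoming reading is paired with the most recent value from the other sensor, so the united rate is the sum of both sensors' rates.

```python
# Illustrative sketch of 'copy' mode (hypothetical helper, not the actual
# realsense2_camera code): each incoming reading is paired with the most
# recent value from the other sensor.

def unite_copy(readings):
    """readings: (timestamp, sensor_name, value) tuples in any order.
    Returns (timestamp, gyro, accel) tuples once both sensors have
    reported at least once."""
    last = {"gyro": None, "accel": None}
    united = []
    for t, kind, value in sorted(readings):
        last[kind] = value  # remember the latest reading per sensor
        if last["gyro"] is not None and last["accel"] is not None:
            united.append((t, last["gyro"], last["accel"]))
    return united

readings = [
    (0.000, "gyro", 0.10),
    (0.002, "accel", 9.80),
    (0.0025, "gyro", 0.12),
    (0.005, "gyro", 0.14),
]
print(unite_copy(readings))
```

linear_interpolation mode differs in that the stale half is interpolated between neighboring readings instead of simply copied.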

SteveMacenski commented 5 years ago

I'm doing option 3, but when you interpolate each sensor against the other, you're reporting double-counted values from 1 measurement, which isn't good.

What I think should happen is that it picks 1 of the sensors (ideally the one synced with the IR images), interpolates the other to match, and reports them both out together. Either way, we should definitely not be reporting multiple virtual readings derived from 1 actual hardware read.

I.e., if the gyro is hardware-synced, interpolate the accelerometer values to fit and use those. Otherwise you're double-counting 1 read, and the imu topic's results are only synced with the IR images for the 50% of messages belonging to the gyro frame. Many applications are prebuilt expecting 1 given frame ID, with no way to filter out a bad frame. Think robot_localization: I'm trying to fuse sensors for state estimation, and I can't tell it to just ignore the 50% of messages that aren't synced correctly to the other sensors' inputs.

doronhi commented 5 years ago

Maybe I misunderstand you. Both the accel sensor and the gyro sensor are independent, and neither is synced with the IR sensor. Consider the fact that they all run at very different frequencies, so what makes one of them the base sensor for synchronization? The linear_interpolation option doesn't report the same reading multiple times, unless you're referring to the fact that a single reading is reported once as-is and once more as part of an interpolation with the following reading. Syncing the accel and gyro messages to the IR messages would mean reducing the frequency of the IMU sensors to that of the IR. Most applications, I believe, would not want that, and those that do can interpolate the original messages as they see fit. Some would even consider the interpolation done here a disruption of the original data, and for those we maintain the original topics without any attempt at unification.

As for the frame_id, there is a mix-up and probably a bug in the code. For now, if you use either method to unify the IMU messages, you should give both the "imu_gyro_frame_id" and "imu_accel_frame_id" parameters the same value, the same way the default values are both set to "camera_imu_frame".
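For example, assuming those parameter names are passed through the launch file unchanged (a sketch; check the arguments your wrapper version actually exposes), the workaround looks like:

```shell
roslaunch realsense2_camera rs_camera.launch \
    unite_imu_method:=linear_interpolation \
    imu_gyro_frame_id:=camera_imu_frame \
    imu_accel_frame_id:=camera_imu_frame
```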

SteveMacenski commented 5 years ago

The gyro and the accelerometer share the same clock as the IR hardware (for some reason I was under the impression only 1 of them was time-synchronized with the IR hardware; if that's wrong, we can ignore that part). When we sample those devices to do something like visual SLAM or odometry, I want to give that data to my algorithms or filters to update a model. When we interpolate between the gyro for the accelerometer data and the accelerometer for the gyro readings, we're essentially sending the same value twice to the filter: first when we sample it, and second when the other piece of hardware's reading interpolates against it.

This is probably not good for these types of algorithms: it's "double counting" single hardware reads. I would expect decreased performance because of this; those algorithms don't expect to get derived values of previous data in a new cycle (i.e., if there's an outlier, we're now updating the filter with it twice, which could lead to divergence from just 1 or 2 points of high variance from the actual state).

My suggestion, if the rates of measurement are both high, is to choose one sensor, publish messages at that rate, and interpolate the other. My preference would be for the gyro to be the "master" sensor and the accelerometer the "slave", since many of these algorithms count on the gyro being more accurate and reliable than the accelerometer anyhow. A typical covariance on an accelerometer in my experience is around 1e-2, while a gyro is around 1e-5.

doronhi commented 5 years ago

Hi @SteveMacenski, sorry for the delay. I think you can check whether your idea indeed improves the SLAM performance: in base_realsense_node.cpp, in the function imu_callback_sync(), 3 lines from the end of the function, replace:

    _synced_imu_publisher->Publish(imu_msg);

with:

    if (ACCEL == stream_index)
    {
        _synced_imu_publisher->Publish(imu_msg);
    }

Now the only messages published are the ones where ACCEL is interpolated and GYRO is the original. (The algorithm is as follows: every time a new gyro message arrives, it is interpolated with the previous one at the last accel message's time, then combined with that accel message and sent.)

There is a problem, though: gyro arrives at 400 Hz and accel only at 250 Hz. What should the algorithm do with 2 consecutive gyro messages? For now it discards the earlier one, so you will get the messages at 250 Hz. Is there an improvement?
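The scheme in the parenthetical above can be sketched as follows (assumed behaviour in plain Python, not the wrapper's shipped C++): gyro samples are linearly interpolated to each accel timestamp, so the united stream comes out at the accel rate.

```python
# Sketch (assumption, not the shipped code): each accel sample becomes one
# united message, with the gyro value linearly interpolated between the two
# gyro samples bracketing it.

def interpolate_gyro_at_accel(gyro, accel):
    """gyro, accel: time-sorted lists of (timestamp, value).
    Returns (t_accel, gyro_interpolated, accel_value) tuples; accel samples
    not bracketed by two gyro samples are dropped, and extra gyro messages
    between two accel messages are effectively discarded."""
    out = []
    gi = 0
    for ta, av in accel:
        # advance gi until gyro[gi], gyro[gi + 1] bracket ta
        while gi + 1 < len(gyro) and gyro[gi + 1][0] < ta:
            gi += 1
        if gi + 1 >= len(gyro) or gyro[gi][0] > ta:
            continue  # ta falls outside the gyro record
        (t0, g0), (t1, g1) = gyro[gi], gyro[gi + 1]
        alpha = (ta - t0) / (t1 - t0)
        out.append((ta, g0 + alpha * (g1 - g0), av))
    return out

gyro = [(0.0, 0.0), (0.0025, 0.1), (0.005, 0.2)]   # ~400 Hz (toy data)
accel = [(0.002, 9.8), (0.006, 9.81)]              # ~250 Hz (toy data)
united = interpolate_gyro_at_accel(gyro, accel)
print(united)
```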

SteveMacenski commented 5 years ago

To your last question: it should go into a Kalman filter, most likely. I'm fairly certain that it's not good that it's returning all the individual readings 2x on the same topic; that's generally bad practice.

Shashika007 commented 5 years ago

@SteveMacenski @doronhi I have run into a problem where the IMU accel and gyro do not publish with the same time delay. As the following figures show, the time delays between consecutive readings differ from the previous readings. Do you have any idea how to fix this? The time gap should be the same between each reading, shouldn't it? I already set "enable_sync" and "initial_reset" to true.

Gyro and accel (unite_imu_method: default): [screenshot from 2019-02-21 09-35-28] [screenshot from 2019-02-21 09-36-08]

IMU readings (unite_imu_method: linear_interpolation) and after the Madgwick filter: [screenshot from 2019-02-20 18-01-36] [screenshot from 2019-02-20 18-04-08]

RealSenseCustomerSupport commented 5 years ago

Hi @Shashika007,

Check below with more info about the launch parameters. https://github.com/IntelRealSense/realsense-ros

enable_sync: gathers the closest frames of different sensors (infrared, color, and depth) to be sent with the same timetag. This happens automatically when filters such as pointcloud are enabled.

initial_reset: on occasion the device is not closed properly and, due to firmware issues, needs a reset. If set to true, the device will reset prior to usage.
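For example, both flags can be combined with the IMU-uniting option on the command line (a sketch based on the launch arguments shown earlier in this thread):

```shell
roslaunch realsense2_camera rs_camera.launch \
    enable_sync:=true \
    initial_reset:=true \
    unite_imu_method:=linear_interpolation
```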

doronhi commented 5 years ago

Could also be related to #631

RealSenseCustomerSupport commented 5 years ago

Hi @SteveMacenski,

Do you still need support on this topic? If not, we will close it.

arindamsaha81 commented 4 years ago

> Hi, the problem is that the sensor produces 2 topics, gyro and accel. They are totally independent and not synchronized. I'm not sure which topics you are listening to, but there are 3 options:
>
> 1. Default: you get 2 topics, /camera/gyro/sample and /camera/accel/sample. They use the IMU message but fill only their part.
>
> 2. You can set unite_imu_method:=copy. That will produce one topic named /camera/imu/sample where either the gyro or the accel field gets updated and the other is a copy.
>
> 3. Setting unite_imu_method:=linear_interpolation will also create a unified IMU topic where, for each original reading, one section is the original and the other an interpolation.
>
> I think that if you need a single topic and you're not going to implement your own logic for combining the 2 sensors, you should try option 3.

Hi @doronhi

I have a recorded rosbag where the /camera/accel/sample and /camera/gyro/sample topics are recorded. I have a SLAM algorithm that requires the combined message on a single topic like /camera/imu. Therefore, is there any C/C++ code or script that combines these two streams offline, similar to linear_interpolation?
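For an offline merge, something like the following sketch could work (a hypothetical helper, not shipped with the wrapper, and only loosely modeled on the wrapper's linear_interpolation mode). The merge logic is plain Python on (timestamp, value) pairs; extracting those pairs from a recorded bag would use the rosbag Python API.

```python
# Hypothetical offline combiner, not part of realsense2_camera. To feed it
# from a recorded bag you would extract (timestamp, value) pairs with the
# rosbag API, e.g.:
#   import rosbag
#   for topic, msg, t in rosbag.Bag('imu.bag').read_messages(
#           topics=['/camera/accel/sample', '/camera/gyro/sample']):
#       ...

def interp(samples, t):
    """Linearly interpolate time-sorted (time, value) samples at time t.
    Returns None when t falls outside the recorded range."""
    for (t0, v0), (t1, v1) in zip(samples, samples[1:]):
        if t0 <= t <= t1:
            return v0 + (v1 - v0) * (t - t0) / (t1 - t0)
    return None

def merge_linear(accel, gyro):
    """Emit one (t, gyro, accel) sample per original reading, with the
    other stream interpolated at that reading's time."""
    out = []
    for t, a in accel:
        g = interp(gyro, t)
        if g is not None:
            out.append((t, g, a))
    for t, g in gyro:
        a = interp(accel, t)
        if a is not None:
            out.append((t, g, a))
    return sorted(out)

accel = [(0.0, 9.8), (0.004, 9.9)]    # toy accel stream
gyro = [(0.001, 0.1), (0.003, 0.3)]   # toy gyro stream
combined = merge_linear(accel, gyro)
print(combined)
```

The combined samples could then be republished as sensor_msgs/Imu messages, or written back into a new bag, for the SLAM algorithm to consume.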

doronhi commented 4 years ago

As for C++ code, you are welcome to use the code in the realsense2_camera node and build your own converter. I am not aware of any ROS node that does that; otherwise, I think, I would never have introduced that capability into realsense2_camera in the first place. Unfortunately, realsense2_camera can load and play bag files recorded by realsense-viewer, not those recorded by rosbag. They are slightly different, so I think that is not an option for you either.

arindamsaha81 commented 4 years ago

> As for C++ code, you are welcome to use the code in the realsense2_camera node and build your own converter. I am not aware of any ROS node that does that; otherwise, I think, I would never have introduced that capability into realsense2_camera in the first place. Unfortunately, realsense2_camera can load and play bag files recorded by realsense-viewer, not those recorded by rosbag. They are slightly different, so I think that is not an option for you either.

@doronhi Thanks a lot. Can you please point out the portion of the code (file names and line numbers) in the realsense2_camera node that actually does the interpolation?

doronhi commented 4 years ago

The function "FillImuData_LinearInterpolation" in base_realsense_node.cpp, line 1188