introlab / rtabmap_ros

RTAB-Map's ROS package.
http://wiki.ros.org/rtabmap_ros
BSD 3-Clause "New" or "Revised" License

Xtion PRO live : demo_two_kinects.launch #187

Closed: hildaaw closed this issue 7 years ago

hildaaw commented 7 years ago

Hi, recently I have tried to run demo_two_kinects.launch. Since I want this file to work with 2 Xtion PRO Live cameras instead of 2 Kinect cameras, I tried to modify the file from: [...] to: [...]. After that I tried to run this launch file and I received these warnings:

[ WARN] [1500530758.583836090]: /rtabmap/rgbd_odometry: Did not receive data since 5 seconds! Make sure the input topics are published ("$ rostopic hz my_topic") and the timestamps in their header are set.
/rtabmap/rgbd_odometry subscribed to (approx sync):
   /camera1/rgbd_image,
   /camera2/rgbd_image
[ WARN] [1500530758.875415201]: /rtabmap/rtabmapviz: Did not receive data since 5 seconds! Make sure the input topics are published ("$ rostopic hz my_topic") and the timestamps in their header are set. If topics are coming from different computers, make sure the clocks of the computers are synchronized ("ntpdate"). If topics are not published at the same rate, you could increase "queue_size" parameter (current=10).
/rtabmap/rtabmapviz subscribed to (approx sync):
   /rtabmap/odom,
   /camera1/rgbd_image,
   /camera2/rgbd_image,
   /rtabmap/odom_info

Both cameras are connected successfully, as the IR sensor lights are on. May I know what I need to add or remove so that it can calibrate? Thank you in advance!
matlabbe commented 7 years ago

Can you verify if the topics mentioned are published?

$ rostopic hz /camera1/rgbd_image
$ rostopic hz /camera2/rgbd_image
hildaaw commented 7 years ago

I have tried to verify the topics mentioned and I received this:

WARNING: topic [/camera1/rgbd_image] does not appear to be published yet
WARNING: topic [/camera2/rgbd_image] does not appear to be published yet

I tried running demo_two_kinects.launch before verifying and I received this:

ERROR: Cannot load message class for [rtabmap_ros/RGBDImage]. Are your messages built?

The message file does exist and CMakeLists.txt does list the message file. I cleaned my project by running catkin_make clean, then rebuilt it with catkin_make -j1. I tried to verify the topics once again and received this:

subscribed to [/camera1/rgbd_image]
subscribed to [/camera2/rgbd_image]

Severn-Vergil commented 7 years ago

I guess I have a similar problem (#190). The topics /camera1/rgbd_image and /camera2/rgbd_image are not published.

matlabbe commented 7 years ago

So the rgbd_image topics are advertised but no messages are published. We should check whether the rgbd_sync inputs are published:

$ rostopic hz /camera1/rgb/image_rect_color
$ rostopic hz /camera1/depth_registered/image_raw
$ rostopic hz /camera1/rgb/camera_info

$ rostopic hz /camera2/rgb/image_rect_color
$ rostopic hz /camera2/depth_registered/image_raw
$ rostopic hz /camera2/rgb/camera_info

For convenience, I updated the rgbd_sync code to output warnings when its inputs are not received.

cheers, Mathieu

hildaaw commented 7 years ago

I have updated rtabmap_ros and looked into the rgbd_sync inputs; these are the results:

WARNING: topic [/camera1/rgb/image_rect_color] does not appear to be published yet
WARNING: topic [/camera1/depth_registered/image_raw] does not appear to be published yet
WARNING: topic [/camera1/rgb/camera_info] does not appear to be published yet

WARNING: topic [/camera2/rgb/image_rect_color] does not appear to be published yet
WARNING: topic [/camera2/depth_registered/image_raw] does not appear to be published yet
WARNING: topic [/camera2/rgb/camera_info] does not appear to be published yet

matlabbe commented 7 years ago

That means the camera drivers are not working. You can also use rqt_image_view to actually see the raw images that are supposed to be published.
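
For example:

$ rosrun rqt_image_view rqt_image_view

and then select /camera1/rgb/image_rect_color (and the camera2 equivalent) from the topic drop-down to check that images are actually streaming.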

hildaaw commented 7 years ago

My apologies, it seems I made a mistake: I forgot to run the launch file, which is why the camera drivers weren't running. After launching demo_two_kinects.launch I got these:

subscribed to [/camera1/rgb/image_rect_color]
subscribed to [/camera1/depth_registered/image_raw]
subscribed to [/camera1/rgb/camera_info]

subscribed to [/camera2/rgb/image_rect_color]
subscribed to [/camera2/depth_registered/image_raw]
subscribed to [/camera2/rgb/camera_info]

With that, when I run demo_two_kinects.launch I receive this error:

[ INFO] [1501557151.446369561]: Odom: quality=0, std dev=0.000000m|0.000000rad, update time=0.042123s
[ERROR] (2017-08-01 11:12:31.493) RegistrationVis.cpp:1156::computeTransformationImpl() Calibrated camera required (multi-cameras not supported). Id=5 Models=2 StereoModel=0 weight=0
[ WARN] (2017-08-01 11:12:31.493) OdometryF2M.cpp:384::computeTransform() Unknown registration error
[ INFO] [1501557151.494065962]: Odom: quality=0, std dev=0.000000m|0.000000rad, update time=0.041377s
[ WARN] [1501557151.536255466]: Could not get transform from odom to base_link after 0,200000 seconds (for stamp=1501557151,195864)! Error="Lookup would require extrapolation at time 1501557151.195864431, but only time 1501557151.129126541 is in the buffer, when looking up transform from frame [base_link] to frame [odom]. canTransform returned after 0.202024 timeout was 0.2.".
[ERROR] (2017-08-01 11:12:31.541) RegistrationVis.cpp:1156::computeTransformationImpl() Calibrated camera required (multi-cameras not supported). Id=6 Models=2 StereoModel=0 weight=0

Attachment: screenshot from 2017-08-01 10 33 07

It feels like the 3D mapping part isn't functioning. Is it because I need to adjust the /tf, or because my cameras are not calibrated? What is your opinion? Again, thanks for the help.

hildaaw commented 7 years ago

Update: I have tried the same setup with Xbox 360 Kinects and received the same error.

 logging to /home/ia/.ros/log/5408978e-8230-11e7-b28b-d43d7e05ed8d/roslaunch-ia-MS-7759-23600.log
Checking log directory for disk usage. This may take awhile.
Press Ctrl-C to interrupt
Done checking log file disk usage. Usage is <1GB.

started roslaunch server http://ia-MS-7759:43401/

SUMMARY
========

PARAMETERS
 * /camera1/camera1_nodelet_manager/num_worker_threads: 4
 * /camera1/depth_rectify_depth/interpolation: 0
 * /camera1/depth_registered_rectify_depth/interpolation: 0
 * /camera1/disparity_depth/max_range: 4.0
 * /camera1/disparity_depth/min_range: 0.5
 * /camera1/disparity_registered_hw/max_range: 4.0
 * /camera1/disparity_registered_hw/min_range: 0.5
 * /camera1/disparity_registered_sw/max_range: 4.0
 * /camera1/disparity_registered_sw/min_range: 0.5
 * /camera1/driver/data_skip: 0
 * /camera1/driver/debug: False
 * /camera1/driver/depth_camera_info_url: 
 * /camera1/driver/depth_frame_id: camera1_depth_opt...
 * /camera1/driver/depth_registration: True
 * /camera1/driver/device_id: #1
 * /camera1/driver/diagnostics_max_frequency: 30.0
 * /camera1/driver/diagnostics_min_frequency: 30.0
 * /camera1/driver/diagnostics_tolerance: 0.05
 * /camera1/driver/diagnostics_window_time: 5.0
 * /camera1/driver/enable_depth_diagnostics: False
 * /camera1/driver/enable_ir_diagnostics: False
 * /camera1/driver/enable_rgb_diagnostics: False
 * /camera1/driver/rgb_camera_info_url: 
 * /camera1/driver/rgb_frame_id: camera1_rgb_optic...
 * /camera2/camera2_nodelet_manager/num_worker_threads: 4
 * /camera2/depth_rectify_depth/interpolation: 0
 * /camera2/depth_registered_rectify_depth/interpolation: 0
 * /camera2/disparity_depth/max_range: 4.0
 * /camera2/disparity_depth/min_range: 0.5
 * /camera2/disparity_registered_hw/max_range: 4.0
 * /camera2/disparity_registered_hw/min_range: 0.5
 * /camera2/disparity_registered_sw/max_range: 4.0
 * /camera2/disparity_registered_sw/min_range: 0.5
 * /camera2/driver/data_skip: 0
 * /camera2/driver/debug: False
 * /camera2/driver/depth_camera_info_url: 
 * /camera2/driver/depth_frame_id: camera2_depth_opt...
 * /camera2/driver/depth_registration: True
 * /camera2/driver/device_id: #2
 * /camera2/driver/diagnostics_max_frequency: 30.0
 * /camera2/driver/diagnostics_min_frequency: 30.0
 * /camera2/driver/diagnostics_tolerance: 0.05
 * /camera2/driver/diagnostics_window_time: 5.0
 * /camera2/driver/enable_depth_diagnostics: False
 * /camera2/driver/enable_ir_diagnostics: False
 * /camera2/driver/enable_rgb_diagnostics: False
 * /camera2/driver/rgb_camera_info_url: 
 * /camera2/driver/rgb_frame_id: camera2_rgb_optic...
 * /rosdistro: indigo
 * /rosversion: 1.11.21
 * /rtabmap/rgbd_odometry/Odom/FillInfoData: true
 * /rtabmap/rgbd_odometry/Odom/Strategy: 0
 * /rtabmap/rgbd_odometry/OdomF2M/MaxSize: 1000
 * /rtabmap/rgbd_odometry/Vis/CorNNType: 3
 * /rtabmap/rgbd_odometry/Vis/FeatureType: 6
 * /rtabmap/rgbd_odometry/Vis/InlierDistance: 0.02
 * /rtabmap/rgbd_odometry/Vis/MaxDepth: 4.0
 * /rtabmap/rgbd_odometry/Vis/MinInliers: 20
 * /rtabmap/rgbd_odometry/frame_id: base_link
 * /rtabmap/rgbd_odometry/rgbd_cameras: 2
 * /rtabmap/rgbd_odometry/subscribe_rgbd: True
 * /rtabmap/rgbd_odometry/wait_for_transform: True
 * /rtabmap/rtabmap/Grid/FromDepth: false
 * /rtabmap/rtabmap/Vis/InlierDistance: 0.02
 * /rtabmap/rtabmap/Vis/MinInliers: 10
 * /rtabmap/rtabmap/frame_id: base_link
 * /rtabmap/rtabmap/gen_scan: True
 * /rtabmap/rtabmap/map_negative_poses_ignored: False
 * /rtabmap/rtabmap/map_negative_scan_empty_ray_tracing: False
 * /rtabmap/rtabmap/rgbd_cameras: 2
 * /rtabmap/rtabmap/subscribe_depth: False
 * /rtabmap/rtabmap/subscribe_rgbd: True
 * /rtabmap/rtabmap/wait_for_transform: True
 * /rtabmap/rtabmapviz/frame_id: base_link
 * /rtabmap/rtabmapviz/rgbd_cameras: 2
 * /rtabmap/rtabmapviz/subscribe_depth: False
 * /rtabmap/rtabmapviz/subscribe_odom_info: True
 * /rtabmap/rtabmapviz/subscribe_rgbd: True
 * /rtabmap/rtabmapviz/wait_for_transform: True

NODES
  /camera2/
    camera2_nodelet_manager (nodelet/nodelet)
    depth_metric (nodelet/nodelet)
    depth_metric_rect (nodelet/nodelet)
    depth_points (nodelet/nodelet)
    depth_rectify_depth (nodelet/nodelet)
    depth_registered_hw_metric_rect (nodelet/nodelet)
    depth_registered_metric (nodelet/nodelet)
    depth_registered_rectify_depth (nodelet/nodelet)
    depth_registered_sw_metric_rect (nodelet/nodelet)
    disparity_depth (nodelet/nodelet)
    disparity_registered_hw (nodelet/nodelet)
    disparity_registered_sw (nodelet/nodelet)
    driver (nodelet/nodelet)
    ir_rectify_ir (nodelet/nodelet)
    points_xyzrgb_hw_registered (nodelet/nodelet)
    points_xyzrgb_sw_registered (nodelet/nodelet)
    register_depth_rgb (nodelet/nodelet)
    rgb_debayer (nodelet/nodelet)
    rgb_rectify_color (nodelet/nodelet)
    rgb_rectify_mono (nodelet/nodelet)
    rgbd_sync (nodelet/nodelet)
  /camera1/
    camera1_nodelet_manager (nodelet/nodelet)
    depth_metric (nodelet/nodelet)
    depth_metric_rect (nodelet/nodelet)
    depth_points (nodelet/nodelet)
    depth_rectify_depth (nodelet/nodelet)
    depth_registered_hw_metric_rect (nodelet/nodelet)
    depth_registered_metric (nodelet/nodelet)
    depth_registered_rectify_depth (nodelet/nodelet)
    depth_registered_sw_metric_rect (nodelet/nodelet)
    disparity_depth (nodelet/nodelet)
    disparity_registered_hw (nodelet/nodelet)
    disparity_registered_sw (nodelet/nodelet)
    driver (nodelet/nodelet)
    ir_rectify_ir (nodelet/nodelet)
    points_xyzrgb_hw_registered (nodelet/nodelet)
    points_xyzrgb_sw_registered (nodelet/nodelet)
    register_depth_rgb (nodelet/nodelet)
    rgb_debayer (nodelet/nodelet)
    rgb_rectify_color (nodelet/nodelet)
    rgb_rectify_mono (nodelet/nodelet)
    rgbd_sync (nodelet/nodelet)
  /rtabmap/
    rgbd_odometry (rtabmap_ros/rgbd_odometry)
    rtabmap (rtabmap_ros/rtabmap)
    rtabmapviz (rtabmap_ros/rtabmapviz)
  /
    base_to_camera1_tf (tf/static_transform_publisher)
    base_to_camera2_tf (tf/static_transform_publisher)
    camera1_base_link (tf/static_transform_publisher)
    camera1_base_link1 (tf/static_transform_publisher)
    camera1_base_link2 (tf/static_transform_publisher)
    camera1_base_link3 (tf/static_transform_publisher)
    camera2_base_link (tf/static_transform_publisher)
    camera2_base_link1 (tf/static_transform_publisher)
    camera2_base_link2 (tf/static_transform_publisher)
    camera2_base_link3 (tf/static_transform_publisher)

auto-starting new master
process[master]: started with pid [23612]
ROS_MASTER_URI=http://localhost:11311

setting /run_id to 5408978e-8230-11e7-b28b-d43d7e05ed8d
process[rosout-1]: started with pid [23625]
started core service [/rosout]
process[camera1/camera1_nodelet_manager-2]: started with pid [23642]
process[camera1/driver-3]: started with pid [23643]
process[camera1/rgb_debayer-4]: started with pid [23644]
process[camera1/rgb_rectify_mono-5]: started with pid [23646]
process[camera1/rgb_rectify_color-6]: started with pid [23652]
process[camera1/ir_rectify_ir-7]: started with pid [23664]
process[camera1/depth_rectify_depth-8]: started with pid [23679]
process[camera1/depth_metric_rect-9]: started with pid [23692]
process[camera1/depth_metric-10]: started with pid [23701]
process[camera1/depth_points-11]: started with pid [23706]
process[camera1/register_depth_rgb-12]: started with pid [23718]
process[camera1/points_xyzrgb_sw_registered-13]: started with pid [23728]
process[camera1/depth_registered_sw_metric_rect-14]: started with pid [23740]
process[camera1/depth_registered_rectify_depth-15]: started with pid [23748]
process[camera1/points_xyzrgb_hw_registered-16]: started with pid [23755]
process[camera1/depth_registered_hw_metric_rect-17]: started with pid [23769]
process[camera1/depth_registered_metric-18]: started with pid [23781]
process[camera1/disparity_depth-19]: started with pid [23793]
[ INFO] [1502852974.560732434]: Initializing nodelet with 4 worker threads.
process[camera1/disparity_registered_sw-20]: started with pid [23805]
process[camera1/disparity_registered_hw-21]: started with pid [23814]
process[camera1_base_link-22]: started with pid [23831]
process[camera1_base_link1-23]: started with pid [23833]
process[camera1_base_link2-24]: started with pid [23839]
process[camera1_base_link3-25]: started with pid [23848]
process[camera2/camera2_nodelet_manager-26]: started with pid [23855]
process[camera2/driver-27]: started with pid [23868]
process[camera2/rgb_debayer-28]: started with pid [23875]
process[camera2/rgb_rectify_mono-29]: started with pid [23883]
process[camera2/rgb_rectify_color-30]: started with pid [23889]
process[camera2/ir_rectify_ir-31]: started with pid [23901]
process[camera2/depth_rectify_depth-32]: started with pid [23913]
process[camera2/depth_metric_rect-33]: started with pid [23929]
process[camera2/depth_metric-34]: started with pid [23934]
process[camera2/depth_points-35]: started with pid [23948]
process[camera2/register_depth_rgb-36]: started with pid [23958]
process[camera2/points_xyzrgb_sw_registered-37]: started with pid [23962]
process[camera2/depth_registered_sw_metric_rect-38]: started with pid [23971]
process[camera2/depth_registered_rectify_depth-39]: started with pid [23979]
process[camera2/points_xyzrgb_hw_registered-40]: started with pid [23983]
process[camera2/depth_registered_hw_metric_rect-41]: started with pid [23990]
process[camera2/depth_registered_metric-42]: started with pid [24008]
process[camera2/disparity_depth-43]: started with pid [24020]
process[camera2/disparity_registered_sw-44]: started with pid [24048]
process[camera2/disparity_registered_hw-45]: started with pid [24053]
process[camera2_base_link-46]: started with pid [24061]
process[camera2_base_link1-47]: started with pid [24068]
process[camera2_base_link2-48]: started with pid [24081]
process[camera2_base_link3-49]: started with pid [24092]
process[base_to_camera1_tf-50]: started with pid [24115]
process[base_to_camera2_tf-51]: started with pid [24123]
process[camera1/rgbd_sync-52]: started with pid [24135]
process[camera2/rgbd_sync-53]: started with pid [24141]
process[rtabmap/rgbd_odometry-54]: started with pid [24154]
process[rtabmap/rtabmap-55]: started with pid [24173]
process[rtabmap/rtabmapviz-56]: started with pid [24181]
[ INFO] [1502852974.832806035]: Initializing nodelet with 4 worker threads.
[ INFO] [1502852974.887992409]: Starting node...
[ INFO] [1502852974.945464939]: Initializing nodelet with 8 worker threads.
[ INFO] [1502852974.948721608]: Initializing nodelet with 8 worker threads.
[ INFO] [1502852975.045810469]: Starting node...
[ INFO] [1502852975.139213233]: rtabmapviz: Using configuration from "/home/ia/catkin_ws/src/rabmap_ros/launch/config/rgbd_gui.ini"
[ INFO] [1502852975.251350156]: Number devices connected: 2
[ INFO] [1502852975.251435816]: 1. device on bus 000:00 is a Xbox NUI Camera (2ae) from Microsoft (45e) with serial id 'A00363A00723102A'
[ INFO] [1502852975.251648453]: 2. device on bus 000:00 is a Xbox NUI Camera (2ae) from Microsoft (45e) with serial id 'A00362914988102A'
[ INFO] [1502852975.252577735]: Searching for device with index = 1
[ INFO] [1502852975.338404115]: /rtabmap/rtabmap(maps): map_filter_radius          = 0.000000
[ INFO] [1502852975.338454232]: /rtabmap/rtabmap(maps): map_filter_angle           = 30.000000
[ INFO] [1502852975.338476160]: /rtabmap/rtabmap(maps): map_cleanup                = true
[ INFO] [1502852975.338491854]: /rtabmap/rtabmap(maps): map_negative_poses_ignored = false
[ INFO] [1502852975.338503136]: /rtabmap/rtabmap(maps): map_negative_scan_ray_tracing = false
[ INFO] [1502852975.338517099]: /rtabmap/rtabmap(maps): cloud_output_voxelized     = true
[ INFO] [1502852975.338530979]: /rtabmap/rtabmap(maps): cloud_subtract_filtering   = false
[ INFO] [1502852975.338544226]: /rtabmap/rtabmap(maps): cloud_subtract_filtering_min_neighbors = 2
[ INFO] [1502852975.348020154]: /rtabmap/rtabmap(maps): octomap_tree_depth         = 16
[ INFO] [1502852975.348061340]: /rtabmap/rtabmap(maps): octomap_occupancy_thr      = 0.500000
[ INFO] [1502852975.405544099]: rtabmap: frame_id      = base_link
[ INFO] [1502852975.405590418]: rtabmap: map_frame_id  = map
[ INFO] [1502852975.405616548]: rtabmap: tf_delay      = 0.050000
[ INFO] [1502852975.405639613]: rtabmap: tf_tolerance  = 0.100000
[ INFO] [1502852975.405659317]: rtabmap: odom_sensor_sync   = false
[ INFO] [1502852975.481078603]: Starting a 3s RGB and Depth stream flush.
[ INFO] [1502852975.481246472]: Opened 'Xbox NUI Camera' on bus 0:0 with serial number 'A00363A00723102A'
[ INFO] [1502852975.502686084]: Setting odometry parameter "Odom/FillInfoData"="true"
[ INFO] [1502852975.572366830]: Setting odometry parameter "Odom/Strategy"="0"
[ INFO] [1502852975.585211048]: Setting odometry parameter "OdomF2M/MaxSize"="1000"
[ INFO] [1502852975.600549186]: Setting RTAB-Map parameter "Grid/FromDepth"="false"
[ WARN] [1502852975.788184036]: Could not find any compatible depth output mode for 1. Falling back to default depth output mode 1.
[ INFO] [1502852975.823928805]: rgb_frame_id = 'camera1_rgb_optical_frame' 
[ INFO] [1502852975.823991834]: depth_frame_id = 'camera1_depth_optical_frame' 
[ WARN] [1502852975.883073787]: Camera calibration file /home/ia/.ros/camera_info/rgb_A00363A00723102A.yaml not found.
[ WARN] [1502852975.883174620]: Using default parameters for RGB camera calibration.
[ WARN] [1502852975.883211330]: Camera calibration file /home/ia/.ros/camera_info/depth_A00363A00723102A.yaml not found.
[ WARN] [1502852975.883235589]: Using default parameters for IR camera calibration.
[ INFO] [1502852975.968404221]: Reading parameters from the ROS server...
[ INFO] [1502852976.023629116]: Setting odometry parameter "Vis/CorNNType"="3"
[ INFO] [1502852976.047906869]: Setting odometry parameter "Vis/FeatureType"="6"
[ INFO] [1502852976.058288846]: Setting odometry parameter "Vis/InlierDistance"="0.02"
[ INFO] [1502852976.069873930]: Setting odometry parameter "Vis/MaxDepth"="4.0"
[ INFO] [1502852976.089974698]: Setting odometry parameter "Vis/MinInliers"="20"
[ INFO] [1502852976.408395969]: 
/rtabmap/rgbd_odometry subscribed to (approx sync):
   /camera1/rgbd_image,
   /camera2/rgbd_image
[ INFO] [1502852976.538179325]: Parameters read = 0
[ INFO] [1502852976.717871083]: /rtabmap/rtabmapviz: queue_size    = 10
[ INFO] [1502852976.717922789]: /rtabmap/rtabmapviz: rgbd_cameras = 2
[ INFO] [1502852976.717940518]: /rtabmap/rtabmapviz: approx_sync   = true
[ INFO] [1502852976.717976770]: Setup rgbd2 callback
[ INFO] [1502852976.746272128]: 
/rtabmap/rtabmapviz subscribed to (approx sync):
   /rtabmap/odom,
   /camera1/rgbd_image,
   /camera2/rgbd_image,
   /rtabmap/odom_info
[ INFO] [1502852976.746359038]: rtabmapviz started.
[ INFO] [1502852976.808622380]: Number devices connected: 2
[ INFO] [1502852976.808697767]: 1. device on bus 000:00 is a Xbox NUI Camera (2ae) from Microsoft (45e) with serial id 'A00363A00723102A'
[ INFO] [1502852976.808721951]: 2. device on bus 000:00 is a Xbox NUI Camera (2ae) from Microsoft (45e) with serial id 'A00362914988102A'
[ INFO] [1502852976.809894116]: Searching for device with index = 2
[ INFO] [1502852976.921135612]: Setting RTAB-Map parameter "Vis/InlierDistance"="0.02"
[ INFO] [1502852976.929018883]: Setting RTAB-Map parameter "Vis/MinInliers"="10"
[ INFO] [1502852976.967524330]: Starting a 3s RGB and Depth stream flush.
[ INFO] [1502852976.967721410]: Opened 'Xbox NUI Camera' on bus 0:0 with serial number 'A00362914988102A'
[ WARN] [1502852977.243483411]: Could not find any compatible depth output mode for 1. Falling back to default depth output mode 1.
[ INFO] [1502852977.262593867]: rgb_frame_id = 'camera2_rgb_optical_frame' 
[ INFO] [1502852977.262662418]: depth_frame_id = 'camera2_depth_optical_frame' 
[ WARN] [1502852977.285114278]: Camera calibration file /home/ia/.ros/camera_info/rgb_A00362914988102A.yaml not found.
[ WARN] [1502852977.285213497]: Using default parameters for RGB camera calibration.
[ WARN] [1502852977.285259656]: Camera calibration file /home/ia/.ros/camera_info/depth_A00362914988102A.yaml not found.
[ WARN] [1502852977.285294267]: Using default parameters for IR camera calibration.
[ INFO] [1502852977.320280750]: RTAB-Map detection rate = 1.000000 Hz
[ INFO] [1502852977.320416987]: rtabmap: Deleted database "/home/ia/.ros/rtabmap.db" (--delete_db_on_start or -d are set).
[ INFO] [1502852977.320439467]: rtabmap: Using database from "/home/ia/.ros/rtabmap.db".
[ INFO] [1502852977.623422993]: Approximate time sync = true
[ INFO] [1502852977.665236647]: Approximate time sync = true
[ INFO] [1502852977.701672812]: 
/camera1/rgbd_sync subscribed to (approx sync):
   /camera1/rgb/image_rect_color,
   /camera1/depth_registered/image_raw,
   /camera1/rgb/camera_info
[ INFO] [1502852977.754530478]: 
/camera2/rgbd_sync subscribed to (approx sync):
   /camera2/rgb/image_rect_color,
   /camera2/depth_registered/image_raw,
   /camera2/rgb/camera_info
[ INFO] [1502852978.314705141]: Odom: quality=0, std dev=99.995000m|99.995000rad, update time=0.042039s
[ERROR] (2017-08-16 11:09:38.363) RegistrationVis.cpp:1156::computeTransformationImpl() Calibrated camera required (multi-cameras not supported). Id=2 Models=2 StereoModel=0 weight=0
[ WARN] (2017-08-16 11:09:38.363) OdometryF2M.cpp:384::computeTransform() Unknown registration error
[ INFO] [1502852978.363649240]: Odom: quality=0, std dev=0.000000m|0.000000rad, update time=0.031676s
[ERROR] (2017-08-16 11:09:38.394) RegistrationVis.cpp:1156::computeTransformationImpl() Calibrated camera required (multi-cameras not supported). Id=3 Models=2 StereoModel=0 weight=0
[ WARN] (2017-08-16 11:09:38.394) OdometryF2M.cpp:384::computeTransform() Unknown registration error
[ INFO] [1502852978.395295769]: Odom: quality=0, std dev=0.000000m|0.000000rad, update time=0.027820s
[ERROR] (2017-08-16 11:09:38.424) RegistrationVis.cpp:1156::computeTransformationImpl() Calibrated camera required (multi-cameras not supported). Id=4 Models=2 StereoModel=0 weight=0
[ WARN] (2017-08-16 11:09:38.424) OdometryF2M.cpp:384::computeTransform() Unknown registration error
[ INFO] [1502852978.424913256]: Odom: quality=0, std dev=0.000000m|0.000000rad, update time=0.025984s
[ERROR] (2017-08-16 11:09:38.452) RegistrationVis.cpp:1156::computeTransformationImpl() Calibrated camera required (multi-cameras not supported). Id=5 Models=2 StereoModel=0 weight=0
[ WARN] (2017-08-16 11:09:38.452) OdometryF2M.cpp:384::computeTransform() Unknown registration error
[ INFO] [1502852978.453049646]: Odom: quality=0, std dev=0.000000m|0.000000rad, update time=0.024316s
[ INFO] [1502852978.482656276]: Stopping device RGB and Depth stream flush.
[ERROR] (2017-08-16 11:09:38.483) RegistrationVis.cpp:1156::computeTransformationImpl() Calibrated camera required (multi-cameras not supported). Id=6 Models=2 StereoModel=0 weight=0
[ WARN] (2017-08-16 11:09:38.483) OdometryF2M.cpp:384::computeTransform() Unknown registration error
[ INFO] [1502852978.483756639]: Odom: quality=0, std dev=0.000000m|0.000000rad, update time=0.024478s
[ERROR] (2017-08-16 11:09:38.520) RegistrationVis.cpp:1156::computeTransformationImpl() Calibrated camera required (multi-cameras not supported). Id=7 Models=2 StereoModel=0 weight=0
[ WARN] (2017-08-16 11:09:38.520) OdometryF2M.cpp:384::computeTransform() Unknown registration error
[ INFO] [1502852978.520744431]: Odom: quality=0, std dev=0.000000m|0.000000rad, update time=0.026513s
[ERROR] (2017-08-16 11:09:38.551) RegistrationVis.cpp:1156::computeTransformationImpl() Calibrated camera required (multi-cameras not supported). Id=8 Models=2 StereoModel=0 weight=0
[ WARN] (2017-08-16 11:09:38.551) OdometryF2M.cpp:384::computeTransform() Unknown registration error
[ INFO] [1502852978.552174195]: Odom: quality=0, std dev=0.000000m|0.000000rad, update time=0.024412s
[ WARN] [1502852978.566514912]: Could not get transform from odom to base_link after 0,200000 seconds (for stamp=1502852978,274850)! Error="Lookup would require extrapolation at time 1502852978.274849970, but only time 1502852978.242872167 is in the buffer, when looking up transform from frame [base_link] to frame [odom]. canTransform returned after 0.201923 timeout was 0.2.".
[ WARN] [1502852978.771972712]: Could not get transform from odom to base_link after 0,200000 seconds (for stamp=1502852978,474861)! Error="Lookup would require extrapolation at time 1502852978.474861327, but only time 1502852978.242872167 is in the buffer, when looking up transform from frame [base_link] to frame [odom]. canTransform returned after 0.201879 timeout was 0.2.".
matlabbe commented 7 years ago

The default of parameter Vis/EstimationType was changed to 1 (2D->3D PnP estimation) in recent versions. Vis/EstimationType=1 cannot be used with multiple cameras. Explicitly set Vis/EstimationType=0 on the rgbd_odometry and rtabmap nodes:

<param name="Vis/EstimationType" type="string" value="0"/>

I updated demo_two_kinects.launch with this change.
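
For reference, a minimal sketch of where this parameter goes in the launch file (only the relevant lines are shown; node names as listed in the NODES summary above):

  <node pkg="rtabmap_ros" type="rgbd_odometry" name="rgbd_odometry">
    ...
    <param name="Vis/EstimationType" type="string" value="0"/>
  </node>

  <node pkg="rtabmap_ros" type="rtabmap" name="rtabmap">
    ...
    <param name="Vis/EstimationType" type="string" value="0"/>
  </node>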

PS: Sorry, I missed the previous message from 16 days ago; the error seems to be the same.

cheers, Mathieu

hildaaw commented 7 years ago

Hi Mathieu,

Thanks a lot! After updating the code I am able to map with 2 cameras without any error.

ghost commented 6 years ago

Hi Mathieu

Can you provide a launch script for 2 Orbbec Astra sensors? I have already tried to adapt it, but without any success.

thanks a lot!

matlabbe commented 6 years ago

Well, did you try demo_two_kinects.launch, changing freenect_launch to astra_launch? Replace these lines: https://github.com/introlab/rtabmap_ros/blob/bc4b7a860e41948e77554023c3efd26ced949b50/launch/demo/demo_two_kinects.launch#L7-L16

with this instead:

  <include file="$(find astra_launch)/launch/astra.launch">
    <arg name="depth_registration" value="True" />
    <arg name="camera" value="camera1" />
    <arg name="device_id" value="#1" />
  </include>
  <include file="$(find astra_launch)/launch/astra.launch">
    <arg name="depth_registration" value="True" />
    <arg name="camera" value="camera2" />
    <arg name="device_id" value="#2" />
  </include>

If so, can you show the resulting tf tree and the warnings/errors from the terminal?

$ rosrun tf view_frames
$ evince frames.pdf
ghost commented 6 years ago

Thanks, now it works with 2 Astras!

But now I have another question: is it possible to map only a single person, without any scenery elements to help the SIFT processing?

I tried to scan a single person on a rotating platform, but without any success (attached image).

I know I could use ICP and methods like KinFu, but they need a GPU, and I want to lower the price of my final equipment.

Regards

Attachments: frames.pdf, snapshot00

matlabbe commented 6 years ago

Indeed, the default parameters are not set up to scan objects moving independently from the scene. You could set the maximum depth of extracted features under "Maximum feature depth" in the Visual Registration panel (the "Vis/MaxDepth" parameter), and also the "Maximum words depth" in the Vocabulary panel (the "Kp/MaxDepth" parameter).

When exporting the clouds, you can also set a maximum depth for the generated points to filter the background.
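
If these are set from a ROS launch file rather than the GUI panels, a sketch would be (the 1.0 m value is only illustrative; adjust it to the distance of the person):

  <param name="Vis/MaxDepth" type="string" value="1.0"/>
  <param name="Kp/MaxDepth"  type="string" value="1.0"/>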

ghost commented 6 years ago

I did exactly what you said in the previous example (I had already thought about isolating the object from the scene), but it doesn't work. The screenshot is the best result before the cloud becomes a complete mess. I think there are not enough inliers. Is there some way to overcome this? Even if we lose some detail, my interest is in closing the mesh.

matlabbe commented 6 years ago

You may use ICP odometry instead. Here is an example using the RTAB-Map standalone version and a Kinect (for Xbox 360): https://youtu.be/es0x0bAdEIs

Parameters used: person_scan.ini.txt. The main parameters changed are in the Odom, Reg and Icp groups. The input scan from the Kinect was downsampled by 4 and limited to 1.5 meters. RTAB-Map should be built with libpointmatcher to give similar results.

I added a small box behind my neck to add geometry because tracking got lost or had very large drift when seeing only the back of the chair. However, there is still a small drift when the camera is only seeing parallel planes. I was not perfectly immobile while turning the chair either, which can add some errors to the scan.

cheers, Mathieu

Screenshots: screenshot_2018-08-07_16-09-10, screenshot_2018-08-07_16-09-31, screenshot_2018-08-07_16-09-46

ghost commented 6 years ago

Wow, very impressive. I will try your parameters on our equipment (https://www.youtube.com/watch?v=0y0jxQCE9Bs). Thank you very much, Mathieu. If you have any ideas to solve the problem without using the small box, that would be nice, because we need to do postural and perimeter analysis, and the box could cause problems in the perimeter measurement (but if we need to put some object between the perimeter lines, that isn't a problem at all; let me try here).

matlabbe commented 6 years ago

With a person standing, I don't think adding more geometry is required, as the body has a lot of distinctive curves. The problem here was really the flat back of the chair, causing more drift or losing odometry when relying on geometry alone.

Looking at your equipment, you may not need SLAM at all if you can know the current rotation of the body (the current angle of the platform) relative to the cameras. You will then avoid the drift problem when there is not enough geometry (i.e., you can have perfect odometry regardless of what the object geometry looks like).

ghost commented 6 years ago

Yes, I can get exact information about the rotation (with an Arduino or even a Raspberry Pi), but how can I send this information to RTAB-Map? Do I need to write a custom kernel driver, or can I simply write this info to a shared file? In other words, is it possible to use RTAB-Map just to register the point clouds?

matlabbe commented 6 years ago

You can create a TF broadcaster to publish a fake odometry transform (odom -> base_link), replacing the rgbd_odometry node. That transform changes as the platform rotates. Set the rtabmap node's parameter odom_frame_id to that frame (odom), and set Kp/MaxFeatures to -1 to disable loop closure detection (we don't need it, as we assume there is no drift).

Assuming that the camera links are children of a fixed frame called base_link, in rviz the base_link frame will look like it is turning around the odom frame (which would be at the center of the rotating platform), similar to the video I posted.
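
A sketch of the rtabmap node settings this implies (the rest of the node configuration is omitted):

  <node pkg="rtabmap_ros" type="rtabmap" name="rtabmap">
    ...
    <param name="frame_id"       type="string" value="base_link"/>
    <param name="odom_frame_id"  type="string" value="odom"/>
    <param name="Kp/MaxFeatures" type="string" value="-1"/>
  </node>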

ghost commented 6 years ago

Wow, sure I will do this! Thanks Mathieu, I will post the results here soon.

ghost commented 6 years ago

Hi Mathieu

I finally installed an absolute (magnetic) encoder on my rotary table and adjusted the turtlesim TF broadcaster (script below) to test it, as you can see in my videos.

But now I am not sure how to put this together; in other words, how exactly to replace the odom info in my two-sensor tree...

#!/usr/bin/env python
import math
import serial
import roslib
roslib.load_manifest('learning_tf')
import rospy

import tf
import turtlesim.msg

# Absolute encoder connected over serial
port = serial.Serial("/dev/ttyUSB0", baudrate=9600, timeout=3.0)

def handle_turtle_pose(msg, turtlename):
    br = tf.TransformBroadcaster()
    angle = 115                   # default value, overwritten below
    angle = int(port.readline())  # current table angle from the encoder, in degrees
    # Same as the turtle_tf broadcaster tutorial, except the yaw comes from
    # the encoder reading instead of msg.theta.
    br.sendTransform((msg.x, msg.y, 0),
                     tf.transformations.quaternion_from_euler(0, 0, math.radians(angle)),
                     rospy.Time.now(),
                     turtlename,
                     "world")

if __name__ == '__main__':
    rospy.init_node('turtle_tf_broadcaster')
    turtlename = rospy.get_param('~turtle')
    rospy.Subscriber('/%s/pose' % turtlename,
                     turtlesim.msg.Pose,
                     handle_turtle_pose,
                     turtlename)
    rospy.spin()

Attachments: encoder_turtle_frames.pdf, frames_2_astras.pdf

https://youtu.be/2o18tDgwUHw

https://youtu.be/hG3WQNmQXXw

matlabbe commented 6 years ago

What you want is to provide a TF representing the camera moving around the rotary table, not the table rotating about its own center. Here the /odom frame is fixed at the center of the table, and the camera (/camera_link) rotates at a fixed distance d from the center of the table according to the angle of your absolute encoder: screenshot_2018-08-30_11-32-40

In reality the table is rotating, but in this setup we change the frame of reference so that the camera appears to rotate around the table.
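
A minimal sketch of such a broadcaster, assuming a hypothetical read_encoder_angle() that returns the table angle in radians and an illustrative distance d (frame names as in the figure above):

#!/usr/bin/env python
# Sketch only: publishes a fake odometry TF (/odom -> /camera_link) so that
# the camera appears to orbit the table center at radius d, facing it.
import math
import rospy
import tf

def read_encoder_angle():
    # Placeholder: return the current table angle in radians
    # (e.g., converted from the serial encoder reading).
    return 0.0

if __name__ == '__main__':
    rospy.init_node('table_odom_tf_broadcaster')
    br = tf.TransformBroadcaster()
    d = 1.0  # illustrative camera-to-table-center distance, in meters
    rate = rospy.Rate(30)
    while not rospy.is_shutdown():
        a = read_encoder_angle()
        br.sendTransform((d * math.cos(a), d * math.sin(a), 0.0),
                         tf.transformations.quaternion_from_euler(0, 0, a + math.pi),
                         rospy.Time.now(),
                         "camera_link",
                         "odom")
        rate.sleep()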

ghost commented 5 years ago

It finally works! Thank you very much, Mathieu! https://youtu.be/OmPS2RmiQK8