tum-vision / lsd_slam

LSD-SLAM

question: how to use live camera #16

Closed. amiltonwong closed this issue 9 years ago.

amiltonwong commented 9 years ago

Dear all,

I have installed the lsd_slam package successfully and followed the Quickstart steps smoothly.

Now I have attached a camera to try the live setting. First, I start the camera driver with:

```
rosrun uvc_cam uvc_cam_node device:=/dev/video1
```

Then I run:

```
rosrun lsd_slam_viewer viewer
rosrun lsd_slam_core live_slam image:=/camera/image_raw camera_info:=/camera_info
```

(the published topic is /camera/image_raw). But then nothing happens, and I cannot see the live result. Did I miss some steps? I used rqt_graph to inspect the topics and nodes:

[rqt_graph screenshot: rosgraph]

Could someone give me some suggestions on how to fix this? THX~

Best regards, Milton

mhkabir commented 9 years ago

Are you using a monochrome camera? Probably not, so pipe image_raw through the image_proc node. Start it in the /camera namespace and remap lsd_slam's input to /camera/image_mono.
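A minimal command sequence for that, assuming your driver publishes under the /camera namespace (topic names may differ on your setup):

```
ROS_NAMESPACE=camera rosrun image_proc image_proc
rosrun lsd_slam_core live_slam image:=/camera/image_mono camera_info:=/camera/camera_info
```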

Can you please paste your terminal output here.

amiltonwong commented 9 years ago

@mhkabir, thank you very much.

Yes, my camera is a Logitech C525 RGB camera, not a mono camera. Following your suggestion, I run image_proc and lsd_slam with:

```
ROS_NAMESPACE=camera rosrun image_proc image_proc
rosrun lsd_slam_core live_slam image:=/camera/image_mono camera_info:=/camera/camera_info
rosrun lsd_slam_viewer viewer
```

The DebugWindow DEPTH appears with the image inside.

The following is the terminal output (maybe the camera is not calibrated well, so tracking is lost; I will work on the calibration soon):

```
root@milton-OptiPlex-760:~# rosrun lsd_slam_core live_slam image:=/camera/image_mono camera_info:=/camera/camera_info
WAITING for ROS camera calibration!
Received ROS Camera Calibration: fx: 0.000000, fy: 0.000000, cx: 0.000000, cy: 0.000000 @ 640x480
RECEIVED ROS camera calibration!
Started mapping thread!
Started optimization thread
Started constraint search thread!
Doing Random initialization!
Done Random initialization!
TRACKING LOST for frame 2 (0.00% good Points, which is -nan% of available points, DIVERGED)!
requested full reset!
... waiting for SlamSystem's threads to exit
Exited constraint search thread
Exited optimization thread
Exited mapping thread
DONE waiting for SlamSystem's threads to exit
Deleted SlamSystem Object!
Started mapping thread!
Started constraint search thread!
Started optimization thread
Doing Random initialization!
Done Random initialization!
TRACKING LOST for frame 2 (0.00% good Points, which is -nan% of available points, DIVERGED)!
```

[rqt_graph screenshot: rosgraph2]

mhkabir commented 9 years ago

Yep, I'd say you should calibrate your camera: no calibration values are being sent to lsd_slam (they are all 0), so it cannot track. Happy to help :)
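For reference, monocular calibration can be done with the standard ROS camera_calibration package. A sketch, assuming an 8x6 chessboard with 0.108 m squares and that the driver publishes under /camera (adjust both to your setup):

```
rosrun camera_calibration cameracalibrator.py --size 8x6 --square 0.108 image:=/camera/image_raw camera:=/camera
```

If the uvc_cam driver cannot store the result (i.e. it does not implement the set_camera_info service), lsd_slam can alternatively read the calibration from a file passed via the _calib parameter.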

amiltonwong commented 9 years ago

@mhkabir , @JakobEngel , a further question: I'm trying to integrate this with the AR.Drone 2 platform using the ardrone_autonomy package. The image resolution of its front camera is 640x360, which is not a multiple of 16, so lsd_slam throws an error when I use it as live input:

```
root@milton-PC:~# rosrun lsd_slam_core live_slam image:=/ardrone/image_mono _calib:=/root/Downloads/LSD-SLAM/ardrone_sequence/cameraCalibration.cfg
Reading Calibration from file /root/Downloads/LSD-SLAM/ardrone_sequence/cameraCalibration2.cfg ... found!
found ATAN camera model, building rectifier.
Input resolution: 640 360
In: 0.771557 1.368560 0.552779 0.444056 1.156010
Out: 0.771557 1.368560 0.552779 0.444056 1.156010
Output resolution: 640 360
Prepped Warp matrices
image dimensions must be multiples of 16! Please crop your images / video accordingly.
Started mapping thread!
Started constraint search thread!
Started optimization thread
Doing Random initialization!
Done Random initialization!
Segmentation fault (core dumped)
root@milton-PC:~#
```

I think there may be two ways to tackle this problem:

  1. Modify lsd_slam to permit custom resolution input (e.g. 640x360). (Could you tell me where in the code I would need to change this?)
  2. Pipe the live input through an intermediate node such as image_proc for cropping and resizing. (But so far I cannot figure out how to do that.)

It'll be greatly appreciated if you could give me some suggestions to tackle this problem. THX~

Best regards, Milton

mhkabir commented 9 years ago

@amiltonwong You can use the built-in camera cropping in lsd_slam. Assuming you've already calibrated your camera, you can simply set the output crop size in the config file. Check the README for more info.

Set the last line of the camera config to your desired resolution.
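For illustration, a sketch of what such an ATAN-model calibration file could look like, reusing the values from the terminal output above and cropping the 640x360 input to 640x352 (a multiple of 16 in both dimensions); see the README for the exact file format:

```
0.771557 1.368560 0.552779 0.444056 1.156010
640 360
crop
640 352
```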

karim-nemra commented 9 years ago

Hello

I am new to ROS.

I am using LSD-SLAM. For live tests I am using a .bag file; is it possible to use an .avi video as input?

Does the *.bag file contain only the images?

Thank you very much for your answer

Tgaaly commented 9 years ago

By default the .cfg file is not being read. How do I make it read from a .cfg file in order to crop to a multiple of 16? Where in the code does it check for, say, pinhole_example_calib.cfg? When I edit this file it doesn't work. Please advise.

Tgaaly commented 9 years ago

OK, figured it out. You just need to add _calib:=<path to .cfg calib file> as follows:

```
rosrun lsd_slam_core live_slam /image:=/image_raw _calib:=/pinhole_example_calib.cfg
```