bjornph opened this issue 9 years ago
Hi bjornph, did you successfully test LSD-SLAM with the TUM RGB-D Benchmark dataset? I'm trying to do it, but the software doesn't initialize well from the first images, so it loses track very soon.
I had to stop working on it to try something else for some time. However I've looked at it for the last couple of days. Have you gotten it to work?
I made it work well on some datasets and not on others. Which datasets are you trying to run? The ORB-SLAM paper has a nice overview in Table III.
My tests:
My test setup: To record the needed parameters, I use:
Notes/questions on fr2_desk: The dataset is compressed; I don't know if this affects performance. You can decompress it, see "rosbag decompress -h" for info. The camera_info included is the standard Kinect one, from http://vision.in.tum.de/data/datasets/rgbd-dataset/file_formats#intrinsic_camera_calibration_of_the_kinect. I have also tried replacing it with the fr2 parameters, but that does not give good results. I find their use of calib files confusing, so when I try to undistort I run the images through image_proc.
Hope this helps some. Keep me updated if you get any breakthrough. I promise it won't take as long to answer next time.
Thanks for your reply. I'm trying it with almost all the datasets, but the tracking is often lost very soon unless I set large values for the KFUsageWeight and KFDistWeight thresholds. However, this results in a very poor-quality map. Which values did you use for the keyframe thresholds? Furthermore, when the algorithm completes the sequence and I evaluate the ATE with the online tool, the keyframe poses and the ground truth don't have the same "shape" and the error is at least 1 meter. To gather the keyframe poses I use the same command you wrote, so I don't understand why this happens.
For the calibration I'm using the ROS default parameters for all the datasets, because they are the recommended ones on the TUM website.
Could you be so kind as to give me the script to calculate the scale? I've tried to write it myself, but without success.
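For reference, the scale can be recovered by aligning the estimated trajectory to the ground truth with a similarity transform. The sketch below is an SVD-based (Umeyama-style) variant rather than Horn's quaternion formulation from [2], but it solves the same problem; it assumes the two trajectories have already been timestamp-associated so the point sets have equal length:

```python
import numpy as np

def align_sim3(est, gt):
    """Align two associated 3xN point sets with a similarity transform.
    Returns scale s, rotation R, translation t such that gt ~ s*R*est + t.
    (Umeyama-style SVD alignment; an alternative to Horn's method.)"""
    mu_e = est.mean(axis=1, keepdims=True)
    mu_g = gt.mean(axis=1, keepdims=True)
    e = est - mu_e                      # centered estimate
    g = gt - mu_g                       # centered ground truth
    U, D, Vt = np.linalg.svd(g @ e.T)   # cross-covariance SVD
    S = np.eye(3)
    if np.linalg.det(U) * np.linalg.det(Vt) < 0:
        S[2, 2] = -1                    # avoid a reflection
    R = U @ S @ Vt
    s = np.trace(np.diag(D) @ S) / (e ** 2).sum()
    t = mu_g - s * R @ mu_e
    return s, R, t
```

With the outputs, the per-pose ATE is simply `np.linalg.norm(s * R @ est + t - gt, axis=0)`, and `s` is the scale factor you are after.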
@Gingol
Hope this helps
[1] https://github.com/bjornph/lsd_files
[2] http://www.mathworks.com/matlabcentral/fileexchange/26186-absolute-orientation-horn-s-method
[3] http://se.mathworks.com/help/matlab/matlab_external/install-the-matlab-engine-for-python.html#responsive_offcanvas
@bjornph Please excuse me for taking so long to answer. I was quite busy these days, so I tried your code only yesterday... and it works! Thanks a lot, you've been very helpful. I have one question left: in the LSD-SLAM paper, when they tried the TUM benchmark dataset, they used the depth image for the initialization. Did you do the same? Thanks again
So nice that it worked! To your question:
I am not sure what you mean, but if it's this part you are referring to: "For comparison we show respective results from semi-dense mono-VO [9], keypoint-based mono-SLAM [15], direct RGB-D SLAM [14] and keypoint-based RGB-D SLAM [7]. Note that [14] and [7] use depth information from the sensor, while the others do not.", then it says that the RGB-D SLAM methods use depth information, while the others, including LSD-SLAM, do not.
May I ask a question? When I run rostopic echo -p /lsd_slam/keyframes/camToWorld > lsd_camToWorld.txt I get: ERROR: Cannot load message class for [lsd_slam_viewer/keyframeMsg]. Are your messages built? Thanks. @bjornph
@weichnn I have not looked at this in quite some time, so I am not sure. I would ask this question on the lsd_slam GitHub page. My guess is that you don't load the correct repos in your bash file.
Good luck
In the project readme: Instead, this is solved in LSD-SLAM by publishing keyframes and their poses separately: keyframeGraphMsg contains the updated pose of each keyframe, nothing else. keyframeMsg contains one frame with its pose, and - if it is a keyframe - its points in the form of a depth map.
so, I use topic /lsd_slam/graph now. thanks. @bjornph
@bjornph Sorry, did you do any pre-processing of the ground-truth data or the keyframe trajectory that ORB-SLAM provides before using the online evaluation? I got very bad results with the freiburg1 sequences.
@dawei22 I do not recall the specific details of my test. What kind of results are you getting?
Hi @bjornph, when I do the first steps to record data from the different topics you mentioned, I get weird time data. The lsd_to_readable script works fine, but then I get an error related to associate.py:
Traceback (most recent call last):
File "associate.py", line 117, in
Do you have any idea how to solve this? How did you manage to get time data in your example that is approximately the same as in the ground truth?
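For anyone debugging this step: what associate.py does is roughly a greedy nearest-timestamp matching within a tolerance, so if your timestamps are on a different scale than the ground truth's, no pairs survive. A minimal sketch of that matching logic (my own reimplementation, not the TUM script itself):

```python
def associate(ts_a, ts_b, max_diff=0.02):
    """Greedily match timestamps in ts_a to the closest ones in ts_b
    within max_diff seconds (roughly what TUM's associate.py does)."""
    # All candidate pairs, sorted by absolute time difference.
    candidates = sorted((abs(a - b), a, b) for a in ts_a for b in ts_b)
    used_a, used_b, matches = set(), set(), []
    for diff, a, b in candidates:
        if diff > max_diff:
            break  # remaining pairs are all worse
        if a not in used_a and b not in used_b:
            used_a.add(a)
            used_b.add(b)
            matches.append((a, b))
    return sorted(matches)
```

If this returns an empty list for your data, the timestamps almost certainly live on different scales (e.g. seconds-since-start vs. Unix epoch), which would also explain the "weird time data".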
thanks !!
Hello, I am trying to do the first steps: 1. To record keyframes I use "rostopic echo -p /lsd_slam/keyframes/camToWorld > lsd_camToWorld.txt" and "rostopic echo -p /lsd_slam/keyframes/time > lsd_time.txt".
My files camToWorld.txt and time.txt are empty.
PARAMETERS
haidara@haidara-virtual-machine:~/catkin_ws$ rosrun lsd_slam_core dataset _files:='/home/haidara/Downloads/fr1_rgb_calibration' _hz:=0 _calib:='/home/haidara/Downloads/fr1_rgb_calibration/cameraCalibration.cfg'
Reading Calibration from file /home/haidara/Downloads/fr1_rgb_calibration/cameraCalibration.cfg ... found!
found ATAN camera model, building rectifier.
Input resolution: 640 480
In: 0.262383 -0.953104 -0.005358 0.002628 1.163314
Out: 0.262383 -0.953104 -0.005358 0.002628 1.163314
Output resolution: 640 480
Prepped Warp matrices
Started mapping thread!
Started constraint search thread!
Started optimization thread
found 68 image files in folder /home/haidara/Downloads/fr1_rgb_calibration!
failed to load image /home/haidara/Downloads/fr1_rgb_calibration/cameraCalibration.cfg! skipping.
failed to load image /home/haidara/Downloads/fr1_rgb_calibration/cameraCalibration.cfg~! skipping.
Doing Random initialization!
started image display thread!
Done Random initialization!
warning: reciprocal tracking on new frame failed badly, added odometry edge (Hacky).
TRACKING LOST for frame 22 (0.47% good Points, which is 52.07% of available points, DIVERGED)!
failed to load image /home/haidara/Downloads/fr1_rgb_calibration/ost.txt! skipping.
Finalizing Graph... finding final constraints!!
Optizing Full Map!
Done optizing Full Map! Added 0 constraints.
Finalizing Graph... optimizing!!
doing final optimization iteration!
Finalizing Graph... publishing!!
Done Finalizing Graph.!!
... waiting for SlamSystem's threads to exit
Exited mapping thread
Exited constraint search thread
Exited optimization thread
DONE waiting for SlamSystem's threads to exit
waiting for image display thread to end!
ended image display thread!
done waiting for image display thread to end!
haidara@haidara-virtual-machine:~/catkin_ws$
haidara@haidara-virtual-machine:~$ rostopic echo -p /lsd_slam/keyframes/camToWorld > lsd_camToWorld.txt
Can someone please help me out with this? Thanks in advance.
@bjornph
Hello bjorn, I'm also trying to use the TUM online tool to evaluate LSD-SLAM, and I followed your instructions from the previous comments. But I ran into a problem: my lsd_time.txt is being generated with wrong information, or at least not in the format that TUM uses for comparison. Comparing with your file, my "field" values in lsd_time are wrong; I get values like 1.79999876022 while you get values like 1341845820.99. So, since my lsd_time is in the wrong format, the TUM tool can't find any timestamp correspondences. I also ran your scripts and got the error "Index exceeds matrix dimensions" because of the values in lsd_time; when I ran them with your files, everything worked fine.
Do you know why I'm getting these wrong values in lsd_time?
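Values like 1.79999876022 look like elapsed seconds since the start of the sequence rather than epoch timestamps like 1341845820.99, which is what the TUM tools expect. One possible workaround, assuming the %time column of the "rostopic echo -p" CSV dump carries the message receive time in nanoseconds since epoch (as rostopic's CSV output normally does), is to convert that column to TUM-style seconds. A sketch, with a hypothetical helper name:

```python
import csv

def rostopic_csv_to_tum(in_path, out_path):
    """Convert a 'rostopic echo -p' CSV dump to TUM-style lines:
    epoch seconds (from the %time nanosecond column) followed by the
    remaining fields, space-separated. Assumes a header row, as
    rostopic writes one. Hypothetical helper for illustration."""
    with open(in_path) as f_in, open(out_path, "w") as f_out:
        reader = csv.reader(f_in)
        next(reader)  # skip the "%time,field,..." header
        for row in reader:
            stamp_s = int(row[0]) / 1e9  # nanoseconds -> seconds
            f_out.write("%.6f %s\n" % (stamp_s, " ".join(row[1:])))
```

Note this only papers over the symptom; if the dataset player feeds relative timestamps into LSD-SLAM, the cleaner fix is to make it publish the original absolute timestamps.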
Thanks
I am testing ORB-SLAM (and LSD-SLAM) on the TUM RGB-D Benchmark dataset, and I have some questions about your test process:
ORB questions:
LSD-SLAM questions:
Thank you