raulmur / ORB_SLAM2

Real-Time SLAM for Monocular, Stereo and RGB-D Cameras, with Loop Detection and Relocalization Capabilities

Same dataset, but RMSE test result differs between *.png and *.bag #398

Open HughLeoWu opened 7 years ago

HughLeoWu commented 7 years ago

Hi all,

As the title says, I used ORB_SLAM2 to run a TUM dataset (e.g. fr2_desk as *.png images or as fr2_desk.bag), and then used the KeyFrame.txt output to compute the two ATE values. The test results were different.

My camera configuration for the *.png run was TUM2.yaml, the same as on GitHub.

For the ROS bag I used TUM2_rosbag.yaml:

```yaml
%YAML:1.0

#--------------------------------------------------------------------------------------------
# Camera Parameters. Adjust them!
#--------------------------------------------------------------------------------------------

# Camera calibration and distortion parameters (OpenCV)
Camera.fx: 525.0
Camera.fy: 525.0
Camera.cx: 319.5
Camera.cy: 239.5

Camera.k1: 0.0
Camera.k2: 0.0
Camera.p1: 0.0
Camera.p2: 0.0
Camera.k3: 0.0

Camera.width: 640
Camera.height: 480

# Camera frames per second
Camera.fps: 30.0

# IR projector baseline times fx (approx.)
Camera.bf: 40.0

# Color order of the images (0: BGR, 1: RGB. It is ignored if images are grayscale)
Camera.RGB: 1

# Close/Far threshold. Baseline times.
ThDepth: 40.0

# Depthmap values factor
DepthMapFactor: 1.031

#--------------------------------------------------------------------------------------------
# ORB Parameters
#--------------------------------------------------------------------------------------------

# ORB Extractor: Number of features per image
ORBextractor.nFeatures: 1000

# ORB Extractor: Scale factor between levels in the scale pyramid
ORBextractor.scaleFactor: 1.2

# ORB Extractor: Number of levels in the scale pyramid
ORBextractor.nLevels: 8

# ORB Extractor: Fast threshold
# Image is divided in a grid. At each cell FAST are extracted imposing a minimum response.
# Firstly we impose iniThFAST. If no corners are detected we impose a lower value minThFAST.
# You can lower these values if your images have low contrast.
ORBextractor.iniThFAST: 20
ORBextractor.minThFAST: 7
```

For TUM2_rosbag.yaml I referenced the TUM website: http://vision.in.tum.de/data/datasets/rgbd-dataset/file_formats

The *.png test result was:

```
compared_pose_pairs 129 pairs
absolute_translational_error.rmse 0.009325 m    (just the same as in the paper)
absolute_translational_error.mean 0.008768 m
absolute_translational_error.median 0.008592 m
absolute_translational_error.std 0.003176 m
absolute_translational_error.min 0.002468 m
absolute_translational_error.max 0.016692 m
```

The ROS *.bag test result was:

```
absolute_translational_error.rmse 0.032545 m
absolute_translational_error.mean 0.029833 m
absolute_translational_error.median 0.028154 m
absolute_translational_error.std 0.013007 m
absolute_translational_error.min 0.006497 m
absolute_translational_error.max 0.057566 m
```

Has anybody encountered this problem? If my TUM2_rosbag.yaml settings are wrong, please let me know. Thanks a lot.
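For context, the ATE numbers above are what the TUM benchmark's evaluate_ate.py tool reports: it rigidly aligns the estimated trajectory to the ground truth (Horn's method) and takes the RMSE of the residual translational errors. A minimal sketch of that metric, assuming the two trajectories are already time-associated (the function name is mine, not from the TUM tool):

```python
import numpy as np

def ate_rmse(gt, est):
    """Absolute trajectory error (RMSE) after rigid alignment,
    in the spirit of the TUM benchmark's evaluate_ate.py.
    gt, est: (N, 3) arrays of time-associated camera positions."""
    gt = np.asarray(gt, dtype=float)
    est = np.asarray(est, dtype=float)
    # Center both trajectories (removes the translation component)
    gt_c = gt - gt.mean(axis=0)
    est_c = est - est.mean(axis=0)
    # Kabsch/Horn: best-fit rotation from the cross-covariance matrix
    H = est_c.T @ gt_c
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    # RMSE of the per-pose translational residuals after alignment
    residual = gt_c - est_c @ R.T
    return float(np.sqrt((residual ** 2).sum(axis=1).mean()))
```

Because the metric aligns away any global rigid transform, a higher RMSE on the bag run really does mean the estimated trajectory has a different shape, not just a different reference frame.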

poine commented 7 years ago

Is it possible that when running with the rosbag you are dropping frames?


HughLeoWu commented 7 years ago

@poine Thanks for your reply, but how can I check this? I tried decreasing the fps from 30 to 20 Hz, and the RMSE test result is still about 0.03 m.

poine commented 7 years ago

Maybe you can play the rosbag at a slow speed (i.e. `rosbag play -r 0.1 foo.bag`).


HughLeoWu commented 7 years ago

@poine Thanks for your response. I tried this command, but the test result is:

```
absolute_translational_error.rmse 0.032588 m
absolute_translational_error.mean 0.030511 m
absolute_translational_error.median 0.027436 m
absolute_translational_error.std 0.011448 m
absolute_translational_error.min 0.012866 m
absolute_translational_error.max 0.055985 m
```

It does not improve. Did you encounter and resolve this problem?

poine commented 7 years ago

You can change the ROS example code to read and process the images one after another without dropping any. That should in theory yield the same result as when working with PNGs. Maybe something like:

```cpp
#include <rosbag/bag.h>
#include <rosbag/view.h>
#include <sensor_msgs/Image.h>
#include <sensor_msgs/image_encodings.h>
#include <cv_bridge/cv_bridge.h>

rosbag::Bag bag;
bag.open("blah.bag", rosbag::bagmode::Read);

std::vector<std::string> topics;
topics.push_back("/camera/image_raw");
rosbag::View view(bag, rosbag::TopicQuery(topics));

// Iterate over every image message in the bag, in order, without dropping any
for (const rosbag::MessageInstance& m : view) {
  sensor_msgs::Image::ConstPtr imsg = m.instantiate<sensor_msgs::Image>();
  double ts = imsg->header.stamp.toSec();
  cv_bridge::CvImageConstPtr cv_ptr =
      cv_bridge::toCvShare(imsg, sensor_msgs::image_encodings::MONO8);
  cv::Mat pose = slam.TrackMonocular(cv_ptr->image, ts);
}
```

HTH, Poine


haidela commented 6 years ago

@poine Good evening, have you run ORB-SLAM2 (and LSD-SLAM) on the TUM RGB-D Benchmark dataset?