Open · dimaxano opened 4 years ago
Hi! I am using a self-made stereo pair on my copter. I recorded some flight videos (via rosbag) and ran openvslam (mono) on the left camera video, and it works fine. After that I ran openvslam stereo on both the left and right cameras (unrectified, but with the rectification params set up in the config) and it works very poorly. Can you please advise where I can look for the source of this problem?
@dimaxano What do you mean by "bad"? Does it lose features often, or is there no tracking at all? What is your stereo baseline? Did you specify all the params correctly?
Hi! When tracking is good (mono), in my case the point cloud of features is very similar to the real environment, tracking is stable, and the estimated trajectory is very close to the real one. In the bad (stereo) case, the obtained point cloud of features looks like a mess, i.e. when the robot sees a plane and the feature points should be at approximately the same depth, their depths differ a lot from each other. Also, tracking very rarely lasts more than 5-7 seconds, and there are problems with initialization: most of the time the SLAM is just trying to initialize.
The baseline is about 14 cm. I think a possible source of the problem could be the camera calibration; I am checking that option now by feeding an already-rectified stream from a ZED camera.
Can you please elaborate on which parameters you mean?
In stereo, you have to provide the left and right camera intrinsic parameters that you got from your camera calibration. You can refer to one of the example config files provided in the openvslam repo.
Then you have to set the following parameter:
Camera.focal_x_baseline:
This one should be your camera's fx multiplied by your baseline (in meters).
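For example, a minimal sketch of the Camera section for an already-rectified stereo pair, following the layout of the example configs in the openvslam repo; all the numbers below are placeholders, not real calibration values:

```yaml
# Camera section for an already-rectified stereo pair (placeholder values)
Camera:
  name: "my stereo rig"
  setup: "stereo"
  model: "perspective"

  # intrinsics of the rectified images
  fx: 350.0
  fy: 350.0
  cx: 320.0
  cy: 240.0

  # distortion is zero for rectified images
  k1: 0.0
  k2: 0.0
  p1: 0.0
  p2: 0.0
  k3: 0.0

  fps: 30.0
  cols: 640
  rows: 480
  color_order: "Gray"

  # fx * baseline in meters, e.g. 350.0 * 0.14 = 49.0 for a 14 cm baseline
  focal_x_baseline: 49.0
```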
Hi @Shashika007, thank you for your reply! As I understand it, I have to provide the left and right intrinsics + distortion only if I am using raw (not rectified) images; otherwise I only need to specify the intrinsics obtained after stereo rectification. Am I right?
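For reference, this is roughly the kind of config I mean for the unrectified (raw images) case. The key names follow the EuRoC stereo example in the openvslam repo, and all numbers are placeholders, not my actual calibration:

```yaml
# StereoRectifier section used when feeding raw (unrectified) stereo images (placeholder values)
StereoRectifier:
  # 3x3 intrinsic matrices of the raw left/right cameras, row-major
  K_left: [458.0, 0.0, 367.0, 0.0, 457.0, 248.0, 0.0, 0.0, 1.0]
  K_right: [457.0, 0.0, 379.0, 0.0, 456.0, 255.0, 0.0, 0.0, 1.0]
  # distortion coefficients (k1, k2, p1, p2, k3) of the raw cameras
  D_left: [-0.28, 0.07, 0.0002, 0.00002, 0.0]
  D_right: [-0.28, 0.07, 0.00007, -0.00003, 0.0]
  # 3x3 rectification rotations, row-major (e.g. from cv::stereoRectify)
  R_left: [1.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 1.0]
  R_right: [1.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 1.0]
```

As far as I can tell from the EuRoC example, even in this case Camera.fx/fy/cx/cy and Camera.focal_x_baseline should describe the rectified images (the projection produced by the rectifier), not the raw intrinsics above.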