raulmur / ORB_SLAM2

Real-Time SLAM for Monocular, Stereo and RGB-D Cameras, with Loop Detection and Relocalization Capabilities

Initial position and orientation for multiple cameras #482

Open parlange opened 7 years ago

parlange commented 7 years ago

Hi @raulmur, I am trying to use Stereo ORB-SLAM2 with DJI Guidance 5 stereo pairs. Currently, a separate ROS node for each camera can be launched, each with its own map.

How can I initialize a single node with the 5 cameras so that they move as a coupled, rigid body, each with its own position and orientation?

C5: origin, pointing down.

The rest of the cameras have a position relative to the origin (C5) with different orientations. C1: 0º, C2:90º, C3:180º, C4:270º.

                 ____1____                       
                |         | 
                4    5    2 
                |         | 
                 ____3____
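The rig layout above could be written down as a set of fixed body-to-camera transforms. Here is a minimal pure-Python sketch (the `arm` offsets are made-up placeholders, and ORB-SLAM2 itself has no notion of such a rig file — this only illustrates the geometry):

```python
import math

def yaw_T(yaw_deg, t=(0.0, 0.0, 0.0)):
    """4x4 homogeneous transform: a rotation about the rig's vertical
    axis plus a translation t, both expressed in the rig (body) frame."""
    c, s = math.cos(math.radians(yaw_deg)), math.sin(math.radians(yaw_deg))
    tx, ty, tz = t
    return [[c,  -s,  0.0, tx],
            [s,   c,  0.0, ty],
            [0.0, 0.0, 1.0, tz],
            [0.0, 0.0, 0.0, 1.0]]

# Illustrative rig: C5 at the body origin, C1..C4 offset by `arm` metres
# and yawed 0/90/180/270 degrees, matching the diagram above.
arm = 0.1  # placeholder lever arm in metres
T_body_cam = {
    "C5": yaw_T(0.0),
    "C1": yaw_T(0.0,   (0.0,  arm, 0.0)),
    "C2": yaw_T(90.0,  (arm,  0.0, 0.0)),
    "C3": yaw_T(180.0, (0.0, -arm, 0.0)),
    "C4": yaw_T(270.0, (-arm, 0.0, 0.0)),
}
```

A multi-camera SLAM system would keep these transforms constant and estimate only the single body pose; that is the coupling a stock ORB-SLAM2 node does not provide.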

Thank you.

AlejandroSilvestri commented 7 years ago

@prlng

ORB-SLAM2 can't do that; it would take a lot of coding effort to bundle the five cameras. The problem is not connecting the cameras together, but deciding how ORB-SLAM2 should use each feed.

poine commented 7 years ago

maybe you'd be interested in this: https://arxiv.org/abs/1610.07336


AlejandroSilvestri commented 6 years ago

@poine , thank you, marked to read.

BTW, very interesting job with the inverted pendulum.

poine commented 6 years ago

Thanks. The ceiling-mounted tracker now has 3 cameras ( https://www.youtube.com/watch?v=_u4qhHbuV6Q ) and tracks several robots, so I can go back to playing with control.

I played a little with MultiCol-SLAM earlier this year. It's a nice piece of code. I got it to work in a Gazebo simulation ( https://www.youtube.com/watch?v=IVeUZdJwShU&feature=youtu.be ) and did some real-world experiments with it ( https://goo.gl/photos/C5n7pQDffTb7hZ8B9 https://photos.app.goo.gl/A6J7vL7P9VUzIKW23 ). I had to rewrite some code for the fisheye distortion model calibration - I found the original MATLAB tool horrid. Another difficulty is the calibration of the relative pose of the cameras. Steffen Urban used an OptiTrack. I did not have access to that, so I did something with ArUco markers on the walls of my room, inspired by this ( https://arxiv.org/abs/1606.00151 ).


poine commented 6 years ago

Those two are more colorful:

https://www.youtube.com/watch?v=rcXm4QCaq64

https://www.youtube.com/watch?v=X8M0IHWhTcs


parlange commented 6 years ago

Thank you @AlejandroSilvestri. Is there a way to establish an initial position and orientation for a single camera?

parlange commented 6 years ago

Hi @poine, a couple of weeks ago I read the MultiCol-SLAM paper and tried to run the examples. I found that MultiCol-SLAM does not have a ROS node like the original ORB-SLAM2.

I am also working with the Gazebo simulator, in a modified version by the Technical University of Munich (TUM) called tum_simulator, which has a model of an AR.Drone 2.0 with frontal and bottom cameras. How did you manage to use MultiCol-SLAM with Gazebo (and ROS, I guess)?

Excellent work with Rosmip in simulated and real world scenarios. And thanks for sharing! @poine

yuyadanyadan commented 6 years ago

Hi, I am running the project with a stereo fisheye setup. The results are only so-so, although the trajectory alignment looks good. Do you know the reason? The point cloud is not good, and features sometimes match the wrong point in the right image because of repeated texture and motion-blurred images. Waiting for your answer! Thank you very much!

AlejandroSilvestri commented 6 years ago

@prlng ,

The origin in ORB-SLAM2 is set during map initialization. Initialization is done with two frames; the pose of the first frame is taken as the origin, i.e. the world reference for the map.

You don't know the pose of that initial frame in advance. After initialization you must somehow determine the real-world pose of that specific frame; then you can compute a transformation matrix that maps from the map reference to the real-world reference. That matrix must account for both pose and scale.

That is only the first step. Over time you will experience drift, including scale drift, and there is no easy way to detect and compensate for it.
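The map-to-world transformation described above can be sketched in a few lines. This is a minimal pure-Python illustration, not ORB-SLAM2 API: it assumes you have already obtained (e.g. from a marker or motion-capture) the first frame's world rotation `R_wf`, translation `t_wf`, and the metric scale `s` of the map.

```python
def map_to_world(p_map, R_wf, t_wf, s):
    """Transform a point from ORB-SLAM2's map frame (anchored at the
    first frame, with arbitrary scale for monocular) into the world
    frame: p_w = R_wf * (s * p_map) + t_wf. Plain lists, no deps."""
    x, y, z = (s * c for c in p_map)
    return [R_wf[i][0] * x + R_wf[i][1] * y + R_wf[i][2] * z + t_wf[i]
            for i in range(3)]

# Example: identity orientation, first frame 0.5 m along world x,
# map scale factor 2.0 (all values are illustrative placeholders).
I3 = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
p_w = map_to_world((1.0, 0.0, 0.0), I3, (0.5, 0.0, 0.0), 2.0)
```

Note that this is a one-shot rigid alignment plus scale (a Sim(3)-style correction); it does nothing against the drift mentioned above, which accumulates after the alignment is computed.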

parlange commented 6 years ago

@AlejandroSilvestri Thank you very much for your comments on map initialization!

parlange commented 6 years ago

@yuyadanyadan I'm not sure what your problem is, but I would suggest recalibrating your stereo-fisheye by printing a checkerboard and using ROS OpenCV stereo camera_calibration. If you are using simulation (e.g. Gazebo), you could build a model of a checkerboard, and calibrate your camera inside the simulation.
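As a small illustration of what the calibration consumes: both OpenCV's calibration functions and the ROS camera_calibration node need the 3D positions of the checkerboard's inner corners. A sketch of generating that grid (the 8x6 board and 25 mm square size are placeholders; use your printed board's real dimensions):

```python
def checkerboard_object_points(cols=8, rows=6, square=0.025):
    """One (x, y, 0) point per inner corner, row-major, in metres.
    This is the objectPoints grid passed (once per view) to
    cv2.calibrateCamera / cv2.stereoCalibrate."""
    return [(c * square, r * square, 0.0)
            for r in range(rows) for c in range(cols)]

pts = checkerboard_object_points()
```

The corresponding 2D image points come from corner detection in each view; a bad board print or wrong square size here silently corrupts the whole calibration, which can look a lot like the mismatches described above.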

mromanelli9 commented 6 years ago

@prlng did you manage to get it to work? I'm using an ArUco marker to get (an estimate of) the pose of the camera, but afterwards I cannot make it work in the SLAM.