raulmur / ORB_SLAM2

Real-Time SLAM for Monocular, Stereo and RGB-D Cameras, with Loop Detection and Relocalization Capabilities

Tracking consistency with inverted video/frame #157

Open RongenC opened 8 years ago

RongenC commented 8 years ago

Hi everyone, I looked for any existing questions/issues related to mine but couldn't find any, so I'm posting it here.

My first question: when I run ORB_SLAM2 on my own input (stereo video from a ZED camera), the results differ by a lot between the standard upright video and the inverted video. From my understanding, shouldn't the algorithm be unaffected by the orientation of the input, except that the tracking result comes out inverted? I ask because, when comparing ORB-SLAM2's tracking result with my ground truth (acquired with an OptiTrack system), the "standard" video produces a similar trajectory, but the inverted one is off by a huge margin. Does anyone here run with an inverted camera setup and have to manually "flip" the images before feeding them into ORB-SLAM2?

My second question concerns comparing ORB_SLAM2 against my ground truth. The true motion is translation along a single axis (x-axis) combined with rotation about two axes (pitch and yaw), but ORB-SLAM2 mistakenly reports a slight y-axis movement, oscillating with an amplitude of roughly 0.5 m (50 cm). Has anyone experienced this, or is it simply the nature of ORB_SLAM2?

Thanks a lot for the help,

Rongen

AlejandroSilvestri commented 8 years ago

If I understood you right: when you invert (rotate 180°) the images, you should also swap the right and left images.
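For reference, a minimal sketch of that preprocessing with OpenCV, assuming rectified ZED frames and a `System` already started in STEREO mode (the helper function here is made up for illustration; `cv::flip` and `ORB_SLAM2::System::TrackStereo` are the real calls used in the repo's stereo examples):

```cpp
// Sketch only: rotate each ZED frame 180 degrees and swap left/right
// before handing the pair to ORB-SLAM2. Assumes SLAM is an
// ORB_SLAM2::System already initialized in STEREO mode and that
// imLeft/imRight are the rectified images from the camera.
#include <opencv2/core/core.hpp>
#include "System.h"

// Hypothetical helper, not part of ORB-SLAM2.
void TrackInvertedPair(ORB_SLAM2::System &SLAM,
                       const cv::Mat &imLeft, const cv::Mat &imRight,
                       double timestamp)
{
    cv::Mat leftRot, rightRot;
    cv::flip(imLeft,  leftRot,  -1);   // flipCode = -1 flips both axes (180° rotation)
    cv::flip(imRight, rightRot, -1);

    // With the rig upside down, the physical left camera sits on the right
    // side of the baseline, so the rotated images are passed in swapped order.
    SLAM.TrackStereo(rightRot, leftRot, timestamp);
}
```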

I didn't follow you on your second question, but in my experience ORB-SLAM2 is quite stable and precise. Remember monocular SLAM mapping can't bear pure rotations.

RongenC commented 8 years ago

Hi AlejandroSilvestri,

Thank you for your response.

I hadn't considered swapping the left and right images when the capture is flipped 180 degrees. I will try that, thanks!

For the second part, I am a bit confused by "monocular SLAM mapping can't bear pure rotations." I am using stereo ORB-SLAM2; is that still considered monocular SLAM mapping? Let me rephrase my question. I have ground truth measured with an OptiTrack system (tested accuracy of 0.1 mm) that tracks the motion of the camera recording the input for ORB-SLAM2. The true motion is a one-dimensional translation combined with pitch and yaw rotation (two rotation axes). ORB-SLAM2 tracks the translation and both rotations OK but not great, BUT it introduces an additional y-axis movement that does not occur in the true motion.
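One thing worth ruling out before attributing the extra y motion to ORB-SLAM2 itself is a residual misalignment between the OptiTrack frame and the ORB-SLAM2 frame, since a small frame misalignment can show up as translation on an axis that never moved. Below is a minimal sketch of rigidly aligning the two trajectories with Eigen's `umeyama` estimator before computing per-axis error; trajectory loading and time synchronization are assumed to be handled elsewhere, and nothing here is specific to ORB-SLAM2.

```cpp
// Sketch only: rigidly align an estimated trajectory to ground truth
// before looking at per-axis errors. Each trajectory is a 3xN matrix of
// time-synchronized positions (placeholder values below).
#include <Eigen/Core>
#include <Eigen/Geometry>
#include <iostream>

int main()
{
    const int N = 4;  // placeholder; real trajectories have many samples
    Eigen::Matrix3Xd est(3, N), gt(3, N);
    est << 0.0, 0.1, 0.2, 0.3,    // x
           0.0, 0.0, 0.1, 0.0,    // y
           0.0, 0.1, 0.0, 0.1;    // z
    gt = est;                     // stand-in for the OptiTrack trajectory

    // 4x4 rigid transform mapping est -> gt. Scale estimation is disabled
    // because stereo SLAM already recovers metric scale.
    Eigen::Matrix4d T = Eigen::umeyama(est, gt, /*with_scaling=*/false);

    // Apply the alignment and inspect the residual on each axis separately.
    Eigen::Matrix3Xd aligned =
        (T.topLeftCorner<3, 3>() * est).colwise() + T.topRightCorner<3, 1>();
    Eigen::Matrix3Xd residual = aligned - gt;
    std::cout << "per-axis RMSE:\n"
              << residual.array().square().rowwise().mean().sqrt() << std::endl;
    return 0;
}
```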

I know this is not a common setup, so if you happen to have come across it, any insight would be great. Otherwise, it might be something for other researchers to be aware of.

Cheers and thanks for the help, Rongen