raulmur / ORB_SLAM2

Real-Time SLAM for Monocular, Stereo and RGB-D Cameras, with Loop Detection and Relocalization Capabilities

How to run RGB-D example in SLAM mode? #524

Open · salocinx opened this issue 6 years ago

salocinx commented 6 years ago

I successfully compiled ORB SLAM 2 for Windows using Visual Studio 2015 vc14 C/C++ compilers. The examples work fine too.

When running the mono_webcam example, I am able to build my own map/point cloud and track my current position (SLAM mode). However, when running the rgbd_tum example, it only writes the two files CameraTrajectory.txt and KeyFrameTrajectory.txt; I am not able to do live SLAM as with the mono_webcam example.

I would like to run the RGB-D example (with a Kinect or Orbbec Astra sensor) in live SLAM mode (like mono_webcam). Is this possible with the examples included in this repository?

salocinx commented 6 years ago

Problem solved.

I wrote a comprehensive guide on how to compile ORB SLAM 2 with OpenCV 3.4.0 including OpenNI2, in order to use my Orbbec Astra camera as a depth sensor. Hopefully someone else stumbling over this thread can use it.
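
In case it helps others with setup: a quick way to check whether your OpenCV build actually includes the OpenNI2 backend (which the capture code below relies on) is to print the build information. A minimal sketch:

#include <iostream>
#include <opencv2/core.hpp>

int main()
{
    // Prints OpenCV's compile-time configuration; the Video I/O section
    // should list "OpenNI2: YES" if the backend was compiled in.
    std::cout << cv::getBuildInformation() << std::endl;
    return 0;
}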

leviskim17 commented 6 years ago

@salocinx Hi, did you succeed in building your own map and point cloud? If so, can you share how you did it? I cannot find the commands for the RGB-D camera.

salocinx commented 6 years ago

@leviskim17 Hi, yes it works. Here's the code for the RGB-D camera. I just altered the code of mono_euroc.cc to avoid setting up a new project within the ORB_SLAM2 solution.

#include <iostream>
#include <algorithm>
#include <fstream>
#include <sstream>
#include <chrono>
#include <time.h>

#include <opencv2/opencv.hpp>            // C++ API (VideoCapture, imshow, imwrite)
#include <opencv2/videoio/videoio_c.h>   // legacy CV_CAP_* constants used below

#include <System.h>

using namespace std;

// From http://www.euclideanspace.com/maths/geometry/rotations/conversions/matrixToQuaternion/

int main(int argc, char **argv)
{
    string vocabPath = "../ORBvoc.txt";
    string settingsPath = "../webcam.yaml";
    if (argc == 1) {
        // keep the default paths above
    } else if (argc == 2) {
        vocabPath = argv[1];
    } else if (argc == 3) {
        vocabPath = argv[1];
        settingsPath = argv[2];
    } else {
        cerr << endl << "Usage: mono_euroc.exe path_to_vocabulary path_to_settings" << endl;
        return 1;
    }

    // Create SLAM system. It initializes all system threads and gets ready to process frames.
    ORB_SLAM2::System SLAM(vocabPath, settingsPath, ORB_SLAM2::System::RGBD, true);

    cout << endl << "Start processing ..." << endl;

    cv::VideoCapture cap(CV_CAP_OPENNI2);
    cap.set(CV_CAP_OPENNI_IMAGE_GENERATOR_OUTPUT_MODE, CV_CAP_OPENNI_VGA_30HZ);
    cap.set(CV_CAP_OPENNI_DEPTH_GENERATOR_REGISTRATION, 1);

    cv::Mat image, depth, tcw;      // grayscale image, point-cloud map, pose output
    while (true) {

        if (cap.grab()) {

            // CV_CAP_OPENNI_BGR_IMAGE -> CV_8UC3
            // CV_CAP_OPENNI_GRAY_IMAGE -> CV_8UC1 [x]

            // CV_CAP_OPENNI_DEPTH_MAP - depth values in mm(CV_16UC1)
            // CV_CAP_OPENNI_POINT_CLOUD_MAP - XYZ in meters(CV_32FC3) [x]
            // CV_CAP_OPENNI_DISPARITY_MAP - disparity in pixels(CV_8UC1)
            // CV_CAP_OPENNI_DISPARITY_MAP_32F - disparity in pixels(CV_32FC1) [x]
            // CV_CAP_OPENNI_VALID_DEPTH_MASK - mask of valid pixels(not ocluded, not shaded etc.) (CV_8UC1)

            cap.retrieve(image, CV_CAP_OPENNI_GRAY_IMAGE);
            cap.retrieve(depth, CV_CAP_OPENNI_POINT_CLOUD_MAP);

        } else {
            cout << "ERROR: Could not grab image data." << endl;
            break;
        }

        if (!image.data) {
            cout << "ERROR: RGB not retrieved." << endl;
        }

        if (!depth.data) {
            cout << "ERROR: Depth map not retrieved." << endl;
        }

        if (!image.data || !depth.data) {
            return -1;
        }

        // Milliseconds since epoch, from http://stackoverflow.com/questions/19555121/how-to-get-current-timestamp-in-milliseconds-since-1970-just-the-way-java-gets
        __int64 curNow = std::chrono::duration_cast<std::chrono::milliseconds>(std::chrono::system_clock::now().time_since_epoch()).count();

        // Pass the images to the SLAM system; it returns the camera pose (empty if tracking fails)
        tcw = SLAM.TrackRGBD(image, depth, curNow / 1000.0);

        // This can write each image with its position to a file if you want
        /*if (!tcw.empty()) {
            cv::Mat Rwc = tcw.rowRange(0, 3).colRange(0, 3).t();
            cv::Mat twc = -Rwc*tcw.rowRange(0, 3).col(3);
            std::ostringstream stream;
            //stream << "imgs/" << Rwc.at<float>(0, 0) << " " << Rwc.at<float>(0, 1) << " " << Rwc.at<float>(0, 2) << " " << twc.at<float>(0) << " " <<
            //  Rwc.at<float>(1, 0) << " " << Rwc.at<float>(1, 1) << " " << Rwc.at<float>(1, 2) << " " << twc.at<float>(1) << " " <<
            //  Rwc.at<float>(2, 0) << " " << Rwc.at<float>(2, 1) << " " << Rwc.at<float>(2, 2) << " " << twc.at<float>(2) << ".jpg";
            stream << "imgs/" << curNow << ".jpg";
            string fileName = stream.str();
            cv::imwrite(fileName, image);
        }*/

        // These show the camera images in two extra windows; focus one of them and press any key to quit
        cv::imshow("Image", image);
        cv::imshow("Depth", depth);

        if (cv::waitKey(1) >= 0)
            break;
    }

    // Stop all threads
    SLAM.Shutdown();
    cap.release();

    // Save keyframe trajectory
    SLAM.SaveKeyFrameTrajectoryTUM("KeyFrameTrajectory.txt");

    return 0;
}
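
As a side note, KeyFrameTrajectory.txt is written in the TUM format, one "timestamp tx ty tz qx qy qz qw" line per keyframe. If you want to post-process the poses, a minimal sketch for reading it back:

#include <fstream>
#include <iostream>

int main()
{
    // Each line: timestamp, translation (tx ty tz), unit quaternion (qx qy qz qw)
    std::ifstream file("KeyFrameTrajectory.txt");
    double t, tx, ty, tz, qx, qy, qz, qw;
    while (file >> t >> tx >> ty >> tz >> qx >> qy >> qz >> qw)
        std::cout << "t=" << t << " pos=(" << tx << ", " << ty << ", " << tz << ")" << std::endl;
    return 0;
}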

leviskim17 commented 6 years ago

@salocinx Thank you so much. Today, I will try it.

zeryabmoussaoui commented 6 years ago

You should try:

capture.retrieve( depthMat, CV_CAP_OPENNI_DEPTH_MAP );  
capture.retrieve( rgbMat, CV_CAP_OPENNI_BGR_IMAGE );

It works fine for me.
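
For reference, a minimal self-contained sketch of that grab/retrieve pattern, using the modern cv::CAP_* names (equivalent to the CV_CAP_* macros above). CV_CAP_OPENNI_DEPTH_MAP yields a CV_16UC1 map with depth in millimetres, which matches DepthMapFactor: 1000.0 in the settings file, and CV_CAP_OPENNI_BGR_IMAGE yields the CV_8UC3 color image:

#include <opencv2/opencv.hpp>

int main()
{
    // Open the OpenNI2 device (Astra/Kinect) through OpenCV's videoio backend
    cv::VideoCapture capture(cv::CAP_OPENNI2);
    if (!capture.isOpened())
        return -1;

    cv::Mat rgbMat, depthMat;
    while (capture.grab()) {
        capture.retrieve(depthMat, cv::CAP_OPENNI_DEPTH_MAP); // CV_16UC1, depth in mm
        capture.retrieve(rgbMat, cv::CAP_OPENNI_BGR_IMAGE);   // CV_8UC3 color image
        // ...this is where you would call SLAM.TrackRGBD(rgbMat, depthMat, timestamp)...
        cv::imshow("rgb", rgbMat);
        if (cv::waitKey(1) >= 0)
            break;
    }
    return 0;
}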

UannaFF commented 5 years ago

Hello, I have a question about the YAML file; I don't know how to set the parameters for the Orbbec. This is what I have so far.

%YAML:1.0

#--------------------------------------------------------------------------------------------
# Camera Parameters. Adjust them!
#--------------------------------------------------------------------------------------------

# Camera calibration and distortion parameters (OpenCV)
Camera.fx: 503.642
Camera.fy: 512.275
Camera.cx: 308.189
Camera.cy: 233.283

Camera.k1: 0.0455
Camera.k2: -0.2733
Camera.p1: -0.00735
Camera.p2: -0.0069
Camera.k3: 0.2504

Camera.width: 640
Camera.height: 480

# Camera frames per second
Camera.fps: 30

# IR projector baseline times fx (approx.)
Camera.bf: 41.4

# Color order of the images (0: BGR, 1: RGB. It is ignored if images are grayscale)
Camera.RGB: 0

# Close/Far threshold. Baseline times.
ThDepth: 50.0

# Depth map values factor
DepthMapFactor: 1000.0

#--------------------------------------------------------------------------------------------
# ORB Parameters
#--------------------------------------------------------------------------------------------

# ORB Extractor: Number of features per image
ORBextractor.nFeatures: 1000

# ORB Extractor: Scale factor between levels in the scale pyramid
ORBextractor.scaleFactor: 1.2

# ORB Extractor: Number of levels in the scale pyramid
ORBextractor.nLevels: 8

I put in the calibration values of the RGB camera, and for Camera.bf I took the focal length of the IR camera and multiplied it by 0.75 (I'm not sure of the baseline of the Orbbec; I searched on Google and didn't find it).

salocinx commented 5 years ago

As I understand it, the IR projector baseline is the distance between the IR projector and the IR camera of your Astra sensor, measured in meters. So, for example, 5 cm would be 0.05. You then have to multiply this value by "fx" to get "Camera.bf". An example: baseline = 5 cm = 0.05; Camera.fx = 575.0; Camera.bf = 0.05 * 575.0 = 28.75.
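
A minimal sketch of that arithmetic (the 0.05 m baseline and fx = 575.0 are the illustrative values from above, not measured Astra parameters):

#include <iostream>

int main()
{
    const double baseline = 0.05; // IR projector to IR camera distance in meters (assumed)
    const double fx = 575.0;      // focal length in pixels, from your calibration
    std::cout << "Camera.bf: " << baseline * fx << std::endl; // prints 28.75
    return 0;
}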

I used different methods to get fx, fy, cx, cy of the RGB camera built into the Astra sensor. The most useful were the following (a calibration sketch follows the list):

A.) for monocular cameras, such as the RGB camera in the Astra sensor

B.) for stereo cameras, the ROS Stereo Calibration package (installed on an Ubuntu 16.04 machine; ROS Kinetic)
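
One common way to obtain fx, fy, cx, cy and the distortion coefficients for a monocular camera is OpenCV's chessboard calibration. A minimal sketch, with placeholder image paths and an assumed 9x6 board with 25 mm squares:

#include <opencv2/opencv.hpp>
#include <iostream>
#include <string>
#include <vector>

int main()
{
    const cv::Size board(9, 6);  // inner corners of the chessboard (assumed)
    const float square = 0.025f; // square size in meters (assumed)

    // One set of 3D board points, reused for every detected view
    std::vector<cv::Point3f> obj;
    for (int y = 0; y < board.height; ++y)
        for (int x = 0; x < board.width; ++x)
            obj.push_back(cv::Point3f(x * square, y * square, 0.0f));

    std::vector<std::vector<cv::Point3f>> objPoints;
    std::vector<std::vector<cv::Point2f>> imgPoints;
    cv::Size imageSize;
    for (int i = 0; i < 20; ++i) {  // calib0.png ... calib19.png are placeholder paths
        cv::Mat img = cv::imread("calib" + std::to_string(i) + ".png", cv::IMREAD_GRAYSCALE);
        if (img.empty()) continue;
        imageSize = img.size();
        std::vector<cv::Point2f> corners;
        if (cv::findChessboardCorners(img, board, corners)) {
            imgPoints.push_back(corners);
            objPoints.push_back(obj);
        }
    }

    // K contains fx, fy, cx, cy; dist contains k1 k2 p1 p2 k3 for the YAML above
    cv::Mat K, dist;
    std::vector<cv::Mat> rvecs, tvecs;
    cv::calibrateCamera(objPoints, imgPoints, imageSize, K, dist, rvecs, tvecs);
    std::cout << "K = " << K << std::endl << "dist = " << dist << std::endl;
    return 0;
}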

Wish you the best!

haochengd commented 5 years ago

Hi salocinx! Thanks for sharing the document. I am currently testing ORB SLAM 2 with the Astra Pro camera. I successfully compiled the code and ran the demo, and also successfully ran the Astra SDK samples. But when I tried the live mode, it could not detect the Astra camera. I have copied all the DLLs and LIBs to the folder you mentioned in the document. Do you know how to solve this? Thanks in advance!

salocinx commented 5 years ago

No, unfortunately I can't give you more advice. After I wrote down all the steps, I moved on to a stereo camera instead of the Astra. Later, when I tried to run the Astra again, it didn't work for me either, and I didn't look deeper into the problem. I am sorry for not having better news. I hope it works for you soon!

Mechazo11 commented 2 weeks ago

Hi @salocinx, I am also trying to get RGB-D SLAM mode working with the ORB SLAM 3 framework and was interested in reading your tutorial, but the link appears to be broken.

Could you take a look into it?

With best, @Mechazo11