UZ-SLAMLab / ORB_SLAM3

ORB-SLAM3: An Accurate Open-Source Library for Visual, Visual-Inertial and Multi-Map SLAM
GNU General Public License v3.0

IMU initialization is too late. #28

Closed HwangDaeHyun closed 4 years ago

HwangDaeHyun commented 4 years ago

Hi, thank you for your work.

We are testing with Intel's T-265 sensor and VI sensor.

Overall these sensors seem to work fine with ORB SLAM3.

The screenshots below show the results of the sensor experiments.

Screenshot from 2020-07-28 20-11-47

Screenshot from 2020-07-30 16-48-51

By the way, when we tested with the T-265 sensor, initialization is always late, with the message 'IMU is not initialized. Reset Active Map' (see the screenshot below).

Screenshot from 2020-07-30 16-57-23

We have experimented with both the stereo-inertial and monocular-inertial versions of your algorithm, and the results are the same (late initialization with the T-265).

Why is initialization so late, and how can we get it to complete more quickly?

biendltb commented 4 years ago

I think the issue I posted in #27 is the same as yours. It is not about the IMU initialization process itself; there is probably something wrong with the calibration parameters, or some other cause I haven't figured out yet. This makes the number of matched inliers drop below 50 (refer to this line).

I can see that the number of points used to initialize the map in your case is relatively low. Testing on TUM VI shows me that a new map created in the first frame has 201 points. Since ORB features are matched using the transformation obtained from the IMU, a wrong camera calibration matrix or IMU-camera transformation matrix could lead to this low number of tracked points.

You should re-check your calibration parameters, and please let me know if you find anything else.

zhoumingchaoslam commented 4 years ago

Hi biendltb,

I have the same problem as HwangDaeHyun.

I use the monocular-inertial interface as follows:

#include <iostream>
#include <vector>

#include <ros/ros.h>
#include <cv_bridge/cv_bridge.h>
#include <message_filters/subscriber.h>
#include <message_filters/synchronizer.h>
#include <message_filters/sync_policies/approximate_time.h>
#include <sensor_msgs/Image.h>
#include <sensor_msgs/Imu.h>

#include "System.h"   // ORB_SLAM3 System header (adjust the include path to your build)

using namespace std;

class ImageGrabber
{
public:
    ImageGrabber(ORB_SLAM3::System* pSLAM):mpSLAM(pSLAM){}

    void GrabMonoImu(const sensor_msgs::ImageConstPtr& msg, const sensor_msgs::ImuConstPtr& imu_msg);

    vector<ORB_SLAM3::IMU::Point> vImuMeas;

    ORB_SLAM3::System* mpSLAM;
};

int main(int argc, char **argv)
{
    ros::init(argc, argv, "Mono_inertial");
    ros::start();
    if(argc != 3)
    {
        cerr << endl << "Usage: rosrun ORB_SLAM3 Mono_inertial path_to_vocabulary path_to_settings" << endl;
        ros::shutdown();
        return 1;
    }

    // Create SLAM system. It initializes all system threads and gets ready to process frames.
    ORB_SLAM3::System SLAM(argv[1], argv[2], ORB_SLAM3::System::IMU_MONOCULAR, true);
    ImageGrabber igb(&SLAM);

    ros::NodeHandle nodeHandler;

    message_filters::Subscriber<sensor_msgs::Image> img_sub(nodeHandler, "/cam0/image_raw", 1);
    message_filters::Subscriber<sensor_msgs::Imu> imu_sub(nodeHandler, "/imu0", 1);

    // Approximate-time synchronization: each image is paired with a single IMU message.
    typedef message_filters::sync_policies::ApproximateTime<sensor_msgs::Image, sensor_msgs::Imu> sync_pol;
    message_filters::Synchronizer<sync_pol> sync(sync_pol(5), img_sub, imu_sub);
    sync.registerCallback(boost::bind(&ImageGrabber::GrabMonoImu, &igb, _1, _2));

    ros::spin();

    // Stop all threads
    SLAM.Shutdown();

    // Save camera trajectory
    SLAM.SaveKeyFrameTrajectoryTUM("KeyFrameTrajectory.txt");

    ros::shutdown();

    return 0;
}

void ImageGrabber::GrabMonoImu(const sensor_msgs::ImageConstPtr& msg, const sensor_msgs::ImuConstPtr& imu_msg)
{
    // Copy the ROS image message to cv::Mat.
    cv_bridge::CvImageConstPtr cv_ptr;
    try
    {
        cv_ptr = cv_bridge::toCvShare(msg);
    }
    catch (cv_bridge::Exception& e)
    {
        ROS_ERROR("cv_bridge exception: %s", e.what());
        return;
    }

    // Use double for the timestamp: a float cannot represent ros::Time seconds accurately.
    double t = imu_msg->header.stamp.toSec();
    float dx = imu_msg->linear_acceleration.x;
    float dy = imu_msg->linear_acceleration.y;
    float dz = imu_msg->linear_acceleration.z;
    float rx = imu_msg->angular_velocity.x;
    float ry = imu_msg->angular_velocity.y;
    float rz = imu_msg->angular_velocity.z;
    vImuMeas.push_back(ORB_SLAM3::IMU::Point(dx, dy, dz, rx, ry, rz, t));

    // Note: only the single synchronized IMU sample is passed along with each frame.
    mpSLAM->TrackMonocular(cv_ptr->image, cv_ptr->header.stamp.toSec(), vImuMeas);
    vImuMeas.clear();
}

Hope you can help, thanks.
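For comparison, the stock ROS examples that ship with ORB-SLAM3 (e.g. Examples/ROS/ORB_SLAM3/src/ros_mono_inertial.cc) do not pair one IMU message with one image; they buffer every IMU message and, for each frame, pass all buffered measurements up to that image's timestamp to TrackMonocular. Below is a condensed, untested sketch of that buffering pattern; the class and member names (MonoInertialNode, mImuBuf, etc.) are made up here, and the CLAHE and image-queue handling of the original example are omitted. Whether the single-sample synchronization above contributes to the late initialization is not confirmed in this thread, but feeding the full IMU stream matches what the stock examples do.

#include <mutex>
#include <queue>
#include <vector>

#include <ros/ros.h>
#include <cv_bridge/cv_bridge.h>
#include <sensor_msgs/Image.h>
#include <sensor_msgs/Imu.h>

#include "System.h"   // ORB_SLAM3 System header (adjust the include path to your build)

class MonoInertialNode
{
public:
    explicit MonoInertialNode(ORB_SLAM3::System* pSLAM) : mpSLAM(pSLAM) {}

    // Buffer every IMU message; nothing is dropped between frames.
    void GrabImu(const sensor_msgs::ImuConstPtr& imu_msg)
    {
        std::lock_guard<std::mutex> lock(mBufMutex);
        mImuBuf.push(imu_msg);
    }

    // For each image, drain all IMU measurements up to the image timestamp.
    void GrabImage(const sensor_msgs::ImageConstPtr& img_msg)
    {
        cv_bridge::CvImageConstPtr cv_ptr = cv_bridge::toCvShare(img_msg);
        const double tImage = img_msg->header.stamp.toSec();

        std::vector<ORB_SLAM3::IMU::Point> vImuMeas;
        {
            std::lock_guard<std::mutex> lock(mBufMutex);
            while(!mImuBuf.empty() && mImuBuf.front()->header.stamp.toSec() <= tImage)
            {
                const sensor_msgs::ImuConstPtr& m = mImuBuf.front();
                vImuMeas.push_back(ORB_SLAM3::IMU::Point(
                    m->linear_acceleration.x, m->linear_acceleration.y, m->linear_acceleration.z,
                    m->angular_velocity.x, m->angular_velocity.y, m->angular_velocity.z,
                    m->header.stamp.toSec()));
                mImuBuf.pop();
            }
        }

        mpSLAM->TrackMonocular(cv_ptr->image, tImage, vImuMeas);
    }

private:
    ORB_SLAM3::System* mpSLAM;
    std::queue<sensor_msgs::ImuConstPtr> mImuBuf;
    std::mutex mBufMutex;
};

int main(int argc, char** argv)
{
    ros::init(argc, argv, "Mono_inertial_buffered");
    if(argc != 3)
    {
        std::cerr << "Usage: rosrun ORB_SLAM3 Mono_inertial_buffered path_to_vocabulary path_to_settings" << std::endl;
        return 1;
    }

    ORB_SLAM3::System SLAM(argv[1], argv[2], ORB_SLAM3::System::IMU_MONOCULAR, true);
    MonoInertialNode node(&SLAM);

    ros::NodeHandle nh;
    // Large IMU queue so the 200 Hz stream is not dropped between frames.
    ros::Subscriber imu_sub = nh.subscribe("/imu0", 1000, &MonoInertialNode::GrabImu, &node);
    ros::Subscriber img_sub = nh.subscribe("/cam0/image_raw", 100, &MonoInertialNode::GrabImage, &node);

    ros::spin();

    SLAM.Shutdown();
    return 0;
}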

ccamposm commented 4 years ago

Hi @HwangDaeHyun. From your terminal screenshot, it seems that the map is being initialized with a low number of 3D points (28 and 38 points) and is immediately lost. This might be caused by a bad calibration. Are the intrinsic camera parameters accurately calibrated?

HwangDaeHyun commented 4 years ago

Hello @ccamposm, thanks for your answer.

I fixed this problem.

In my case, the IMU parameters were wrong.

I ran ORB-SLAM3 using the T265's default built-in extrinsic parameters. You can extract these parameters with the command 'rs-enumerate-devices -c'.

Using a motion capture system and the rpg_trajectory_evaluation tool (https://github.com/uzh-rpg/rpg_trajectory_evaluation), we evaluated the VIO algorithms as shown below.

![down_trajectory_side](https://user-images.githubusercontent.com/16742591/90117450-e5958800-dd91-11ea-83e4-5caad0c6ba37.png)

ORB-SLAM3 showed the best RMSE performance.

Thank you.

ccamposm commented 4 years ago

Hi @HwangDaeHyun, we are happy to hear that! Thanks for sharing your results.

PatZhuang commented 4 years ago


I ran into exactly the same issue. @HwangDaeHyun, could you please explain what exactly the "IMU parameters" are? Are you referring to the IMU settings in the yaml file that ORB_SLAM3 loads via "PATH_TO_SETTINGS_FILE", or to the accelerometer and gyroscope calibration of the IMU itself? And how could I resolve this issue?

zdegrh commented 4 years ago


I used all the IMU parameters from "rs-enumerate-devices -c", but I get a wrong tracking result.

This is my configuration:

# Transformation from body-frame (imu) to camera
Tbc: !!opencv-matrix
   rows: 4
   cols: 4
   dt: f
   data: [-0.999972, 0.00193326, 0.00728582, 0.0107000041753054,
          -0.00192006, -0.999996, 0.001819, 1.2732925824821e-11,
           0.00728931, 0.00180496, 0.999972, -2.91038304567337e-11,
           0.0, 0.0, 0.0, 1.0]

# IMU noise (use those from VINS-Mono)
IMU.NoiseGyro: 0.000005148030141 # rad/s^0.5
IMU.NoiseAcc: 0.000066952452471 # m/s^1.5
IMU.GyroWalk: 0.000000499999999 # rad/s^1.5
IMU.AccWalk: 0.000099999997474 # m/s^2.5
IMU.Frequency: 200

Are these parameters correct? What are yours? Thanks!

HwangDaeHyun commented 4 years ago

I think you should take the square root of the IMU values you used.

The data you used are variances, which is why the values look very small.

When you run 'rs-enumerate-devices -c', the outputs are the variances of the noise and bias.
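Following that suggestion, the values to put in the yaml would be the square roots of the reported variances (per the comment above, the yaml expects standard deviations). A minimal sketch, using the four variances posted earlier in this thread as example inputs; substitute whatever your own device reports:

#include <cmath>
#include <cstdio>

int main()
{
    // Variances as posted above (replace with the values your sensor reports).
    const double gyro_noise_var = 0.000005148030141;
    const double acc_noise_var  = 0.000066952452471;
    const double gyro_walk_var  = 0.000000499999999;
    const double acc_walk_var   = 0.000099999997474;

    // Take the square root of each variance to get the standard deviation.
    std::printf("IMU.NoiseGyro: %.9f\n", std::sqrt(gyro_noise_var));
    std::printf("IMU.NoiseAcc:  %.9f\n", std::sqrt(acc_noise_var));
    std::printf("IMU.GyroWalk:  %.9f\n", std::sqrt(gyro_walk_var));
    std::printf("IMU.AccWalk:   %.9f\n", std::sqrt(acc_walk_var));
    return 0;
}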

zdegrh commented 4 years ago


Thanks, I have just tried that, but the result is still wrong. Could you please paste your yaml file?

HwangDaeHyun commented 4 years ago

https://gist.github.com/HwangDaeHyun/47bf45c880c3e2fbff1ccfafe7d41f77

Here. I don't know whether these parameters are right for your sensor, but in my case it works well.

If the result is still wrong, you could try multiplying your IMU noise parameters by 10 or more.

zdegrh commented 4 years ago

I checked your yaml file against mine and they are almost the same. Anyway, thanks!


Stereo-Inertial sometimes works, but it is not stable: sometimes the track drifts, sometimes it crashes with an error. Monocular-Inertial never works.

seungsunkim commented 3 years ago

@HwangDaeHyun Hello. I have a question about a value in the yaml file. I'm using a RealSense D455 with ORB-SLAM3's Monocular-Inertial mode, and I don't understand the meaning of the Tbc entry in the yaml file. My understanding is that it is the transformation from the camera (sensor) to the IMU; is that right? Also, when I run 'rs-enumerate-devices.exe -c' to obtain the value, it produces a 3x3 matrix, while the yaml requires a 4x4 matrix. The sizes don't match, so what should I do? I've attached a screenshot of the result from running the command.


I've also attached a picture showing the Tbc value in my yaml file.

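On the 3x3 vs 4x4 question: the Tbc entry is a 4x4 homogeneous transform, so the 3x3 rotation matrix and the translation vector reported by 'rs-enumerate-devices -c' have to be assembled into [R t; 0 0 0 1]. A minimal sketch using Eigen; the R and t values below are placeholders, and whether the reported transform needs to be inverted depends on which direction the tool reports versus the convention your yaml's Tbc comment states:

#include <iostream>
#include <Eigen/Dense>

int main()
{
    // Placeholder 3x3 rotation and 3x1 translation (substitute the extrinsics
    // reported by rs-enumerate-devices -c for your own device).
    Eigen::Matrix3d R;
    R << 1.0, 0.0, 0.0,
         0.0, 1.0, 0.0,
         0.0, 0.0, 1.0;
    Eigen::Vector3d t(0.0107, 0.0, 0.0);

    // Compose the 4x4 homogeneous transform expected by the Tbc entry.
    Eigen::Matrix4d T = Eigen::Matrix4d::Identity();
    T.topLeftCorner<3,3>() = R;
    T.topRightCorner<3,1>() = t;

    // If the reported transform goes the opposite way to what the yaml
    // expects, use the inverse instead: T = T.inverse();
    std::cout << T << std::endl;
    return 0;
}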

timeoptimal commented 5 months ago

Can you please tell me how to get the count of the green map points in ROS? I need this to check the quality of localization.
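One possible way, assuming the public ORB_SLAM3::System API: GetTrackedMapPoints() returns the map points associated with the current frame's keypoints, and the non-null entries roughly correspond to the green points drawn in the frame viewer. A small untested sketch that could be called from a ROS image callback like the ones earlier in this thread:

#include <vector>
#include "System.h"   // ORB_SLAM3 System header (adjust the include path to your build)

// Count map points associated with the current frame
// (roughly the green points shown in the frame viewer).
int CountTrackedMapPoints(ORB_SLAM3::System* pSLAM)
{
    std::vector<ORB_SLAM3::MapPoint*> vpMPs = pSLAM->GetTrackedMapPoints();
    int nTracked = 0;
    for (ORB_SLAM3::MapPoint* pMP : vpMPs)
        if (pMP != nullptr)   // null entries are keypoints without an associated map point
            nTracked++;
    return nTracked;
}

// Example usage inside the image callback, after TrackMonocular() returns:
//   ROS_INFO("Tracked map points: %d", CountTrackedMapPoints(mpSLAM));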