aau-cns / mars_lib

MaRS: A Modular and Robust Sensor-Fusion Framework

Pose and IMU fusion #3

Closed antithing closed 1 year ago

antithing commented 1 year ago

Hi, and thank you for making this code available.

I need to add IMU data into existing pose data (xyz / xyzw) from a SLAM system.

The goal is to smooth and predict data when tracking is lost.

Is this code suitable for that purpose? Is there an example that might help me get started?

Thanks!

Chris-Bee commented 1 year ago

Hello @antithing,

Yes, the MaRS framework is well suited to this purpose.

You can find a ROS node here: https://github.com/aau-cns/mars_ros. The "pose_node" described there will be the most suitable for this.

If you do not want to use ROS, you can use the MaRS cpp library directly, and feed the information via CSV files. A reader for pose data can be found here: https://github.com/aau-cns/mars_lib/blob/main/source/mars/include/mars/data_utils/read_pose_data.h

An example application that reads and processes the data (including pose) can be found here: https://github.com/aau-cns/mars_lib/tree/chb/insane_dataset/source/examples/mars_insane_dataset

A tutorial on how to modify code for different sensor modalities can be found here: https://github.com/aau-cns/mars_lib#stand-alone-usage-and-middleware-integration

Should you need more information on the theory and the idea behind the modularity, here is the publication for reference: https://www.aau.at/wp-content/uploads/2020/11/MaRS_A_Modular_and_Robust_Sensor-Fusion_Framework.pdf

I hope this helps.

antithing commented 1 year ago

Thank you! That is very helpful.

Would it be possible to share the data files used? E.g.:

imu_file_name: "px4_imu.csv"
pose1_file_name: "pose1.csv"

So that I can reverse-engineer the format?

Chris-Bee commented 1 year ago

Of course, but you don't have to reverse-engineer the format. The format of all the sensor CSV files is described in the README here: https://github.com/aau-cns/mars_lib#csv-file-formats-for-sensor-data
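For quick reference, the sensor files are plain CSV with a header row and one measurement per line, roughly the layout sketched below; please treat the README linked above as the authoritative definition of the column order:

    t, p_x, p_y, p_z, q_w, q_x, q_y, q_z      (pose sensor)
    t, a_x, a_y, a_z, w_x, w_y, w_z           (IMU)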

There are test CSV files for the end-to-end tests of the framework here: https://github.com/aau-cns/mars_lib/tree/development/source/tests

And if you need real-world data, I suggest that you have a look at the INSANE dataset, which we published recently. It has indoor and outdoor multi-sensor data, including image data, which might be interesting for the case you described. The CSV files are compatible with MaRS.
Data: https://www.aau.at/en/smart-systems-technologies/control-of-networked-systems/datasets/insane-dataset/
Publication: https://arxiv.org/abs/2210.09114

antithing commented 1 year ago

Thank you! I will dig into this and attempt to get my data in.

Chris-Bee commented 1 year ago

Sounds good, feel free to let us know how it went.

antithing commented 1 year ago

Hello again @Chris-Bee! I am now looking at integration with a real-time sensor (the Stereolabs ZED 2 camera). For initial testing, I am calculating the pose of the camera relative to an ArUco marker and using that as input, along with the IMU data. I am a little stuck with the data input; would you be able to take a look at what I am doing?

I am setting up state estimation like so:


    std::vector<double> imu_n_w;
    std::vector<double> imu_n_bw;
    std::vector<double> imu_n_a;
    std::vector<double> imu_n_ba;

    // setup propagation sensor
    std::shared_ptr<mars::ImuSensorClass> imu_sensor_sptr = std::make_shared<mars::ImuSensorClass>("IMU");

    // setup the core definition
    std::shared_ptr<mars::CoreState> core_states_sptr = std::make_shared<mars::CoreState>();
    core_states_sptr.get()->set_propagation_sensor(imu_sensor_sptr);
    core_states_sptr.get()->set_noise_std(Eigen::Vector3d(imu_n_w.data()), Eigen::Vector3d(imu_n_bw.data()),
        Eigen::Vector3d(imu_n_a.data()), Eigen::Vector3d(imu_n_ba.data()));

    // setup additional sensors
    // Pose sensor
    std::shared_ptr<mars::PoseSensorClass> pose_sensor_sptr =
        std::make_shared<mars::PoseSensorClass>("Pose", core_states_sptr);
    pose_sensor_sptr->const_ref_to_nav_ =
        true;  // TODO is set here for now but will be managed by core logic in later versions

    // Define measurement noise
    Eigen::Matrix<double, 6, 1> pose_meas_std;
    pose_meas_std << 0.02, 0.02, 0.02, 2 * (M_PI / 180), 2 * (M_PI / 180), 2 * (M_PI / 180);
    pose_sensor_sptr->R_ = pose_meas_std.cwiseProduct(pose_meas_std);

    // Define initial calibration and covariance
    mars::PoseSensorData pose_init_cal;
    pose_init_cal.state_.p_ip_ = Eigen::Vector3d::Zero();
    pose_init_cal.state_.q_ip_ = Eigen::Quaterniond::Identity();

    // The covariance should enclose the initialization with a 3 Sigma bound
    Eigen::Matrix<double, 6, 1> std;
    std << 0.1, 0.1, 0.1, (10 * M_PI / 180), (10 * M_PI / 180), (10 * M_PI / 180);
    pose_init_cal.sensor_cov_ = std.cwiseProduct(std).asDiagonal();

    pose_sensor_sptr->set_initial_calib(std::make_shared<mars::PoseSensorData>(pose_init_cal));

    // create the CoreLogic and link the core states
    mars::CoreLogic core_logic(core_states_sptr);

I then have two threads, one for IMU and one for video->pose.

The IMU thread is:


 std::thread sensThread([&]() {
        while (!exiting) {
            // Using sl::TIME_REFERENCE::CURRENT decouples this from Camera.grab() so we get full 400Hz sampling rate
            if (zed.getSensorsData(sensors_data, sl::TIME_REFERENCE::CURRENT) == sl::ERROR_CODE::SUCCESS) {
                constexpr double D2R = M_PI / 180.0; // Zed 2 uses degrees, convert to radians
                double timestamp = (double)sensors_data.imu.timestamp.getNanoseconds() / 1e9;
                if (timestamp != previousImuTimestamp) {
                    auto angular_velocity = sensors_data.imu.angular_velocity;
                    auto linear_acceleration = sensors_data.imu.linear_acceleration;
                    gyro = Eigen::Vector3d(angular_velocity.x * D2R, angular_velocity.y * D2R, angular_velocity.z * D2R);
                    accel = Eigen::Vector3d(linear_acceleration.x, linear_acceleration.y, linear_acceleration.z);

                    //grab and store
                    GyroscopeDatum g;
                    g.t = timestamp;
                    g.v = gyro;
                    AccelerometerDatum a;
                    a.t = timestamp;
                    a.v = accel;

                        if (!bMarsInit)
                        {
                            Eigen::Vector3d p_wi_init(0, 0, 5);
                            Eigen::Quaterniond q_wi_init = Eigen::Quaterniond::Identity();
                            core_logic.Initialize(p_wi_init, q_wi_init);
                        }

                        core_logic.ProcessMeasurement(imu_sensor_sptr, timestamp, k.data_);

                        mars::BufferEntryType latest_result;
                        core_logic.buffer_.get_latest_state(&latest_result);
                        mars::CoreStateType last_state = static_cast<mars::CoreType*>(latest_result.data_.core_.get())->state_;
                    //    ofile_core << last_state.to_csv_string(latest_result.timestamp_.get_seconds()) << std::endl;

                    previousImuTimestamp = timestamp;
                }

            }
        }
        });

and the video->pose is:

std::thread videoThread([&]() {
        while (!exiting) {
            cv::Mat draw;
            if (zed.grab() == sl::ERROR_CODE::SUCCESS) {
                double timestamp = (double)zed.getTimestamp(sl::TIME_REFERENCE::IMAGE).getNanoseconds() / 1e9;
                zed.retrieveImage(image_zed, sl::VIEW::LEFT_GRAY, sl::MEM::CPU);
                zed.retrieveImage(image_zedR, sl::VIEW::RIGHT_GRAY, sl::MEM::CPU);
                zed.retrieveImage(image_zedD, sl::VIEW::DEPTH, sl::MEM::CPU);

                //get depth for the user point clicking

                //  int64 t0 = cv::getTickCount();

                CameraDatum c;
                c.t = timestamp;
                c.images.push_back(image_ocv.clone());
                c.images.push_back(image_ocvR.clone());
                c.depth = image_ocvD.clone();

                std::chrono::milliseconds ms = std::chrono::duration_cast<std::chrono::milliseconds>(
                    std::chrono::system_clock::now().time_since_epoch());
                c.ms = ms;

                cv::cvtColor(image_ocv, draw, cv::COLOR_GRAY2RGB);

                cv::aruco::detectMarkers(draw, dictionary, markerCorners, markerIds, parameters, rejectedCandidates);             

                std::vector<cv::Vec3d> rvecs, tvecs;
                cv::aruco::estimatePoseSingleMarkers(markerCorners, 0.05, K, D, rvecs, tvecs);

                if (rvecs.size() == 1)
                {
                    cv::Mat R;
                    cv::Rodrigues(rvecs[0], R); 

                    R = R.t();  // calculate camera R matrix

                    cv::Mat camRvec;
                    Rodrigues(R, camRvec); // calculate camera rvec

                    cv::Mat camTvec = -R * tvecs[0]; // calculate camera translation vector

                    core_logic.ProcessMeasurement(pose_sensor_sptr, timestamp, k.data_);

                    // Repropagation after an out of order update can cause the latest state to be different from the current update
                    // sensor. Using get_latest_sensor_handle_state is the safest option.
                    mars::BufferEntryType latest_result;
                    core_logic.buffer_.get_latest_sensor_handle_state(pose_sensor_sptr, &latest_result);
                    mars::PoseSensorStateType last_state = pose_sensor_sptr->get_state(latest_result.data_.sensor_);
                    //  ofile_pose << last_state.to_csv_string(latest_result.timestamp_.get_seconds()) << std::endl;
                }

                             bNewFrame = true;

            }

        }

        });

For both of the core_logic.ProcessMeasurement(pose_sensor_sptr, timestamp, k.data_); lines, what do I need to convert my data into in order to pass it to the state estimator?

Also, for the IMU noise and bias parameters, what exactly do I pass here:

  std::vector<double> imu_n_w;
  std::vector<double> imu_n_bw;
  std::vector<double> imu_n_a;
  std::vector<double> imu_n_ba;

I have access to:


Accelerometer

linear_acceleration: Acceleration force applied on all three physical axes (x, y, and z), including the force of gravity. Values are corrected for bias, scale and misalignment. [m/s^2]
linear_acceleration_uncalibrated: Acceleration force applied on all three physical axes (x, y, and z), including the force of gravity. Values are uncalibrated. [m/s^2]
linear_acceleration_covariance: Measurement noise of the uncalibrated linear acceleration of the accelerometer. Provided as a 3x3 covariance matrix.

Gyroscope

angular_velocity: Rate of rotation around each of the three physical axes (x, y, and z). Values are corrected for bias, scale and misalignment. [deg/s]
angular_velocity_uncalibrated: Rate of rotation around each of the three physical axes (x, y, and z). Values are uncalibrated. [deg/s]
angular_velocity_covariance: Measurement noise of the uncalibrated angular velocity of the gyroscope. Provided as a 3x3 covariance matrix.

Orientation

pose_covariance: Measurement noise of the pose orientation. Provided as a 3x3 covariance matrix.
camera_imu_transform: Transform between the IMU and left camera frames.
imu_pose: Fused orientation.

Thank you!

Chris-Bee commented 1 year ago

@antithing The ProcessMeasurement line that you cited, core_logic.ProcessMeasurement(pose_sensor_sptr, timestamp, k.data_);, receives a BufferDataType:

bool CoreLogic::ProcessMeasurement(std::shared_ptr<SensorAbsClass> sensor, const Time& timestamp,
                                   const BufferDataType& data)

Thus the data that you need to pass is a constructed BufferDataType containing your measurement data, together with the associated sensor pointer and the timestamp. You can find an example of how to build a BufferDataType for a pose sensor here.
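A minimal sketch of what this looks like for the pose sensor and the IMU (position/orientation and lin_acc/ang_vel are placeholders for your own measurement variables):

    // Pose update: measured position and orientation at this timestamp
    Eigen::Vector3d position(0.0, 0.0, 0.0);
    Eigen::Quaterniond orientation = Eigen::Quaterniond::Identity();

    mars::BufferDataType pose_data;
    pose_data.set_sensor_data(std::make_shared<mars::PoseMeasurementType>(position, orientation));
    core_logic.ProcessMeasurement(pose_sensor_sptr, timestamp, pose_data);

    // IMU propagation: linear acceleration [m/s^2] and angular velocity [rad/s]
    mars::BufferDataType imu_data;
    imu_data.set_sensor_data(std::make_shared<mars::IMUMeasurementType>(lin_acc, ang_vel));
    core_logic.ProcessMeasurement(imu_sensor_sptr, timestamp, imu_data);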

The IMU noise information is passed as continuous-time noise; the discretization based on the IMU timestamps is done by MaRS.

Let me know if this helps.

antithing commented 1 year ago

Hi @Chris-Bee, sorry to open this again. I have the code running, but I am seeing strange results.

My setup is:


//mars lib stuff:
    std::vector<double> imu_n_w{ 0.013, 0.013, 0.013 };
    std::vector<double> imu_n_bw{ 0.0013, 0.0013, 0.0013 };
    std::vector<double> imu_n_a{ 0.083, 0.083, 0.083 };
    std::vector<double> imu_n_ba{ 0.0083, 0.0083, 0.0083 };

    // setup propagation sensor
    std::shared_ptr<mars::ImuSensorClass> imu_sensor_sptr = std::make_shared<mars::ImuSensorClass>("IMU");

    // setup the core definition
    std::shared_ptr<mars::CoreState> core_states_sptr = std::make_shared<mars::CoreState>();
    core_states_sptr.get()->set_propagation_sensor(imu_sensor_sptr);
    core_states_sptr.get()->set_noise_std(Eigen::Vector3d(imu_n_w.data()), Eigen::Vector3d(imu_n_bw.data()),
        Eigen::Vector3d(imu_n_a.data()), Eigen::Vector3d(imu_n_ba.data()));

    // setup additional sensors
    // Pose sensor
    std::shared_ptr<mars::PoseSensorClass> pose_sensor_sptr = std::make_shared<mars::PoseSensorClass>("Pose", core_states_sptr);
    pose_sensor_sptr->const_ref_to_nav_ = true;  // TODO is set here for now but will be managed by core logic in later versions

    // Define measurement noise
    Eigen::Matrix<double, 6, 1> pose_meas_std;
    pose_meas_std << 0.02, 0.02, 0.02, 2 * (M_PI / 180), 2 * (M_PI / 180), 2 * (M_PI / 180);
    pose_sensor_sptr->R_ = pose_meas_std.cwiseProduct(pose_meas_std);

    // Define initial calibration and covariance
    mars::PoseSensorData pose_init_cal;
    pose_init_cal.state_.p_ip_ = Eigen::Vector3d::Zero();
    pose_init_cal.state_.q_ip_ = Eigen::Quaterniond::Identity();

    // The covariance should enclose the initialization with a 3 Sigma bound
    Eigen::Matrix<double, 6, 1> std;
    std << 0.1, 0.1, 0.1, (10 * M_PI / 180), (10 * M_PI / 180), (10 * M_PI / 180);
    pose_init_cal.sensor_cov_ = std.cwiseProduct(std).asDiagonal();

    pose_sensor_sptr->set_initial_calib(std::make_shared<mars::PoseSensorData>(pose_init_cal));

    // create the CoreLogic and link the core states
    mars::CoreLogic core_logic(core_states_sptr);

Then I am running a live camera and grabbing the camera pose from an ArUco marker PnP solve.

I also grab the IMU data and run both through the filter.

 sl::SensorsData sensors_data;
    SensorsData::IMUData imu_data;

    mars::BufferDataType dataPose;
    bool frame = false;
    bool dataGot = false;
    double previousImuTimestamp = -1;
    // double previousVideoTimestamp = -1;
    double previousPoseTimestamp = -1;

    std::thread sensThread([&]() {
        while (!exiting) {
            // Using sl::TIME_REFERENCE::CURRENT decouples this from Camera.grab() so we get full 400Hz sampling rate
            if (zed.getSensorsData(sensors_data, sl::TIME_REFERENCE::CURRENT) == sl::ERROR_CODE::SUCCESS) {
                constexpr double D2R = M_PI / 180.0; // Zed 2 uses degrees, convert to radians
                double timestamp = (double)sensors_data.imu.timestamp.getNanoseconds() / 1e9;
                if (timestamp != previousImuTimestamp) {
                    auto angular_velocity = sensors_data.imu.angular_velocity;
                    auto linear_acceleration = sensors_data.imu.linear_acceleration;
                    gyro = Eigen::Vector3d(angular_velocity.x * D2R, angular_velocity.y * D2R, angular_velocity.z * D2R);
                    accel = Eigen::Vector3d(linear_acceleration.x, linear_acceleration.y, linear_acceleration.z);

                    imu_data = sensors_data.imu;
                    sl::Transform imuPose = imu_data.pose;

                    //grab and store
                    GyroscopeDatum g;
                    g.t = timestamp;
                    g.v = gyro;
                    AccelerometerDatum a;
                    a.t = timestamp;
                    a.v = accel;

                        Eigen::Vector3d _acceleration(accel.x(), accel.y(), accel.z());
                        Eigen::Vector3d _velocity(gyro.x(), gyro.y(), gyro.z());

                        mars::BufferDataType data;
                        data.set_sensor_data(std::make_shared<mars::IMUMeasurementType>(_acceleration, _velocity));

                        core_logic.ProcessMeasurement(imu_sensor_sptr, timestamp, data);

                        if (frame == true)
                        {
                            mars::BufferDataType dataP;

                            ReadLock w_lock(grabberLock);
                            dataP = dataPose;
                            w_lock.unlock();

                            frame = false;

                            core_logic.ProcessMeasurement(pose_sensor_sptr, timestamp, dataP);

                            dataGot = true;

                        }

                        if (!core_logic.core_is_initialized_)
                        {
                            // Initialize the first time at which the propagation sensor occurs
                            if (imu_sensor_sptr == core_logic.core_states_->propagation_sensor_)
                            {
                                Eigen::Vector3d p_wi_init(0, 0, 5);
                                Eigen::Quaterniond q_wi_init = Eigen::Quaterniond::Identity();
                                core_logic.Initialize(p_wi_init, q_wi_init);
                            }
                            else
                            {
                                continue;
                            }
                        }

                        if (dataGot)
                        {
                            // Repropagation after an out of order update can cause the latest state to be different from the current update
                            // sensor. Using get_latest_sensor_handle_state is the safest option.
                            mars::BufferEntryType latest_result;
                            core_logic.buffer_.get_latest_sensor_handle_state(pose_sensor_sptr, &latest_result);
                            mars::PoseSensorStateType last_state = pose_sensor_sptr->get_state(latest_result.data_.sensor_);
                            std::cout << last_state.p_ip_.x() <<  " FILTER " << std::endl;

                        }

                    previousImuTimestamp = timestamp;
                }

            }
        }
        });

    double lastFrameRatePublish = -1;
    constexpr double PUBLISH_INTERVAL = 1.0;
    constexpr size_t WINDOW_FRAMES = 100;

    std::thread videoThread([&]() {
        while (!exiting) {
            cv::Mat draw;
            if (zed.grab() == sl::ERROR_CODE::SUCCESS) {
                double timestamp = (double)zed.getTimestamp(sl::TIME_REFERENCE::IMAGE).getNanoseconds() / 1e9;

                zed.retrieveImage(image_zed, sl::VIEW::LEFT_GRAY, sl::MEM::CPU);
                zed.retrieveImage(image_zedR, sl::VIEW::RIGHT_GRAY, sl::MEM::CPU);
                zed.retrieveImage(image_zedD, sl::VIEW::DEPTH, sl::MEM::CPU);

                CameraDatum c;
                c.t = timestamp;
                c.images.push_back(image_ocv.clone());
                c.images.push_back(image_ocvR.clone());
                c.depth = image_ocvD.clone();

                std::chrono::milliseconds ms = std::chrono::duration_cast<std::chrono::milliseconds>(
                    std::chrono::system_clock::now().time_since_epoch());
                c.ms = ms;

                cv::cvtColor(image_ocv, draw, cv::COLOR_GRAY2RGB);

                cv::aruco::detectMarkers(draw, dictionary, markerCorners, markerIds, parameters, rejectedCandidates);             

                std::vector<cv::Vec3d> rvecs, tvecs;
                cv::aruco::estimatePoseSingleMarkers(markerCorners, 0.05, K, D, rvecs, tvecs);

                if (rvecs.size() > 0)
                {
                    cv::Mat R;
                    cv::Rodrigues(rvecs[0], R); 

                    R = R.t();  // calculate camera R matrix

                  //  cv::Mat camRvec;
                 //   Rodrigues(R, camRvec); // calculate camera rvec

                    cv::Mat camTvec = -R * tvecs[0]; // calculate camera translation vector
                    cv::Vec3d tvec(camTvec);
                    Eigen::Matrix3d mat;
                    cv2eigen(R, mat);
                    Eigen::Quaterniond EigenQuat(mat);

                    Eigen::Vector3d position(tvec(0), tvec(1), tvec(2));
                    Eigen::Quaterniond orientation(EigenQuat.w(), EigenQuat.x(), EigenQuat.y(), EigenQuat.z());
                    std::cout << position.x() << " POSE " << std::endl;
                    mars::BufferDataType data;
                    data.set_sensor_data(std::make_shared<mars::PoseMeasurementType>(position, orientation));

                    WriteLock w_lock(grabberLock);
                    dataPose = data;
                    w_lock.unlock();

                    frame = true;                 
                }

                for (int i = 0; i < markerIds.size(); i++)
                {
                    cv::aruco::drawDetectedMarkers(draw, markerCorners, markerIds);
                    cv::drawFrameAxes(draw, K, D, rvecs[i], tvecs[i], 0.1);
                }

                cv::imshow("marker", draw);
                cv::waitKey(1);

                frameNumber++;               

                bNewFrame = true;

            }

        }

        });

With a static camera, I would expect the pose output from the PnP solve and the output from the state estimation to be the same, but what I see when I print translation.x from the code above is:

0.0355853220 POSE //aruco camera pose
0.2006233225 FILTER //mars::PoseSensorStateType last_state.p_ip_.x()
0.2006233225 FILTER
0.2006233225 FILTER
0.2006233225 FILTER
0.2006233225 FILTER
0.2006233225 FILTER
0.2006233225 FILTER
0.2006233225 FILTER
0.2006233225 FILTER
0.2006233225 FILTER
0.2006233225 FILTER
0.2006233225 FILTER
0.2006233225 FILTER
0.0384234485 POSE

I am attaching the full code in case you have time to look at it. :) Thank you very much! main.zip