aau-cns / mars_lib

MaRS: A Modular and Robust Sensor-Fusion Framework

Problems with world frames #10

Closed: brick-ai closed this issue 11 months ago

brick-ai commented 11 months ago

As used in the project and specified in the docs, mars_lib uses ENU as the world frame for its core states.

Then, if we want to fuse IMU data with 6-DoF poses that use NED as their world frame, shouldn't we apply a transformation so that they share a consistent world frame?

I didn't see any transformation that fixes this kind of problem, e.g., in the code at https://github.com/aau-cns/mars_lib/blob/18e9a924733efbb97de76328e1713b13153e9634/source/mars/include/mars/sensors/pose/pose_sensor_class.h#L195C1-L196C1

So, how can we fuse sensors that use different world frames?

mascheiber commented 11 months ago

Hi @brick-ai,

Thank you for your question and comments on the frames in use.

Indeed, MaRS uses ENU-frame definitions internally for all its sensors, including the IMU and any pose sensor. It was designed that way, and as you correctly point out, using a NED-based sensor requires a transformation to be applied.

MaRS requires you to apply the correct transformation explicitly (either in your MaRS lib app or in your ROS1 wrapper), such that the IMU or 6-DoF pose measurements are provided in an ENU-frame definition. Thus, I would suggest transforming your sensors' measurements in the following way (MATLAB pseudo-code):

R_NED2ENU = [0 1 0;
             1 0 0;
             0 0 -1];
x_ENU = R_NED2ENU * x_NED;
R_ENU = R_NED2ENU * R_NED;
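Note that R_NED2ENU is its own inverse (applying it twice yields the identity), so the same matrix also converts measurements from ENU back to NED.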

This could be done before setting the sensor data, e.g., for a 6-DoF pose sensor here: https://github.com/aau-cns/mars_lib/blob/1a245132c9a3633df1c71cd10e1234d2f2ea78a1/source/mars/include/mars/data_utils/read_pose_data.h#L45-L49 such that:

Eigen::Vector3d position(csv_data["p_x"][k], csv_data["p_y"][k], csv_data["p_z"][k]);
Eigen::Quaterniond orientation(csv_data["q_w"][k], csv_data["q_x"][k], csv_data["q_y"][k], csv_data["q_z"][k]);

// convert the NED pose to an ENU pose
Eigen::Matrix3d R_ned2enu;
R_ned2enu << 0, 1, 0,
             1, 0, 0,
             0, 0, -1;
Eigen::Vector3d pos_enu = R_ned2enu * position;
Eigen::Quaterniond ori_enu(R_ned2enu * orientation.toRotationMatrix());

BufferDataType data;
data.set_sensor_data(std::make_shared<PoseMeasurementType>(pos_enu, ori_enu));

In case the IMU provides data in a NED-frame definition, you could perform a similar transformation before setting the IMU sensor data, e.g., here https://github.com/aau-cns/mars_lib/blob/1a245132c9a3633df1c71cd10e1234d2f2ea78a1/source/mars/include/mars/data_utils/read_sim_data.h#L69 However, I recommend double-checking the IMU axis definitions and rotation directions. In the past, I have used IMUs where (within the driver) the linear acceleration was defined using a left-handed NWD frame, while the angular velocity used a right-handed ENU definition. In such scenarios, additional transformations beyond the one presented above had to be applied before adding the sensor measurement to MaRS.
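For illustration, here is a minimal sketch of such an IMU conversion, assuming the driver reports both the linear acceleration and the angular velocity in the same NED-style convention. The variable names and the IMUMeasurementType constructor signature are assumptions on my side, so please check them against your mars_lib version:

// hypothetical NED-to-ENU conversion of raw IMU data before handing it to MaRS;
// lin_acc_ned and ang_vel_ned are assumed to hold the driver's raw measurements
Eigen::Matrix3d R_ned2enu;
R_ned2enu << 0, 1, 0,
             1, 0, 0,
             0, 0, -1;
Eigen::Vector3d lin_acc_enu = R_ned2enu * lin_acc_ned;
Eigen::Vector3d ang_vel_enu = R_ned2enu * ang_vel_ned;

BufferDataType data;
data.set_sensor_data(std::make_shared<IMUMeasurementType>(lin_acc_enu, ang_vel_enu));

If your IMU mixes conventions as described above, apply a separate, appropriate rotation to each of the two vectors instead of a single common one.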

For completeness, an example where two sensors with different world frames are used can also be found in the MaRS ROS Dualpose Wrapper https://github.com/aau-cns/mars_ros/blob/6f7d0109569bcf8f2b1e590662035959c10fed20/src/mars_wrapper_dualpose_full.cpp#L647-L649 In this wrapper, though, the conversion rotates (and translates) between two ENU frames. Yet, the idea of the transformation stays the same, regardless of the underlying frame definition.
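As a minimal sketch of this idea, assuming a known fixed rotation R_w1_w2 and translation t_w1_w2 between the two world frames (both names are illustrative and not taken from the wrapper), a pose given in the second world frame could be re-expressed in the first one as:

// re-express a pose from world frame w2 in world frame w1
// (R_w1_w2 and t_w1_w2 form a hypothetical, known fixed transform)
Eigen::Vector3d p_w1 = R_w1_w2 * p_w2 + t_w1_w2;
Eigen::Quaterniond q_w1(R_w1_w2 * q_w2.toRotationMatrix());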

I hope this helps, and please let us know if you have any further questions regarding frame transformations!

brick-ai commented 11 months ago

@mascheiber

Thanks!

It really helped! It solved my problem.

Closed.