Did you also check the noise parameters of the sensors?
I would also check your extrinsic transformation along with the camera intrinsics. It is hard to help based on just this picture. If you can post your launch file, along with how you get the information from realsense-ros and its values, we might be able to help more. Also, the system can't handle standing still at the moment, so it will fly away if you just hold it still and it doesn't have any good SLAM features.
Thanks for your reply!
@WoosikLee2510 I don't know how to get that from realsense-ros. I only get the intrinsic parameters from the camera info topics and the extrinsic parameters from /camera/extrinsics/depth_to_infra1
What is required is the camera-to-IMU transformation.
Hi @goldbattle, thanks for your reply!
Here are my launch files:
rs-wrapper.launch:
<launch>
<arg name="offline" default="false"/>
<include unless="$(arg offline)"
file="$(find realsense2_camera)/launch/rs_camera.launch">
<arg name="align_depth" value="true"/>
<arg name="linear_accel_cov" value="1.0"/>
<arg name="unite_imu_method" value="linear_interpolation"/>
</include>
<node pkg="imu_filter_madgwick" type="imu_filter_node" name="ImuFilter">
<param name="use_mag" type="bool" value="false" />
<param name="publish_tf" type="bool" value="true" />
<param name="world_frame" type="string" value="enu" />
<param name="fixed_frame" type="string" value="camera_link" />
<remap from="/imu/data_raw" to="/camera/imu"/>
</node>
</launch>
open-vins_d435i.launch:
<launch>
<!-- MASTER NODE! -->
<!-- <node name="run_serial_msckf" pkg="ov_msckf" type="run_serial_msckf" output="screen" clear_params="true" required="true"> -->
<!-- <node name="run_serial_msckf" pkg="ov_msckf" type="run_serial_msckf" output="screen" clear_params="true" required="true" launch-prefix="gdb -ex run --args">-->
<node name="run_subscribe_msckf" pkg="ov_msckf" type="run_subscribe_msckf" output="screen" clear_params="true" required="true">
<!-- bag topics -->
<param name="topic_imu" type="string" value="/imu/data" />
<param name="topic_camera0" type="string" value="/camera/infra1/image_rect_raw" />
<param name="topic_camera1" type="string" value="/camera/infra2/image_rect_raw" />
<param name="topic_camera2" type="string" value="/camera/color/image_raw" />
<!-- bag parameters -->
<param name="path_bag" type="string" value="" />
<!-- <param name="path_gt" type="string" value="$(find ov_data)/tum_vi/dataset-room1_512_16.csv" />-->
<param name="bag_start" type="int" value="-1" />
<param name="bag_durr" type="int" value="-1" />
<!-- world/filter parameters -->
<param name="use_fej" type="bool" value="true" />
<param name="use_imuavg" type="bool" value="true" />
<param name="use_rk4int" type="bool" value="true" />
<param name="use_stereo" type="bool" value="true" />
<param name="calib_cam_extrinsics" type="bool" value="true" />
<param name="calib_cam_intrinsics" type="bool" value="true" />
<param name="calib_cam_timeoffset" type="bool" value="true" />
<param name="calib_camimu_dt" type="double" value="0.0" />
<param name="max_clones" type="int" value="11" />
<param name="max_slam" type="int" value="30" />
<param name="max_slam_in_update" type="int" value="999" /> <!-- 15 seems to work well -->
<param name="max_msckf_in_update" type="int" value="999" />
<param name="max_cameras" type="int" value="2" />
<param name="dt_slam_delay" type="double" value="3" />
<param name="init_window_time" type="double" value="1.0" />
<param name="init_imu_thresh" type="double" value="0.5" />
<rosparam param="gravity">[0.0,0.0,9.80766]</rosparam>
<param name="feat_rep_msckf" type="string" value="GLOBAL_3D" />
<param name="feat_rep_slam" type="string" value="ANCHORED_FULL_INVERSE_DEPTH" />
<param name="feat_rep_aruco" type="string" value="ANCHORED_FULL_INVERSE_DEPTH" />
<!-- timing statistics recording -->
<param name="record_timing_information" type="bool" value="false" />
<param name="record_timing_filepath" type="string" value="/tmp/ov_msckf_timing.txt" />
<!-- tracker/extractor properties -->
<param name="use_klt" type="bool" value="true" />
<param name="num_pts" type="int" value="200" />
<param name="fast_threshold" type="int" value="15" />
<param name="grid_x" type="int" value="5" />
<param name="grid_y" type="int" value="5" />
<param name="min_px_dist" type="int" value="5" />
<param name="knn_ratio" type="double" value="0.65" />
<param name="downsample_cameras" type="bool" value="false" />
<!-- aruco tag/mapping properties -->
<param name="use_aruco" type="bool" value="false" />
<param name="num_aruco" type="int" value="1024" />
<param name="downsize_aruco" type="bool" value="true" />
<!-- sensor noise values / update -->
<param name="up_msckf_sigma_px" type="double" value="1" />
<param name="up_msckf_chi2_multipler" type="double" value="1" />
<param name="up_slam_sigma_px" type="double" value="1" />
<param name="up_slam_chi2_multipler" type="double" value="1" />
<param name="up_aruco_sigma_px" type="double" value="1" />
<param name="up_aruco_chi2_multipler" type="double" value="1" />
<param name="gyroscope_noise_density" type="double" value="0.00016" />
<param name="gyroscope_random_walk" type="double" value="0.000022" />
<param name="accelerometer_noise_density" type="double" value="0.0028" />
<param name="accelerometer_random_walk" type="double" value="0.00086" />
<!-- camera intrinsics -->
<rosparam param="cam0_wh">[640, 480]</rosparam>
<rosparam param="cam1_wh">[640, 480]</rosparam>
<rosparam param="cam2_wh">[640, 480]</rosparam>
<param name="cam0_is_fisheye" type="bool" value="false" />
<param name="cam1_is_fisheye" type="bool" value="false" />
<param name="cam2_is_fisheye" type="bool" value="false" />
<rosparam param="cam0_k">[391.30157470703125, 321.2224426269531, 391.30157470703125, 240.260498046875]</rosparam>
<rosparam param="cam0_d">[0, 0, 0, 0]</rosparam>
<rosparam param="cam1_k">[391.30157470703125, 321.2224426269531, 391.30157470703125, 240.260498046875]</rosparam>
<rosparam param="cam1_d">[0, 0, 0, 0]</rosparam>
<rosparam param="cam2_k">[190.615.3250122070312, 323.4080505371094, 615.437255859375, 239.9214324951172]</rosparam>
<rosparam param="cam2_d">[0, 0, 0, 0]</rosparam>
<!-- camera extrinsics -->
<rosparam param="T_C0toI">
[
1, 0, 0, 0,
0, 1, 0, 0,
0, 0, 1, 0,
0, 0, 0, 1
]
</rosparam>
<rosparam param="T_C1toI">
[
1, 0, 0, -0.04988466203212738,
0, 1, 0, 0,
0, 0, 1, 0,
0, 0, 0, 1
]
</rosparam>
<rosparam param="T_C2toI">
[
0.9998327493667603, 0.018136249855160713, 0.002364603104069829, 0.014831705950200558,
-0.018144769594073296, 0.9998287558555603, 0.003632423933595419, 0.0003024951438419521,
-0.002298319712281227, -0.003674721345305443, 0.9999905824661255, 0.00010320414730813354,
0.0, 0.0, 0.0, 1.0
]
</rosparam>
</node>
<node type="rviz" name="rviz" pkg="rviz" args="-d $(find ov_msckf)/launch/display.rviz" />
</launch>
I don't know how to configure that camera-to-IMU transformation, could you give me more information about it?
I'm sorry, I'm very new to this field and I'm trying to understand things and get it running.
Your help is really appreciated!
What @goldbattle means is that you need to know the IMU-to-camera transformation to perform the update. I think the naive way to get it is to read the IMU and camera TF frames from the ROS topics and compute the transformation between them.
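For example, here is a minimal Python sketch of that last step, assembling the 4x4 T_C0toI matrix that the launch file expects from a translation and a quaternion (the numbers are placeholders, not a real calibration):
# Sketch: build the 4x4 T_C0toI homogeneous matrix that OpenVINS expects,
# given a translation and quaternion between the two frames.
# The values below are placeholders, not a real calibration.
import numpy as np
import tf.transformations as tft

trans = [0.0, 0.005, 0.011]   # camera position in the IMU frame (meters)
quat = [0.0, 0.0, 0.0, 1.0]   # orientation as [x, y, z, w]

T_C0toI = tft.quaternion_matrix(quat)  # 4x4 with the rotation block filled in
T_C0toI[0:3, 3] = trans                # set the translation column
print(np.array2string(T_C0toI, separator=", "))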
So I am not sure about the specifics of that sensor, but the IMU frame is where the IMU sensor is physically mounted on the unit. The camera frame is where the camera sensor is physically mounted on the unit. You can try to calibrate your sensor using Kalibr and see what type of result you get and compare it to what you have (we have some notes on that here)
Here is an example for the T265: https://github.com/IntelRealSense/realsense-ros/issues/912
The driver should build a TF tree and you can try to echo it to compare to what you have: http://wiki.ros.org/tf/Debugging%20tools#tf_echo
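For example, a rough Python sketch of looking up that transform with tf2 (the frame names are my guess at what the driver publishes, so verify them against your own TF tree first):
#!/usr/bin/env python
# Sketch: query the camera-to-IMU transform from the TF tree published by
# realsense-ros, to compare against the extrinsics in the launch file.
# The frame names are assumptions; verify them with tf_echo or rqt_tf_tree.
import rospy
import tf2_ros

rospy.init_node("lookup_cam_to_imu")
buf = tf2_ros.Buffer()
listener = tf2_ros.TransformListener(buf)

# Pose of cam0 (source frame) expressed in the IMU frame (target), i.e. T_C0toI.
t = buf.lookup_transform("camera_imu_optical_frame",
                         "camera_infra1_optical_frame",
                         rospy.Time(0), rospy.Duration(4.0))
print(t.transform.translation)  # compare to the last column of T_C0toI
print(t.transform.rotation)     # quaternion for the rotation block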
@mhaboali you won't be able to use the D435i as plug and play. The left and right cameras are IR cameras, which won't give you the visual features. Also, you would need to make changes to remove portions of the measurement model, as the images you get from the D435i are rectified on board.
@goldbattle Thanks for the information, so I have to calibrate my D435i to get its K, D, and extrinsics, and that should fix this issue, right?
Yes, but as @NicoNorena mentioned above, there will be issues with using the IR cameras (I would recommend trying the RGB camera calibrated to the IMU). Using the IR images depends on whether features can be extracted, which might still be the case depending on whether there is enough intensity in those images. You can always try running "test_tracking" on a bag that you have collected to see if you are able to track reasonable features on the IR cameras.
My recommendation would be to try to calibrate the system, and then see how that compares to what you have posted in the launch file above. This should at least give a hint as to where the issue might lie. Hope this helps.
Thanks for the explanation. Actually, I have a T265 here as well; do you think it's better to use it directly to avoid wasting my time on the D435i?
Sure, you could always give that a go. Our group has used those to collect datasets and the T265 seems to work well. Below are the launch files (remove the txt extension) that I have used for both OpenVINS and realsense-ros, which importantly requests a "combined" IMU topic. You can give these a try and see how it works (might save you some time on calibration, since we perform online calibration and there are not that many large differences between devices).
So glad to receive such valuable information, I'm giving it a try now and I'll keep you posted.
Thank you so much for your great help!
Hi @goldbattle
I've just given it a try and it seems to work quite well: it almost closes the loop, but there is a little offset when it returns to the starting point, as you can see here: https://user-images.githubusercontent.com/29764281/81991473-99776200-9641-11ea-954e-568e4491cc1b.png
The strange things I found:
- Why didn't it keep the point cloud data?
- Do you think I have to do my own calibration to resolve that loop-closure issue?
We just keep a sliding window of poses and temporal features, and thus no "map" of features. This is an odometry method, which means there is no loop-closure logic built into this repository. If you want to limit drift, you can increase the number of features, or use ArUco tags, which will bound the estimator error. I would try to increase the number of SLAM features (to around 75) and see how that works for you (see the ETH launch file for example params that work on those datasets). Typically we aim for less than 1% trajectory error (i.e. error when returned to the start / total trajectory distance).
@goldbattle I tried changing max_slam to 75 and there was still an offset between the initial position and the returned position after very little movement of the camera. I haven't tried enabling the ArUco tags yet.
What do you think?
@NicoNorena Thanks for your reply. So what do you think:
- Do I have to turn off the emitter of the camera?
- Should I run the calibration process for my camera, or would the current values work?
Also, I didn't get what you mean by "you would need to make changes to remove portions of the measurement model, as the images you get from the D435i are rectified on board." Could you explain more?
Many thanks!
@mhaboali what do you mean by emitter? Do you mean the depth sensor? It does not matter, as you won't be using it. OpenVINS is an odometry implementation, and the D435i has its own method on board. I think the reason for using the D435i is to avoid having to build your own rig; the downside is that you don't get the raw data from the sensor. The data you get has already been rectified (corrected for the normal sensor noise). The T265 is even more of a black box: you can only get odometry information out of it. There are other repositories that you can explore for mapping given the data from these cameras.
@NicoNorena Thanks for getting back to me, and I'm sorry for the late reply.
Emitter_Enabled is a parameter that turns on the IR projector if it's set, and I thought I had to turn it off to make the input stereo images smoother. Regarding the mapping, I'm looking for a lightweight 3D or even 2D mapping package; if you have some recommendations, please let me know, it'd be very helpful for me.
Ok, thanks for letting me know that, I'll do it using Kalibr when I get a chance.
Thanks so much for your help!
Please feel free to open another issue if you have any more questions or issues that arise, thanks.
@goldbattle First of all, thank you for sharing such a great platform for VIO. I have two Intel T265 cameras and I have tried running OpenVINS with both of them using the factory-calibrated intrinsics and extrinsics, but the output was not stable. However, when I fed the outputs of calibration with Kalibr and kalibr_allan into OpenVINS, the pose output was stable. Do you have an idea why the factory calibration data is not compatible with OpenVINS?
Also, the IMU intrinsic values from my calibration (around 2.5 hours of data processed by kalibr_allan) are smaller than what can be found in your OpenVINS launch file. How did you acquire them? Are they underestimated? If so, why?
I have never investigated what they provide as the default calibration. I would recommend just directly comparing the two and seeing if an orientation or position is flipped as compared to the Kalibr result.
We normally inflate the IMU noises; actually, unless the IMU is very poor, typically just the default ETH noise values can provide reasonable trajectories.
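For example, a tiny Python sketch of that inflation (the raw values and the 5x factor here are placeholders / a rule of thumb, not tuned numbers):
# Sketch: inflate raw Allan-deviation values from kalibr_allan before putting
# them into the launch file. The values and the 5x factor are placeholders.
raw = {
    "gyroscope_noise_density":     1.6e-4,
    "gyroscope_random_walk":       2.2e-5,
    "accelerometer_noise_density": 2.8e-3,
    "accelerometer_random_walk":   8.6e-4,
}
inflation = 5.0
for name, sigma in raw.items():
    print('<param name="%s" type="double" value="%.8f" />' % (name, inflation * sigma))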
Thanks for sharing this great project!
When I tried it with a RealSense D435i using my own launch file that's similar to pgeneva_ros_eth.launch, it worked, but with even very little movement the estimate moved wildly in RViz and didn't stop at all, as shown in the attached.
Note: I didn't change anything except for topic names and paths, and I used my own camera parameters as published by realsense-ros.