rpng / MINS

An efficient and robust multisensor-aided inertial navigation system with online calibration that is capable of fusing IMU, camera, LiDAR, GPS/GNSS, and wheel sensors. Use cases: VINS/VIO, GPS-INS, LINS/LIO, multi-sensor fusion for localization and mapping (SLAM). This repository also provides multi-sensor simulation and data.
GNU General Public License v3.0

how to implement with own hardware platform ? #14

Closed mahammadirfan closed 4 months ago

mahammadirfan commented 9 months ago

Hi, thanks for the great work. I am wondering if I can use the MINS algorithm with hardware like an Ouster LiDAR, a stereo camera, etc.? Is there any detailed documentation to follow? Thanks in advance!

WoosikLee2510 commented 8 months ago

Hi, @mahammadirfan. Thanks for your interest in MINS! I will try to document some exemplary cases and tutorials on how to use MINS. In the meantime, can you tell me more about your system, please?

mahammadirfan commented 8 months ago

@WoosikLee2510 Thanks for your reply. I am working with an Ouster OS1 LiDAR, an IMU, a stereo camera, and GPS, and I want to fuse all the sensors as you have done in MINS. My use case is a drone. Thanks!!

mahammadirfan commented 8 months ago

@WoosikLee2510 Hey, I am trying to run the real-world dataset example (urban30), but the ground truth I found for it is in CSV format, so I converted it to TXT, and now I am getting an error. Do you have a ground-truth TXT file for the Urban30 dataset? Could you please share it, or explain how to convert the CSV file to a TXT file? Thanks in advance!

mahammadirfan commented 8 months ago

Screenshot from 2024-01-10 14-07-21

mahammadirfan commented 7 months ago

@WoosikLee2510 Hi, I am trying to use the KAIST Urban30 dataset but I get an error when running mins_slam:

  1. I downloaded the KAIST dataset and followed the steps from kaist2bag https://github.com/rpng/kaist2bag
  2. I converted it and got a 31.8 GB bag file after merging all the individual bags.
  3. When I run the example roslaunch mins serial.launch config:=kaist/kaist_LC path_gt:=urban30.txt path_bag:=urban30.bag it gives the error: [IW-Init] smooth fail -nan > 0.10703491 Duration: 0.301214 / 1267.945918

Looking forward to hearing from you. Thanks. Screenshot from 2024-01-11 17-55-01

WoosikLee2510 commented 7 months ago

> @WoosikLee2510 Thanks for your reply, I am working with ouster lidar os1 with IMU, stereo camera, gps and want to fuse all the sensor like you have done in mins ...my use case is with a drone. Thanks!!

Hi, sorry for my late reply; I was away from my desk for a while and now I am back. For the Urban30 gt file, try using this: urban30.txt

If you need more gt files, let me know.

WoosikLee2510 commented 7 months ago

> Screenshot from 2024-01-10 14-07-21

The error says the system tried to initialize from the ground truth, but it did not exist. You can either use the gt file I provided or disable the initialize-from-ground-truth option.

WoosikLee2510 commented 7 months ago

> @WoosikLee2510 Hi I am trying to use the kaist dataset utban 30 ....but getting an error when running the mins_slam...
>
>   1. I have downloaded the kaist dataset and followed the steps from kaist2bag https://github.com/rpng/kaist2bag
>   2. converted and got the bag file of 31.8 gb after merging all the individual bags.
>   3. when i am running the example: roslaunch mins serial.launch config:=kaist/kaist_LC path_gt:=urban30.txt path_bag:=urban30.bag it gives error .....[IW-Init] smooth fail -nan > 0.10703491 Duration: 0.301214 / 1267.945918
>
> Looking forward to hear from you. Thanks Screenshot from 2024-01-11 17-55-01

Hmm... this is strange. Basically, the error says that some of the IMU readings were invalid (NaN); I can tell from the "-nan > some number" output. Could you check the IMU readings and see if any of them contain NaN?
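As a quick offline check, the bag's IMU messages can be scanned for NaNs. Below is a minimal sketch; the topic name and message fields in the commented rosbag snippet are illustrative, so match them to whatever your bag actually contains.

```python
import math

def has_nan(values):
    """Return True if any value in the iterable is NaN."""
    return any(math.isnan(v) for v in values)

# Applied to a bag (not run here), something along these lines:
# import rosbag
# for _, msg, t in rosbag.Bag("urban30.bag").read_messages(topics=["/imu/data_raw"]):
#     a, w = msg.linear_acceleration, msg.angular_velocity
#     if has_nan([a.x, a.y, a.z, w.x, w.y, w.z]):
#         print("NaN IMU sample at", t)
```

If any sample prints, the initializer's `-nan` comparison failure is explained by the data rather than by MINS itself.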

wolf943134497 commented 7 months ago

```
[WHEEL] Avg res/std: 2e-05/6e-04 | R std: 2e-04, Chi 100 %, mean: 8e-03, std 3e-02, # meas: 94974 1544678006.962 (1267.516s) | q_GtoI = 0.010,0.011,0.063,0.998 | p_IinG = 231.373,136.946,-108.109 | dist = 5199.62 (m) Avg processing time(Curr Intr 3:15 Hz | Avg Intr 3:13 Hz): IMU: 0ms WHL: 0ms Total: 0ms / 72ms (75.3X) Hz average: IMU 100.0 WHL 100.0
[WHEEL] Avg res/std: 2e-05/6e-04 | R std: 2e-04, Chi 100 %, mean: 8e-03, std 3e-02, # meas: 94980 1544678007.029 (1267.582s) | q_GtoI = 0.010,0.011,0.063,0.998 | p_IinG = 231.768,136.892,-108.120 | dist = 5199.99 (m) Avg processing time(Curr Intr 3:15 Hz | Avg Intr 3:13 Hz): IMU: 0ms WHL: 0ms Total: 0ms / 72ms (75.3X) Hz average: IMU 100.0 WHL 100.0
[WHEEL] Avg res/std: 2e-05/6e-04 | R std: 2e-04, Chi 100 %, mean: 8e-03, std 3e-02, # meas: 94986 1544678007.095 (1267.649s) | q_GtoI = 0.010,0.012,0.063,0.998 | p_IinG = 232.147,136.884,-108.128 | dist = 5200.35 (m) Avg processing time(Curr Intr 3:15 Hz | Avg Intr 3:13 Hz): IMU: 0ms WHL: 0ms Total: 0ms / 72ms (75.3X) Hz average: IMU 100.0 WHL 100.0
[WHEEL] Avg res/std: 2e-05/6e-04 | R std: 2e-04, Chi 100 %, mean: 8e-03, std 3e-02, # meas: 94992 1544678007.162 (1267.716s) | q_GtoI = 0.010,0.012,0.063,0.998 | p_IinG = 232.466,137.014,-108.140 | dist = 5200.71 (m) Avg processing time(Curr Intr 3:15 Hz | Avg Intr 3:13 Hz): IMU: 0ms WHL: 0ms Total: 0ms / 72ms (75.3X) Hz average: IMU 100.0 WHL 100.0
[WHEEL] Avg res/std: 2e-05/6e-04 | R std: 2e-04, Chi 100 %, mean: 8e-03, std 3e-02, # meas: 94998 1544678007.229 (1267.782s) | q_GtoI = 0.009,0.012,0.062,0.998 | p_IinG = 232.767,137.167,-108.151 | dist = 5201.07 (m) Avg processing time(Curr Intr 3:15 Hz | Avg Intr 3:13 Hz): IMU: 0ms WHL: 0ms Total: 0ms / 72ms (75.3X) Hz average: IMU 100.0 WHL 100.0
[WHEEL] Avg res/std: 2e-05/6e-04 | R std: 2e-04, Chi 100 %, mean: 8e-03, std 3e-02, # meas: 95004 1544678007.295 (1267.849s) | q_GtoI = 0.009,0.012,0.062,0.998 | p_IinG = 233.078,137.311,-108.160 | dist = 5201.43 (m) Avg processing time(Curr Intr 3:15 Hz | Avg Intr 3:13 Hz): IMU: 0ms WHL: 0ms Total: 0ms / 72ms (75.3X) Hz average: IMU 100.0 WHL 100.0

========Final Status========
Total procesing time: 15s
Total traveling time: 1267s

================================================================================
REQUIRED process [mins_bag-2] has died!
process has finished cleanly
log file: /home/gw/.ros/log/b0944ab4-bc3c-11ee-b37c-4d6550c907d9/mins_bag-2*.log
Initiating shutdown!

[live_align_trajectory-3] killing on exit
[mins_bag-2] killing on exit
[rosout-1] killing on exit
[master] killing on exit
shutting down processing monitor...
... shutting down processing monitor complete
done
```

Hi @WoosikLee2510, I use the newest code and encountered a similar problem. I tested imu + wheel and imu + wheel + gps; the test data is KAIST Urban30. Can you provide detailed test steps for the KAIST dataset? How do I get the fused pose that combines the multi-sensor updates? Thanks

zhh2005757 commented 7 months ago

I also want to use MINS on my own platform. The fusion of camera, IMU, and wheel odometer worked, but LiDAR fusion has some problems. My LiDAR is a Velodyne 16 and I use the KAIST dataset parameters in MINS, but the trajectory drifts badly and cannot even maintain a normal shape. Could you provide some details about the LiDAR parameters? Thanks!

mahammadirfan commented 7 months ago

@WoosikLee2510 Thanks for your reply. Could you please explain or provide some docs/steps for the calibration part, i.e., which calibration tools we should use? I am using Kalibr for stereo-IMU; for cam-IMU-LiDAR, which tool should I use?

Thanks and looking forward to hearing from you.

tejasps28 commented 7 months ago

> I also want to use MINS in my own platform. The fusion of camera, imu and wheel odometer worked but lidar fusion has some problem. My lidar is velodyne 16 and I use the parameters of kaist dataset in MINS, but the trajectory has large drift even cannot maintain normal shape. Could your provide some detail about lidar parameters? Thanks!

@zhh2005757 Hi, I am new to VINS, and I want to perform camera, IMU, and wheel fusion. Can you please share how you did it? Was it using MINS? Any information would be really helpful!! Thanks in advance

lnexenl commented 7 months ago

> @WoosikLee2510 Thanks for your reply, Could you please explain or provide some details docs/steps for the calibration things like....which calibration tools should we use...i am using kalibr for stereo-imu....for cam-imu-lidar which tool should i use ?
>
> Thanks and looking forward to hearing from you.

I have run MINS on some private datasets (mono cam + LiDAR + IMU + INS), but we calibrate the sensors with our private tool. Unfortunately, there is no good open-source cam-IMU-LiDAR calibration tool available.

There are some LiDAR calibration tools you can use (you may need to do some coordinate transforms):

  1. Lidar-Camera calibration direct_visual_lidar_calibration
  2. Lidar-IMU Calibration lidar_IMU_calib

These tools use target-less algorithms, so you don't need a calibration board.
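The coordinate-transform step mentioned above amounts to chaining the two calibration results: with the camera-LiDAR extrinsic from tool 1 and the LiDAR-IMU extrinsic from tool 2, the camera-IMU extrinsic is their composition. A minimal sketch with illustrative matrix values (conventions differ between tools — verify which direction each transform maps before chaining):

```python
def matmul4(A, B):
    """Multiply two 4x4 homogeneous transforms given as nested lists."""
    return [[sum(A[i][k] * B[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

# Illustrative values only: T_cam_lidar from camera-LiDAR calibration,
# T_lidar_imu from LiDAR-IMU calibration; each maps a point from the
# frame named on the right into the frame named on the left.
T_cam_lidar = [[1, 0, 0, 0.10],
               [0, 1, 0, 0.00],
               [0, 0, 1, -0.05],
               [0, 0, 0, 1]]
T_lidar_imu = [[0, -1, 0, 0.02],
               [1, 0, 0, 0.00],
               [0, 0, 1, 0.08],
               [0, 0, 0, 1]]

# Chain: IMU frame -> LiDAR frame -> camera frame.
T_cam_imu = matmul4(T_cam_lidar, T_lidar_imu)
```

If a tool reports the inverse convention (e.g., T_lidar_cam), invert it before composing; mixing conventions is the most common source of a "calibrated but drifting" setup.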

zhh2005757 commented 7 months ago

> I also want to use MINS in my own platform. The fusion of camera, imu and wheel odometer worked but lidar fusion has some problem. My lidar is velodyne 16 and I use the parameters of kaist dataset in MINS, but the trajectory has large drift even cannot maintain normal shape. Could your provide some detail about lidar parameters? Thanks!
>
> @zhh2005757 Hi, i am new to VINS, and I want to perform camera imu and wheel fusion. Can you please share how did you do the same ? Was it using MINS? Any information would be really helpful!! Thanks in advance

Yes, I got camera, IMU, and wheel odometer working well with MINS on my private dataset. Of course, good calibration (camera intrinsics and camera-IMU-wheel extrinsics) comes first, before testing MINS. In addition, since the wheel odometer measurements in my dataset are linear and angular velocities, I changed the type to "Wheel3DCen" in config_wheel.yaml and modified the topic subscription interface so that MINS receives the wheel odometer measurements. Then MINS worked well on my dataset.
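For reference, the config change described above is roughly the following fragment of config_wheel.yaml; the type string "Wheel3DCen" comes from the comment above, while the other keys and the topic name are illustrative, so check your copy of the file for the exact field names:

```yaml
wheel:
  enabled: true
  type: "Wheel3DCen"    # linear + angular velocity measurements
  topic: "/wheel/odom"  # illustrative; match the topic your driver publishes
```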

Also, I have performed camera-IMU-wheel fusion in other frameworks like VINS-Mono. There is plenty of corresponding open-source code that you can easily find.

wolf943134497 commented 7 months ago

@zhh2005757 Hi, could you share your modified code or project based on MINS? Have you successfully tested other sensor combinations on a dataset similar to KAIST, e.g., imu + wheel, imu + wheel + gps, imu + wheel + lidar, or camera + GPS + LiDAR + wheel + IMU? Thanks!

mahammadirfan commented 7 months ago

@zhh2005757 Hey, thanks for the reply. Would you be able to share your modified files for reference? Thanks!

mahammadirfan commented 7 months ago

@WoosikLee2510 Hey, I just ran the simulation example and the real-world EuRoC MAV dataset (V1_03_difficult), and I don't see any loop closure in the SLAM output. Why is loop closure not happening? Or does this SLAM system not support loop closure?

Thanks and looking forward to hearing from you!

WoosikLee2510 commented 6 months ago

> @WoosikLee2510 Hey, I just checked and run the with the simulation example and real world dataset of euros_mav ...v1-03-difficult.....where I dont see any loopclosure for the SLAM ? Whats the for loop closure not happening ? or this SLAM system doesn't support or do the loop closure ?
>
> Thanks and looking forward to hearing from you!

Hi, @mahammadirfan. The current MINS does not support global visual loop closure (i.e., a large correction of the trajectory based on visual landmarks when revisiting the same place). However, you can input loop-closure information (e.g., from another loop-closure detection thread) and use it to update MINS.

mahammadirfan commented 4 months ago

@WoosikLee2510 Hi, thanks for your reply again. I have a quick question: when I run MINS, we get these topics: /mins/gps0/path /mins/gps0/pose /mins/imu/odom /mins/imu/path /mins/imu/pose

So /mins/imu/path would be the final fused path if I fuse cam + lidar + imu + gps? And /mins/gps0/path is the GPS-only position path? Am I correct? Or is gps0/path the final fused path including GPS?

Thanks in advance!

WoosikLee2510 commented 4 months ago

The gps pose is the latest GPS measurement received, and the path is the collection of those measurements.

The imu pose is the latest estimated IMU pose (updated with the other sensors), and the path is the collection of IMU poses. So yes, you are right.