shubhamwagh opened this issue 5 years ago
I'm having a similar problem and would like to know how I can get fisheye cameras to work with VINS-Fusion.
Firstly, I found that VINS-Fusion doesn't read the fisheye mask, so I added some code for it, like what VINS-Mono does. In my experiment the problem still exists, but I think it's the first step of what we should do.
Does VINS-Fusion support fisheye stereo? Any suggestions for using it? @pjrambo Thanks in advance.
@ethanguo0327 In the paper, if you read "Applications: Feedback control on an aerial robot", they make use of a fisheye lens, for which they use the KANNALA_BRANDT camera model widely used for fisheye lenses. It would be great if the authors could provide more insight into this.
Hello, I'm having the same issue with the mynteye s2100 stereo camera (pinhole). Is this problem solved? If not, please guide me on how I can overcome it. Looking forward to your reply. Thanks.
@shubhamwagh
I encountered the same problem as you. Luckily, I solved it in the end.
The key is to calibrate the VIO device yourself, DON'T USE ITS OWN PARAMETERS.
I used imu_utils and kalibr to calibrate the T265. The following is my fisheye1.yaml:
%YAML:1.0
---
model_type: MEI
camera_name: camera
image_width: 848
image_height: 800
mirror_parameters:
   xi: 1.6943561
distortion_parameters:
   k1: -0.1075293
   k2: 0.6081762
   p1: 0.0029581
   p2: 0.0020715
projection_parameters:
   gamma1: 774.927
   gamma2: 773.762
   u0: 420.086
   v0: 402.516
Then run VINS-Fusion.
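In case the parameter names above are unclear, this is roughly how the MEI (unified omnidirectional) model uses them. This is only a simplified sketch for illustration, not the camodocal code used by VINS-Fusion; `projectMEI` is a made-up name and the constants are just copied from the fisheye1.yaml above.

```cpp
// Sketch of the MEI / unified omnidirectional projection: 3D point -> pixel.
#include <Eigen/Dense>
#include <iostream>

Eigen::Vector2d projectMEI(const Eigen::Vector3d& P)
{
    // Values copied from the fisheye1.yaml above.
    const double xi = 1.6943561;
    const double k1 = -0.1075293, k2 = 0.6081762;
    const double p1 = 0.0029581,  p2 = 0.0020715;
    const double gamma1 = 774.927, gamma2 = 773.762;
    const double u0 = 420.086, v0 = 402.516;

    // 1. Project onto the unit sphere, then onto a plane shifted by the
    //    mirror parameter xi.
    Eigen::Vector3d Ps = P.normalized();
    double x = Ps.x() / (Ps.z() + xi);
    double y = Ps.y() / (Ps.z() + xi);

    // 2. Radial-tangential distortion (k1, k2, p1, p2).
    double r2  = x * x + y * y;
    double rad = 1.0 + k1 * r2 + k2 * r2 * r2;
    double xd  = x * rad + 2.0 * p1 * x * y + p2 * (r2 + 2.0 * x * x);
    double yd  = y * rad + p1 * (r2 + 2.0 * y * y) + 2.0 * p2 * x * y;

    // 3. Generalized focal lengths and principal point.
    return Eigen::Vector2d(gamma1 * xd + u0, gamma2 * yd + v0);
}

int main()
{
    std::cout << projectMEI(Eigen::Vector3d(0.2, -0.1, 1.0)).transpose() << std::endl;
    return 0;
}
```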
If you can read some Chinese, you can see my article: "Realsense T265 calibration and running VINS, using kalibr and imu_utils".
It is great someone finally got VINS-Fusion working with a T265 camera. Thanks @LuciousZhang for the info.
Hello @shubhamwagh, were you able to use VINS-Fusion with the T265?
@LuciousZhang Can you please mention which camera model from Kalibr you used?
@mzahana MEI
Thanks @LuciousZhang. However, I see no model named MEI in the Kalibr documentation (https://github.com/ethz-asl/kalibr/wiki/supported-models). Is it one of the supported models?
@mzahana This VINS-Fusion package does provide the MEI camera model (check the code). And for calibration, Kalibr does provide the omnidirectional camera model (omni), which is MEI.
@shubhamwagh so you are suggesting to use the calibrator in this package?
@mzahana Either use Kalibr to calibrate with the MEI-type camera model (omni in Kalibr) to get the calibration data, and then use VINS-Fusion as mentioned by @LuciousZhang;
or use the VINS-Fusion calibration package to calibrate the camera, and then use the VINS-Fusion stuff.
@shubhamwagh thanks for the explanation.
@LuciousZhang I followed the calibration steps using Kalibr and also tried to follow the steps you wrote in your blog. However, the estimator diverges very fast. My config file looks as follows:
%YAML:1.0
#common parameters
#support: 1 imu 1 cam; 1 imu 2 cam; 2 cam;
imu: 1
num_of_cam: 2
imu_topic: "/camera/imu"
image0_topic: "/camera/fisheye1/image_raw"
image1_topic: "/camera/fisheye2/image_raw"
output_path: "/home/dji/output/"
cam0_calib: "left.yaml"
cam1_calib: "right.yaml"
image_width: 848
image_height: 800
# Extrinsic parameter between IMU and Camera.
estimate_extrinsic: 1 # 0: have accurate extrinsic parameters; we will trust the following imu^R_cam, imu^T_cam, don't change it.
                      # 1: have an initial guess about the extrinsic parameters; we will optimize around your initial guess.
# Left
body_T_cam0: !!opencv-matrix
   rows: 4
   cols: 4
   dt: d
   data: [1.0, 0.0, 0.0, 0.0,
          0.0, 1.0, 0.0, 0.0,
          0.0, 0.0, 1.0, 0.0,
          0, 0, 0, 1]
# Right
body_T_cam1: !!opencv-matrix
   rows: 4
   cols: 4
   dt: d
   data: [0.9999607139422159, 0.0022898268013124634, -0.008563134087404545, -0.06387522333501072,
          -0.002248740448562943, 0.999985929369192, 0.004804605086086959, 2.259229602903413e-05,
          0.008574015312202064, -0.00478516006610509, 0.9999517930903333, 0.00046170207522490083,
          0, 0, 0, 1]
#Multiple thread support
multiple_thread: 1
#feature tracker parameters
max_cnt: 150 # max feature number in feature tracking
min_dist: 30 # min distance between two features
freq: 10 # frequency (Hz) of publishing the tracking result. At least 10 Hz for good estimation. If set to 0, the frequency will be the same as the raw image
F_threshold: 1.0 # ransac threshold (pixel)
show_track: 0 # publish tracking image as topic
flow_back: 1 # perform forward and backward optical flow to improve feature tracking accuracy
#optimization parameters
max_solver_time: 0.04 # max solver iteration time (ms), to guarantee real time
max_num_iterations: 8 # max solver iterations, to guarantee real time
keyframe_parallax: 10.0 # keyframe selection threshold (pixel)
#imu parameters The more accurate parameters you provide, the better performance
acc_n: 0.018536 # accelerometer measurement noise standard deviation. #0.2 0.04
gyr_n: 0.00202783 # gyroscope measurement noise standard deviation. #0.05 0.004
acc_w: 0.00061931 #0.001 # accelerometer bias random walk noise standard deviation. #0.002
gyr_w: 0.000014786 #0.0001 # gyroscope bias random walk noise standard deviation. #4.0e-5
g_norm: 9.805 # gravity magnitude
#unsynchronization parameters
estimate_td: 1 # online estimate time offset between camera and imu
td: 0.003 # initial value of time offset. unit: s. read image clock + td = real image clock (IMU clock)
#loop closure parameters
load_previous_pose_graph: 0 # load and reuse previous pose graph; load from 'pose_graph_save_path'
pose_graph_save_path: "/home/dji/output/pose_graph/" # save and load path
save_image: 0 # save image in pose graph for visualization purpose; you can close this function by setting 0
Any ideas what could be wrong?
@mzahana body_T_cam0 should not be an identity matrix. It is the extrinsic matrix from cam0 to the IMU. An identity matrix would mean the IMU and cam0 coincide, which is impossible.
You need to check where the identity matrix comes from. (If you had followed my blog carefully, it wouldn't have happened. HAHA)
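For anyone unsure where to get body_T_cam0: if I read the two conventions correctly, Kalibr's camchain-imucam.yaml reports T_cam_imu (IMU frame to camera frame), while VINS-Fusion's body_T_cam0 is the camera pose expressed in the IMU/body frame, so the Kalibr matrix needs to be inverted before pasting it into the config. A rough sketch (the numbers below are placeholders, not a real T265 calibration):

```cpp
// Sketch: convert Kalibr's cam0 T_cam_imu into the body_T_cam0 that
// VINS-Fusion expects, by inverting the rigid transform.
#include <Eigen/Dense>
#include <iostream>

int main()
{
    Eigen::Matrix4d T_cam_imu;          // from Kalibr's camchain-imucam.yaml (placeholder values)
    T_cam_imu << 1, 0, 0, 0.01,
                 0, 1, 0, 0.02,
                 0, 0, 1, 0.03,
                 0, 0, 0, 1;

    // body_T_cam0 = (T_cam_imu)^-1; for a rigid transform this is
    // [R^T, -R^T t; 0 0 0 1], computed here directly by Eigen.
    Eigen::Matrix4d body_T_cam0 = T_cam_imu.inverse();

    std::cout << body_T_cam0 << std::endl;   // paste this into the config file
    return 0;
}
```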
@LuciousZhang After fixing the transformation matrices, the estimate still diverges quickly. It initially follows the camera motion, but only for a brief amount of time, and then diverges with nan values on the screen. Please see this issue; I created a separate one as the problem may also relate to devices other than the T265.
Cheers.
Hi all,
For those still facing the NaN problem when using a fisheye lens: I found that the algorithm produces NaNs when undistorting the points in feature_tracker.cpp. I have added a NaN check to remove the points with NaNs, and now the algorithm can work with the T265 and its default camera params from Intel.
This is the pull request.
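For reference, this is roughly the idea of such a NaN filter: drop any feature whose undistorted coordinates came out as NaN so it never reaches the estimator. This is only a sketch, not the actual pull request; `removeNanPoints` and the vector names are made up for illustration.

```cpp
// Sketch: remove features whose undistorted coordinates are NaN,
// keeping the point, undistorted-point, and id vectors consistent.
#include <cmath>
#include <vector>
#include <opencv2/core.hpp>

void removeNanPoints(std::vector<cv::Point2f>& un_pts,
                     std::vector<cv::Point2f>& pts,
                     std::vector<int>& ids)
{
    std::vector<cv::Point2f> un_pts_ok, pts_ok;
    std::vector<int> ids_ok;
    for (size_t i = 0; i < un_pts.size(); ++i)
    {
        if (std::isnan(un_pts[i].x) || std::isnan(un_pts[i].y))
            continue;                       // skip the corrupted feature
        un_pts_ok.push_back(un_pts[i]);
        pts_ok.push_back(pts[i]);
        ids_ok.push_back(ids[i]);
    }
    un_pts = un_pts_ok;
    pts    = pts_ok;
    ids    = ids_ok;
}
```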
@hridaybavle I cloned the newest code, which includes "assert(id_pts.second[0].first == 0);", but I still get this problem when using the T265 fisheye. Frustrating -.-
Hi guys, was it necessary to turn off the auto-exposure on the T265 for it to work properly?
Hello, should auto-exposure be turned off? Through the RealSense Viewer or something different?
@shubhamwagh When using the KANNALA_BRANDT model, the parameters must be specified as follows:
%YAML:1.0
---
model_type: KANNALA_BRANDT
camera_name: <camera_name_val>
image_width: <image_width_val>
image_height: <image_height_val>
projection_parameters:
   k2: <k2_val>
   k3: <k3_val>
   k4: <k4_val>
   k5: <k5_val>
   mu: <mu_val>
   mv: <mv_val>
   u0: <u0_val>
   v0: <v0_val>
See here for reference.
Unfortunately, the VINS-Fusion code does not do any checking to make sure the user has specified the configuration parameters properly, so if they are not in this format, the distortion parameters will be saved as zeros, which will result in nan values.
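To make those fields concrete, this is roughly how the Kannala-Brandt (equidistant fisheye) projection uses them: k2..k5 are the coefficients of an odd-order polynomial in the incidence angle, and mu, mv, u0, v0 are the generalized focal lengths and principal point. A simplified sketch; `projectKB` and the parameter values are placeholders, not from a real calibration.

```cpp
// Sketch of the KANNALA_BRANDT (equidistant) projection: 3D point -> pixel.
#include <cmath>
#include <Eigen/Dense>
#include <iostream>

Eigen::Vector2d projectKB(const Eigen::Vector3d& P)
{
    const double k2 = 0.01, k3 = 0.001, k4 = 0.0, k5 = 0.0;      // placeholders
    const double mu = 285.0, mv = 285.0, u0 = 424.0, v0 = 400.0; // placeholders

    // Angle between the ray and the optical axis, and its azimuth.
    double theta = std::acos(P.z() / P.norm());
    double phi   = std::atan2(P.y(), P.x());

    // Odd-order polynomial r(theta) = theta + k2*theta^3 + ... + k5*theta^9.
    double t2 = theta * theta;
    double r  = theta * (1.0 + t2 * (k2 + t2 * (k3 + t2 * (k4 + t2 * k5))));

    // Scale by the generalized focal lengths and shift by the principal point.
    return Eigen::Vector2d(mu * r * std::cos(phi) + u0, mv * r * std::sin(phi) + v0);
}

int main()
{
    std::cout << projectKB(Eigen::Vector3d(0.3, 0.1, 1.0)).transpose() << std::endl;
    return 0;
}
```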
You may try this fix: https://github.com/HKUST-Aerial-Robotics/VINS-Fusion/pull/168. Adding the fisheye mask helps reduce the large error at the fisheye edge and improves robustness, even with the KANNALA_BRANDT model.
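If you want to try a fisheye mask, one simple way to generate one (my own approach, not the linked PR) is to write out a white disc on a black background matching the 848x800 T265 frames; the radius below is a guess that you would tune to your lens' usable field of view.

```cpp
// Sketch: generate a circular fisheye mask image with OpenCV.
#include <opencv2/core.hpp>
#include <opencv2/imgproc.hpp>
#include <opencv2/imgcodecs.hpp>

int main()
{
    const int width = 848, height = 800;            // T265 fisheye resolution
    cv::Mat mask = cv::Mat::zeros(height, width, CV_8UC1);

    // White disc = valid image area; black border = ignored by the tracker.
    // The radius (380 px) is an assumption; adjust it for your lens.
    cv::circle(mask, cv::Point(width / 2, height / 2), 380, cv::Scalar(255), -1);

    cv::imwrite("fisheye_mask.png", mask);
    return 0;
}
```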
I met the same problem and solved it.
@hridaybavle I cloned the newest code which includes "assert(id_pts.second[0].first == 0);", but I still get this problem with the T265 fisheye. Frustrating -.- I have also encountered the same problem as you. Have you solved it by now?
Hi! I want to compare the T265's visual odometry against VINS-Fusion, as I don't have ground truth for my experiments. I created a config file similar to the d435i one for my T265 RealSense stereo fisheye camera, using the KANNALA_BRANDT camera model.
First I start my RealSense T265 camera launch. Then, on running the VINS-Fusion node, I get nan values.
May I know what is going wrong?
Following are my config files: left.yaml
right.yaml
realsense_stereo_imu_config.yaml
rs_camera.launch
Any insight will be appreciated. Thanks.