anishgollakota opened 4 months ago
Just right off the bat, all your matrices are identity, which they should not be after performing the Kalibr steps correctly. Maybe double-check again?
Not all of them. I assumed that there is no rotation and that this can be further calibrated online. I adjusted my new kalibr_imucam_chain.yaml file:

```yaml
%YAML:1.0
cam0:
  T_imu_cam:   # rotation from camera to IMU R_CtoI, position of camera in IMU p_CinI
```
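If the IMU points forward and the camera points straight down, `R_CtoI` cannot be identity even as an initial guess. Purely as an illustration (the axis assignments below assume one particular mounting on an FRD body with zero lever arm; they are placeholders, not a calibration result, and the real values must come from Kalibr):

```yaml
%YAML:1.0
cam0:
  T_imu_cam: !!opencv-matrix   # rotation from camera to IMU R_CtoI, position of camera in IMU p_CinI
    rows: 4
    cols: 4
    dt: d
    # Camera x-axis along body y (right), camera y-axis along body -x (back),
    # camera optical axis (z) along body z (down) -- one plausible downward mount.
    data: [ 0.0, -1.0, 0.0, 0.0,
            1.0,  0.0, 0.0, 0.0,
            0.0,  0.0, 1.0, 0.0,
            0.0,  0.0, 0.0, 1.0 ]
```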
My scenario: I have a drone that must travel at high speeds. The drone has an IMU pointing forward and a camera pointing downwards, looking at the ground.
It isn't clear to me how you are getting IMU information (angular velocity and linear acceleration). A reasonable calibration is needed for both the extrinsics and intrinsics of both sensors (inertial and camera). There is a whole guide on how to do this here: https://docs.openvins.com/gs-calibration.html
Additionally, you need good timestamps for both sensors; from what I have seen of PX4, it isn't clear how you would get the camera and IMU into the same clock frame.
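One common workaround, assuming the PX4 stamps are microseconds since boot and you can observe each message's arrival time on the ROS clock, is to estimate a constant offset and shift every stamp into the ROS frame. A sketch (`estimate_offset` and `px4_to_ros_time` are illustrative helpers, not part of any PX4 or OpenVINS API, and a median over arrival-time differences only coarsely rejects transport jitter):

```python
def estimate_offset(pairs):
    """Estimate the clock offset in seconds from (ros_arrival_s, px4_us)
    pairs recorded when each message arrives; the median rejects the
    occasional delayed sample."""
    diffs = sorted(t_ros - t_px4 * 1e-6 for t_ros, t_px4 in pairs)
    return diffs[len(diffs) // 2]

def px4_to_ros_time(px4_us, offset_s):
    """Map a PX4 monotonic timestamp (microseconds since boot) into the
    ROS clock frame using the pre-estimated offset."""
    return px4_us * 1e-6 + offset_s
```

This only corrects a constant offset; if the two clocks drift relative to each other over a long flight, the offset would need to be re-estimated or fit with a slope as well.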
If you are able to calibrate the camera + IMU pair, then you can expect to do state estimation; if you are unable to, then you should not expect to be able to run VIO / SLAM on the sensors you have. I would try using Kalibr to do the calibration first and go from there.
Hi,
I am running VIO on a custom dataset, but it is diverging to infinity. My scenario is that I have a drone with a downward-facing camera. I am running this drone using PX4, so I am attempting to convert the PX4 data messages VehicleAcceleration, VehicleAttitude, and VehicleAngularVelocity to sensor_msgs/Imu for OpenVINS. I set the covariance to 0 for each of these measurements.
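One frame-convention pitfall in that conversion: PX4 reports body rates and accelerations in FRD (front-right-down), while ROS convention (REP-103) is FLU (front-left-up), so the y and z components must be negated. A minimal sketch of the field mapping, kept free of ROS dependencies so it can be checked standalone (the `xyz` field layout is assumed from px4_msgs VehicleAngularVelocity / VehicleAcceleration; verify against your px4_msgs version, and in a real node you would fill a `sensor_msgs.msg.Imu` instead of a dict):

```python
def px4_imu_to_flu(ts_us, gyro_frd, accel_frd):
    """Map PX4 FRD gyro/accel samples into an FLU sensor_msgs/Imu-like
    structure: negate y and z, and convert the microsecond stamp to
    seconds. Illustrative sketch, not tested against a live PX4 stack."""
    gx, gy, gz = gyro_frd
    ax, ay, az = accel_frd
    return {
        "stamp_s": ts_us * 1e-6,                 # PX4 stamps are microseconds
        "angular_velocity": (gx, -gy, -gz),      # FRD -> FLU sign flips
        "linear_acceleration": (ax, -ay, -az),
    }
```

Note also that, as far as the filter is concerned, OpenVINS takes its IMU noise densities from estimator_config.yaml rather than from the message covariance fields, so zeroed covariances in the converted messages are not by themselves what drives divergence.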
Here is my estimator_config.yaml:
The kalibr_imu_chain.yaml:
The kalibr_imucam_chain.yaml:
@goldbattle or anyone else who can help: any suggestions given my scenario? Is the ground truth messing the trajectory up?