ethz-asl / rovio


live rovio diverging from the start #70

Closed jasonwurzel closed 8 years ago

jasonwurzel commented 8 years ago

Hi,

when I run rovio with the provided test data (Machine Hall + the provided yaml configs for the camera), everything works very well and precisely. I'm still waiting for my global shutter camera, so I tried to make rovio run with my Logitech HD Pro C920 webcam (+ Tinkerforge IMU Brick 2.0). But right from the start rovio is totally lost. Please see the video below for a demonstration. I calibrated the system using kalibr with the usual steps. To track down the problem I also changed some parameters (like the IMU-camera orientation) to deliberately wrong values, but there was no change. Even when using a dummy IMU node that publishes only zero values, the result is the same.

https://www.youtube.com/watch?v=72EdG0ik-eY

Has anybody encountered this and can point me in a general direction? Of course I can provide a bag file with some sample data. Thanks!

mhkabir commented 8 years ago

Can you try an older commit from January, and see if it works better for you?

jasonwurzel commented 8 years ago

I already rolled back to commit https://github.com/ethz-asl/rovio/commit/02753db02854549c48fc3a5850a60b29feaf3dd1 ... nearly no change, except for the performance gains others have reported in comparison to the current version.

bloesch commented 8 years ago

Just a first comment on using the 0 values only: you will get free fall, since when stationary you should measure gravity (this should be 9.81 m/s^2 upwards).
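bloesch's point can be turned into a quick sanity check on recorded data: a stationary IMU publishing raw specific force (what rovio expects) reports an acceleration norm near g, while a gravity-subtracted stream sits near zero. A minimal sketch in plain Python with no ROS dependency; `looks_gravity_aligned` is a hypothetical helper, not part of rovio:

```python
import math

GRAVITY = 9.81  # magnitude of gravity [m/s^2]

def looks_gravity_aligned(accel_samples, tol=0.5):
    """True if the mean acceleration norm of a *stationary* IMU is close
    to g, i.e. the driver publishes raw specific force (gravity included),
    which is what rovio expects."""
    n = len(accel_samples)
    mean = [sum(s[i] for s in accel_samples) / n for i in range(3)]
    norm = math.sqrt(sum(c * c for c in mean))
    return abs(norm - GRAVITY) < tol

# A raw stationary IMU reports ~g; a gravity-subtracted one reports ~0.
raw = [(0.02, -0.01, 9.80)] * 100
filtered = [(0.0, 0.0, 0.0)] * 100
print(looks_gravity_aligned(raw))       # True
print(looks_gravity_aligned(filtered))  # False
```

Running this against a short stationary segment of a bag is a fast way to catch a driver that pre-filters gravity out.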

jasonwurzel commented 8 years ago

Oh, good advice, because for the IMU I am publishing the filtered acceleration (gravity subtracted). I'll try that right away...

jasonwurzel commented 8 years ago

OK, now at least it's stable for the first 2 s, then the position accelerates in one horizontal direction. I'll calibrate again with the changed IMU node that publishes the unfiltered acceleration...

jasonwurzel commented 8 years ago

Hi, maybe somebody could help me, as I'm a bit at a loss here. I can't think of anything other than waiting for my camera from Matrix Vision to arrive... Below are my rovio.info file and the camera configuration file I'm using (both generated by kalibr):

; You can partially override parameter set in this file by creating your own subset of parameter in a separate info-file and include it with:
; #include "/home/user/workspace/rovio/cfg/rovio_custom.info"
Common
{
    doVECalibration true;       Should the camera-IMU extrinsics be calibrated online
    depthType 1;                Type of depth parametrization (0: normal, 1: inverse depth, 2: log, 3: hyperbolic)
    verbose false;              Is the verbose active
}
Camera0
{
  CalibrationFile  ;            Camera-Calibration file for intrinsics
  qCM_x  -0.621873479136;       X-entry of IMU to Camera quaternion (JPL)
  qCM_y  0.105370425312;        Y-entry of IMU to Camera quaternion (JPL)
  qCM_z  -0.00530154721573;     Z-entry of IMU to Camera quaternion (JPL)
  qCM_w  0.775978313495;        W-entry of IMU to Camera quaternion (JPL)

  MrMC_x -0.0197733763816;      X-entry of IMU to Camera vector (expressed in IMU CF) [m]
  MrMC_y -0.185821668478;       Y-entry of IMU to Camera vector (expressed in IMU CF) [m]
  MrMC_z -0.0393655046491;      Z-entry of IMU to Camera vector (expressed in IMU CF) [m]
}

Init
{
    State
    {
        pos_0 0;            X-entry of initial position (world to IMU in world) [m]
        pos_1 0;            Y-entry of initial position (world to IMU in world) [m]
        pos_2 0;            Z-entry of initial position (world to IMU in world) [m]
        vel_0 0;            X-entry of initial velocity (robocentric, IMU) [m/s]
        vel_1 0;            Y-entry of initial velocity (robocentric, IMU) [m/s]
        vel_2 0;            Z-entry of initial velocity (robocentric, IMU) [m/s]
        acb_0 0;            X-entry of accelerometer bias [m/s^2]
        acb_1 0;            Y-entry of accelerometer bias [m/s^2]
        acb_2 0;            Z-entry of accelerometer bias [m/s^2]
        gyb_0 0;            X-entry of gyroscope bias [rad/s]
        gyb_1 0;            Y-entry of gyroscope bias [rad/s]
        gyb_2 0;            Z-entry of gyroscope bias [rad/s]
        att_x 0;            X-entry of initial attitude (IMU to world, JPL)
        att_y 0;            Y-entry of initial attitude (IMU to world, JPL)
        att_z 0;            Z-entry of initial attitude (IMU to world, JPL)
        att_w 1;            W-entry of initial attitude (IMU to world, JPL)
    }
    Covariance
    {
        pos_0 0.0001;       X-Covariance of initial position [m^2]
        pos_1 0.0001;       Y-Covariance of initial position [m^2]
        pos_2 0.0001;       Z-Covariance of initial position [m^2]
        vel_0 1.0;          X-Covariance of initial velocity [m^2/s^2]
        vel_1 1.0;          Y-Covariance of initial velocity [m^2/s^2]
        vel_2 1.0;          Z-Covariance of initial velocity [m^2/s^2]
        acb_0 4e-4;         X-Covariance of initial accelerometer bias [m^2/s^4]
        acb_1 4e-4;         Y-Covariance of initial accelerometer bias [m^2/s^4]
        acb_2 4e-4;         Z-Covariance of initial accelerometer bias [m^2/s^4]
        gyb_0 3e-4;         X-Covariance of initial gyroscope bias [rad^2/s^2]
        gyb_1 3e-4;         Y-Covariance of initial gyroscope bias [rad^2/s^2]
        gyb_2 3e-4;         Z-Covariance of initial gyroscope bias [rad^2/s^2]
        vep 0.0001;         Covariance of initial linear camera-IMU extrinsics, same for all entries [m^2]
        att_0 0.1;          X-Covariance of initial attitude [rad^2]
        att_1 0.1;          Y-Covariance of initial attitude [rad^2]
        att_2 0.1;          Z-Covariance of initial attitude [rad^2]
        vea 0.0001;         Covariance of initial rotational camera-IMU extrinsics, same for all entries [rad^2]
    }
}
ImgUpdate
{
    doPatchWarping true;                                        Should the patches be warped
    doFrameVisualisation true;                                  Should the frames be visualized
    visualizePatches true;                                      Should the patches be visualized
    useDirectMethod true;                                       Should the EKF-innovation be based on direct intensity error (o.w. reprojection error)
    startLevel 3;                                               Highest patch level which is being employed (must be smaller than the hardcoded template parameter)
    endLevel 1;                                                 Lowest patch level which is being employed
    nDetectionBuckets 100;                                      Number of discretization buckets used during the candidates selection
    MahalanobisTh 0.1;                                          Mahalanobis threshold for the update, 5.8858356
    UpdateNoise
    {
        nor 1e-5;                                               Covariance used for the reprojection error, 1/focal_length is roughly 1 pixel std [rad] (~1/f ~ 1/400^2 = 1/160000)
        int 100;                                                Covariance used for the photometric error [intensity^2]
    }
    initCovFeature_0 0.5;                                       Covariance for the initial distance (in the selected parametrization)
    initCovFeature_1 1e-5;                                      Covariance for the initial bearing vector, x-component [rad]
    initCovFeature_2 1e-5;                                      Covariance for the initial bearing vector, y-component [rad]
    initDepth 0.5;                                              Initial value for the initial distance parameter
    startDetectionTh 0.8;                                       Threshold for detecting new features (min: 0, max: 1)
    scoreDetectionExponent 0.25;                                Exponent used for weighting the distance between candidates
    penaltyDistance 100;                                        Maximal distance which gets penalized during bucketing [pix]
    zeroDistancePenalty 100;                                    Penalty for zero distance (max: nDetectionBuckets)
    statLocalQualityRange 10;                                   Number of frames for local quality evaluation
    statLocalVisibilityRange 100;                               Number of frames for local visibility evaluation
    statMinGlobalQualityRange 100;                              Minimum number of frames for obtaining maximal global quality
    trackingUpperBound 0.9;                                     Threshold for local quality for min overall global quality
    trackingLowerBound 0.1;                                     Threshold for local quality for max overall global quality
    minTrackedAndFreeFeatures 0.75;                             Minimum of amount of feature which are either tracked or free
    removalFactor 1.1;                                          Factor for enforcing feature removal if not enough free
    minRelativeSTScore 0.75;                                    Minimum relative ST-score for extracting new feature patch
    minAbsoluteSTScore 5.0;                                     Minimum absolute ST-score for extracting new feature patch
    minTimeBetweenPatchUpdate 1.0;                              Minimum time between new multilevel patch extraction [s]
    fastDetectionThreshold 5;                                   FAST corner detector threshold while adding new features
    alignConvergencePixelRange 1;                               Assumed convergence range for image alignment (gets scaled with the level) [pixels]
    alignCoverageRatio 2;                                       How many sigma of the uncertainty should be covered in the adaptive alignment
    alignMaxUniSample 1;                                        Maximal number of alignment seeds on one side -> total number of samples = 2n+1. Careful, can get very costly if diverging!
    matchingPixelThreshold 0.0;                                 Below this threshold no pre-alignment is carried out [pixels]
    minNoAlignment 5;                                           Minimal number of alignments every feature must go through
    specialLinearizationThreshold 0.0;                          Below this threshold no special linearization point gets computed (1 pixel ~ 0.0025) [rad]
    patchRejectionTh -1;                                        If the average intensity error is larger than this the feature is rejected [intensity]; if smaller than 0, no check is performed
    bearingVectorMahalTh 9.21;                                  Threshold for the aligned patch to be accepted as inlier
    removeNegativeFeatureAfterUpdate true;                      Should features with negative distance get removed
    maxUncertaintyToDepthRatioForDepthInitialization 0.3;       If set to 0.0 the depth is initialized with the standard value provided above, otherwise ROVIO attempts to figure out a median depth in each frame
    useCrossCameraMeasurements true;                            Should cross measurements between frames be used. Might be turned off in calibration phase.
    doStereoInitialization true;                                Should a stereo match be used for feature initialization.
    MotionDetection
    {
        isEnabled 0;                                            Is the motion detection enabled
        rateOfMovingFeaturesTh 0.5;                             Amount of feature with motion for overall motion detection
        pixelCoordinateMotionTh 1.0;                            Threshold for motion detection for patched [pixels]
        minFeatureCountForNoMotionDetection 5;                  Min feature count in frame for motion detection
    }
    ZeroVelocityUpdate
    {
        UpdateNoise
        {
            vel_0 0.01;                                         X-Covariance of zero velocity update [m^2/s^2]
            vel_1 0.01;                                         Y-Covariance of zero velocity update [m^2/s^2]
            vel_2 0.01;                                         Z-Covariance of zero velocity update [m^2/s^2]
        }
        MahalanobisTh0 7.689997599999999;                       Mahalanobis distance for zero velocity updates
        minNoMotionTime 1.0;                                    Min duration of no-motion
        isEnabled 0;                                            Should zero velocity update be applied, only works if MotionDetection.isEnabled is true
    }
}
Prediction
{
    PredictionNoise
    {
        pos_0 1e-4;                             X-covariance parameter of position prediction [m^2/s]
        pos_1 1e-4;                             Y-covariance parameter of position prediction [m^2/s]
        pos_2 1e-4;                             Z-covariance parameter of position prediction [m^2/s]
        vel_0 4e-5; 4e-6                        X-covariance parameter of velocity prediction [m^2/s^3]
        vel_1 4e-5;                             Y-covariance parameter of velocity prediction [m^2/s^3]
        vel_2 4e-5;                             Z-covariance parameter of velocity prediction [m^2/s^3]
        acb_0 1e-8;                             X-covariance parameter of accelerometer bias prediction [m^2/s^5]
        acb_1 1e-8;                             Y-covariance parameter of accelerometer bias prediction [m^2/s^5]
        acb_2 1e-8;                             Z-covariance parameter of accelerometer bias prediction [m^2/s^5]
        gyb_0 3.8e-7;                           X-covariance parameter of gyroscope bias prediction [rad^2/s^3]
        gyb_1 3.8e-7;                           Y-covariance parameter of gyroscope bias prediction [rad^2/s^3]
        gyb_2 3.8e-7;                           Z-covariance parameter of gyroscope bias prediction [rad^2/s^3]
        vep 1e-4;                               Covariance parameter of linear extrinsics prediction [m^2/s]
        att_0 7.6e-6; 7.6e-7                    X-Covariance parameter of attitude prediction [rad^2/s]
        att_1 7.6e-6;                           Y-Covariance parameter of attitude prediction [rad^2/s]
        att_2 7.6e-6;                           Z-Covariance parameter of attitude prediction [rad^2/s]
        vea 1e-6;                               Covariance parameter of rotational extrinsics prediction [rad^2/s]
        dep 0.0001;                         Covariance parameter of distance prediction [m^2/s]
        nor 0.00001;                                Covariance parameter of bearing vector prediction [rad^2/s]
    }
    MotionDetection
    {
        inertialMotionRorTh 0.1;                Threshold on rotational rate for motion detection [rad/s]
        inertialMotionAccTh 0.5;                Threshold on acceleration for motion detection [m/s^2]
    }
}
PoseUpdate
{
    UpdateNoise
    {
        pos_0 0.01;                         X-Covariance of linear pose measurement update [m^2]
        pos_1 0.01;                         Y-Covariance of linear pose measurement update [m^2]
        pos_2 0.01;                         Z-Covariance of linear pose measurement update [m^2]
        att_0 0.01;                         X-Covariance of rotational pose measurement update [rad^2]
        att_1 0.01;                         Y-Covariance of rotational pose measurement update [rad^2]
        att_2 0.01;                         Z-Covariance of rotational pose measurement update [rad^2]
    }
    init_cov_IrIW 1;                            Covariance of initial pose between inertial frames, linear part [m^2]
    init_cov_qWI 1;                             Covariance of initial pose between inertial frames, rotational part [rad^2]
    init_cov_MrMV 1;                            Covariance of initial pose between inertial frames, linear part [m^2]
    init_cov_qVM 1;                             Covariance of initial pose between inertial frames, rotational part [rad^2]
    pre_cov_IrIW 1e-4;                          Covariance parameter of pose between inertial frames, linear part [m^2/s]
    pre_cov_qWI 1e-4;                           Covariance parameter of pose between inertial frames, rotational part [rad^2/s]
    pre_cov_MrMV 1e-4;                          Covariance parameter of pose between inertial frames, linear part [m^2/s]
    pre_cov_qVM 1e-4;                           Covariance parameter of pose between inertial frames, rotational part [rad^2/s]
    MahalanobisTh0 12.6511204;                  Mahalanobis distance threshold of pose measurement
    doVisualization false;                      Should the measured poses be visualized
    enablePosition true;                        Should the linear part be used during the update
    enableAttitude true;                        Should the rotation part be used during the update (e.g. set to false if feeding GPS measurements)
    noFeedbackToRovio true;                     By default the pose update is only used for aligning coordinate frames. Set to false if ROVIO should benefit from the pose measurements.
    doInertialAlignmentAtStart true;            Should the transformation between I and W be explicitly computed and set with the first pose measurement.
    timeOffset 0.0;                             Time offset added to the pose measurement timestamps
    qVM_x 0;                                    X-entry of quaternion representing IMU to reference body coordinate frame of pose measurement (JPL)
    qVM_y 0;                                    Y-entry of quaternion representing IMU to reference body coordinate frame of pose measurement (JPL)
    qVM_z 0;                                    Z-entry of quaternion representing IMU to reference body coordinate frame of pose measurement (JPL)
    qVM_w 1;                                    W-entry of quaternion representing IMU to reference body coordinate frame of pose measurement (JPL)
    MrMV_x 0;                                   X-entry of vector representing IMU to reference body coordinate frame of pose measurement
    MrMV_y 0;                                   Y-entry of vector representing IMU to reference body coordinate frame of pose measurement
    MrMV_z 0;                                   Z-entry of vector representing IMU to reference body coordinate frame of pose measurement
    qWI_x 0;                                    X-entry of quaternion representing World to reference inertial coordinate frame of pose measurement (JPL)
    qWI_y 0;                                    Y-entry of quaternion representing World to reference inertial coordinate frame of pose measurement (JPL)
    qWI_z 0;                                    Z-entry of quaternion representing World to reference inertial coordinate frame of pose measurement (JPL)
    qWI_w 1;                                    W-entry of quaternion representing World to reference inertial coordinate frame of pose measurement (JPL)
    IrIW_x 0;                                   X-entry of vector representing World to reference inertial coordinate frame of pose measurement
    IrIW_y 0;                                   Y-entry of vector representing World to reference inertial coordinate frame of pose measurement
    IrIW_z 0;                                   Z-entry of vector representing World to reference inertial coordinate frame of pose measurement
}
image_width: 800
image_height: 600
camera_name: cam0
camera_matrix:
  rows: 3
  cols: 3
  data: [774.216545383, 0.0, 390.043939379, 0.0, 777.077483049, 293.902685839, 0.0, 0.0, 1.0]
distortion_model: equidistant
distortion_coefficients:
  rows: 1
  cols: 4
  data: [0.434872119883, 0.00364157636076, 0.363743926845, -1.67527377644]

The bag file I used for camera-IMU calibration is still uploading; I'll post a link later. I'd be SO thankful for any hints, this is already taking me much too long.

helenol commented 8 years ago

Quick question: how well did the intrinsic calibration work? Those look like huge coefficients for the equidistant model (our cameras usually have a maximum magnitude of 0.01 for any of the parameters). Could you try with a radtan undistortion? Or is it a huge FoV?

jasonwurzel commented 8 years ago

so, here's a link to the bag file: https://1drv.ms/u/s!Arpvxlo7zlhziNNohhaiQut12qXRcQ

@helenol thanks for the tip. So I tried the calibration again with pinhole-radtan:

image_width: 800
image_height: 600
camera_name: cam0
camera_matrix:
  rows: 3
  cols: 3
  data: [769.746276892, 0.0, 390.134873144, 0.0, 776.105936467, 299.509564412, 0.0, 0.0, 1.0]
distortion_model: plumb_bob
distortion_coefficients:
  rows: 1
  cols: 5
  data: [0.108740499693, -0.204898397327, -0.00408912036113, 0.00113559923836, 0]

The diagonal FOV is specified as 78°, so not that huge. With the above cam.yaml configuration there is no change.

jasonwurzel commented 8 years ago

OK, I feel like a spammer in my own thread ;-) So once again I modified my IMU node to publish 0 values only (except for the gravity vector, in my case on the z axis). And voilà, the filter is totally stable. Of course, when I move the camera it diverges, because the observed movements don't match the IMU "measurements". So it must be something with my IMU node.

jasonwurzel commented 8 years ago

One interesting thing: I just changed the parameters for the covariance matrices of orientation, angular velocity, and linear acceleration published by my imu_node. I compared the IMU messages from one of the Machine Hall datasets with mine. For the messages recorded in the Machine Hall bag, the covariances were:

orientation_covariance: [99999.9, 0.0, 0.0, 0.0, 99999.9, 0.0, 0.0, 0.0, 99999.9]
angular_velocity_covariance: [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0]
linear_acceleration_covariance: [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0]

Additionally, the orientation was 0, 0, 0, 1.

For my messages it was:

orientation_covariance: [-1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0]
angular_velocity_covariance: [-1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0]
linear_acceleration_covariance: [-1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0]

and my published orientation was an actual estimate:

  x: -0.014710370506
  y: 0.0164194591955
  z: 0.0
  w: 0.999816883355

After I changed my imu_node to publish the values the same way (no orientation, covariances the same as in the MH dataset), the filter seems stable while stationary. BUT as you can see in the video below, as soon as I tap the desk the system is lying on, or even tap the system itself, the filter goes out of bounds very quickly: https://www.youtube.com/watch?v=X7simma-mjU&feature=youtu.be
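For reference, the difference observed above matches the sensor_msgs/Imu message convention: setting element 0 of a covariance matrix to -1 marks that measurement as "no estimate available", while all-zeros means "covariance unknown". A ROS-free sketch of that convention (the `has_estimate` helper is illustrative, not part of any ROS API):

```python
NO_ESTIMATE = -1.0  # sensor_msgs/Imu: covariance[0] == -1 -> field unavailable

def has_estimate(covariance_3x3_row_major):
    """Per the sensor_msgs/Imu message definition, a -1 in the first
    element of a covariance matrix flags that measurement as absent."""
    return covariance_3x3_row_major[0] != NO_ESTIMATE

# The Machine Hall bags mark orientation as present but essentially
# unknown (huge variance); the custom node marked it as unavailable.
mh = [99999.9, 0, 0, 0, 99999.9, 0, 0, 0, 99999.9]
custom = [-1.0, 0, 0, 0, 0, 0, 0, 0, 0]
print(has_estimate(mh))      # True
print(has_estimate(custom))  # False
```

As noted in the following reply, rovio only consumes the raw rate and acceleration fields, so this distinction matters to other consumers of the topic rather than to the filter itself.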

bloesch commented 8 years ago

The filter should not be reading the covariances or the estimated attitude from the IMU msgs - it only processes the acceleration and rotational rate measurements. If rovio diverges once you move the sensor, this is typically due to inaccurate calibration of the extrinsics/intrinsics or to bad time alignment of the measurements. Have you tried inverting the quaternion qCM (flipping the sign of w)?

Concerning the time alignment of IMU and camera: how are you synchronizing the msgs? It is very important that the timestamps in the headers of the IMU and image msgs are expressed with respect to the same clock.
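One cheap offline check of this common-clock requirement is to compare header stamps across the two topics in a recorded bag: if every image stamp has an IMU stamp within a few ms, the streams at least share a time base (this cannot detect a constant offset of both clocks against the true capture time). A sketch with hypothetical stamp lists extracted from a bag:

```python
def max_nearest_gap(image_stamps, imu_stamps):
    """For each image stamp, the distance [s] to the nearest IMU stamp;
    a large maximum hints at misaligned clocks between the two drivers."""
    return max(min(abs(t - s) for s in imu_stamps) for t in image_stamps)

# Hypothetical stamps: 200 Hz IMU, 20 Hz camera with a 2 ms skew.
imu = [i * 0.005 for i in range(200)]
imgs = [i * 0.05 + 0.002 for i in range(20)]
print(max_nearest_gap(imgs, imu))  # ~0.002
```

Anything in the tens of milliseconds here would be a red flag given the sensitivity to timing discussed below in this thread.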

cynosure4sure commented 8 years ago

@jasonwurzel I had a similar problem where rovio diverged from the start (although a bit slower than in your video). What worked for me was increasing the covariances for the IMU under PredictionNoise and Init. The VI-Sensor has a high-quality IMU, but I was using a cheap IMU, so I had to adjust the covariances accordingly. Hope it works for you.

jasonwurzel commented 8 years ago

Thank you for the responses! @bloesch yes, that must have been a coincidence (concerning my change of not publishing orientation etc.), because first I tested with my calibration bag, which immediately starts with movement, whereas later I tested on my live system lying stationary... Regarding the timing: both the camera and the IMU node get their timestamps from the ROS clock. It doesn't matter that the measurements are not exactly synchronous, does it? Regarding qCM: yes, I already tried that.

@cynosure4sure thanks for the hint. To what values did you raise the covariances and what IMU were you using? I'm using a Bosch BNO055

bloesch commented 8 years ago

I think people have managed to run rovio without exactly synchronized measurements. But the performance will drop quickly if the timing inaccuracies are larger than a couple of ms (this depends on the actual motion you have).

@cynosure4sure added a good comment. Maybe start by trying to increase the accelerometer covariance (PredictionNoise.vel) and the gyroscope covariance (PredictionNoise.att) by a factor of 100. Depending on the behavior of the biases, you might want to increase the corresponding terms as well (range of biases in Init.Covariance.acb/gyb and rate of bias change in PredictionNoise.acb/gyb).
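As a concrete reading of this suggestion, applied to the PredictionNoise defaults from the info file posted earlier in the thread (a tuning sketch, not values validated for any particular IMU; `inflate` is a made-up helper):

```python
# PredictionNoise defaults from the info file above.
defaults = {
    "vel": 4e-5,    # accelerometer noise -> velocity prediction [m^2/s^3]
    "att": 7.6e-6,  # gyroscope noise -> attitude prediction [rad^2/s]
    "acb": 1e-8,    # accelerometer bias random walk [m^2/s^5]
    "gyb": 3.8e-7,  # gyroscope bias random walk [rad^2/s^3]
}

def inflate(params, factor=100, keys=("vel", "att")):
    """Scale the selected covariance parameters by `factor`, leave the rest."""
    return {k: (v * factor if k in keys else v) for k, v in params.items()}

tuned = inflate(defaults)          # vel -> 4e-3, att -> 7.6e-4
print(tuned["vel"], tuned["att"])  # biases unchanged unless listed in `keys`
```

The bias terms (acb/gyb) would only be added to `keys` if the filter still drifts after the first round of tuning, per the advice above.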

cynosure4sure commented 8 years ago

@jasonwurzel I am using a UM7 by CH Robotics. In my experience, for drift from the beginning the covariance values under Init had more of an effect. Anyway, my adjusted values for this sensor are as follows:

Init
{
    Covariance
    {
        pos_0 0.001;        X-Covariance of initial position [m^2]
        pos_1 0.001;        Y-Covariance of initial position [m^2]
        pos_2 0.001;        Z-Covariance of initial position [m^2]
        vel_0 1.0;          X-Covariance of initial velocity [m^2/s^2]
        vel_1 1.0;          Y-Covariance of initial velocity [m^2/s^2]
        vel_2 1.0;          Z-Covariance of initial velocity [m^2/s^2]
        acb_0 4e-3;         X-Covariance of initial accelerometer bias [m^2/s^4]
        acb_1 4e-3;         Y-Covariance of initial accelerometer bias [m^2/s^4]
        acb_2 4e-3;         Z-Covariance of initial accelerometer bias [m^2/s^4]
        gyb_0 3e-3;         X-Covariance of initial gyroscope bias [rad^2/s^2]
        gyb_1 3e-3;         Y-Covariance of initial gyroscope bias [rad^2/s^2]
        gyb_2 3e-3;         Z-Covariance of initial gyroscope bias [rad^2/s^2]
        vep 0.001;          Covariance of initial linear camera-IMU extrinsics, same for all entries [m^2]
        att_0 0.1;          X-Covariance of initial attitude [rad^2]
        att_1 0.1;          Y-Covariance of initial attitude [rad^2]
        att_2 0.1;          Z-Covariance of initial attitude [rad^2]
        vea 0.001;          Covariance of initial rotational camera-IMU extrinsics, same for all entries [rad^2]
    }
}

PredictionNoise
{
    pos_0 1e-4;             X-covariance parameter of position prediction [m^2/s]
    pos_1 1e-4;             Y-covariance parameter of position prediction [m^2/s]
    pos_2 1e-4;             Z-covariance parameter of position prediction [m^2/s]
    vel_0 6.25e-3;          X-covariance parameter of velocity prediction [m^2/s^3]
    vel_1 6.25e-3;          Y-covariance parameter of velocity prediction [m^2/s^3]
    vel_2 6.25e-3;          Z-covariance parameter of velocity prediction [m^2/s^3]
    acb_0 2.025e-7;         X-covariance parameter of accelerometer bias prediction [m^2/s^5]
    acb_1 2.025e-7;         Y-covariance parameter of accelerometer bias prediction [m^2/s^5]
    acb_2 2.025e-7;         Z-covariance parameter of accelerometer bias prediction [m^2/s^5]
    gyb_0 1.5625e-7;        X-covariance parameter of gyroscope bias prediction [rad^2/s^3]
    gyb_1 1.5625e-7;        Y-covariance parameter of gyroscope bias prediction [rad^2/s^3]
    gyb_2 1.5625e-7;        Z-covariance parameter of gyroscope bias prediction [rad^2/s^3]
    vep 1e-4;               Covariance parameter of linear extrinsics prediction [m^2/s]
    att_0 2.5e-5;           X-Covariance parameter of attitude prediction [rad^2/s]
    att_1 2.5e-5;           Y-Covariance parameter of attitude prediction [rad^2/s]
    att_2 2.5e-5;           Z-Covariance parameter of attitude prediction [rad^2/s]
    vea 1e-6;               Covariance parameter of rotational extrinsics prediction [rad^2/s]
    dep 0.0001;             Covariance parameter of distance prediction [m^2/s]
    nor 0.00001;            Covariance parameter of bearing vector prediction [rad^2/s]
}

jasonwurzel commented 8 years ago

Thank you for the comments! In the meantime my nice BlueFox camera has arrived, so I'll try with an approximation of cynosure4sure's parameters and the new camera. @bloesch: what would be a sufficiently accurate IMU with good characteristics for rovio, in your opinion?

mhkabir commented 8 years ago

@jasonwurzel I've found that rovio works great even with the cheapest of IMUs, e.g. MPU9250, MPU6000, etc., as long as the camera-IMU sync is precise.

jasonwurzel commented 8 years ago

@mhkabir how do you sync your messages, in software/ros node? or low-level/hardware?

mhkabir commented 8 years ago

Both. I designed an interface in PX4 Autopilot for the hardware triggering : http://dev.px4.io/advanced-camera-trigger.html

bloesch commented 8 years ago

I have been working with VI-Sensor-coupled IMUs so far and do not have too much experience with other setups. However, others have tried different setups and it seemed to work nicely (@mhkabir thanks for the comments). I also agree that proper hardware time synchronization will make a bigger difference in terms of accuracy/robustness than a high-end IMU.

jasonwurzel commented 8 years ago

Hi everybody, it's been a while. I think I found the most significant of the errors I made: my IMU node output the angular rate in degrees, not radians. That also explains the behavior that sometimes I could get a stable initialization, but as soon as I touched the sensors everything went overboard, because the angular measurements were much too high. So now I have a very stable system that outputs the orientation very precisely... but: movement hardly gets registered at all. When I move the sensor 50 cm, the visualisation stays (almost) still. Also to be observed: in the scene the features are right in front of the IMU-camera (even inside the displayed coordinate system). My first guess is that my accelerometer biases are now much too high...
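The unit bug described here is worth a guard in any custom IMU node: sensor_msgs/Imu specifies angular velocity in rad/s, so a driver that reports deg/s must convert before publishing. A minimal sketch (the function name is made up):

```python
import math

def gyro_deg_to_rad(angular_velocity_deg):
    """Convert a gyro sample from deg/s to the rad/s that
    sensor_msgs/Imu (and hence rovio) expects."""
    return tuple(math.radians(w) for w in angular_velocity_deg)

# A 90 deg/s turn is ~1.571 rad/s; feeding the raw value 90 into the
# filter looks like an impossibly fast spin, matching the divergence
# on the slightest touch described above.
rate = gyro_deg_to_rad((90.0, 0.0, 0.0))
print(round(rate[0], 3))  # 1.571
```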

bloesch commented 8 years ago

Some comments on your last thoughts: the sensor motion should definitely get registered, and the features are not supposed to lie within the coordinate frame (except if your observed scene is very close). If I am not wrong, the spacing of the grid is 1 m.

The convergence of the accelerometer biases depends on the provided covariance parameters in the info file and on the excitation of the system. I am not too sure if badly tuned accelerometer covariance values can lead to the described behavior. If you excite the system, the covariances should come down eventually and you should start observing motion (even if you chose large covariance parameters). Best is to stay near the original configuration and adapt as few parameters as possible (for you this would mainly involve the IMU-related covariance parameters).

jasonwurzel commented 8 years ago

Well, it got registered, but it looked like a false scaling. Further down my journey, I improved the timing a lot; now the difference between the IMU and camera messages is < 0.1 ms. The mediocre timing, as it was before, could have led to wrongly calculated camera-IMU extrinsics (doVECalibration), and that in turn could have led to the feature depths being incorrectly estimated. Could that be? Even when I moved the system for 30 s, it still did not correct the wrong estimates from before. Anyway, with the (more) synced timings things look a lot better now; the features' displayed distances look much more correct. Now I'm just waiting for my Epson G364, then things should be even more precise =) Thank you a lot for your support!

bloesch commented 8 years ago

Bad timing definitely causes various erroneous behaviors. If it is too bad, I typically observed divergence. 0.1 ms synchronization accuracy sounds reasonable to me. I am glad things look better now!