ethz-asl / okvis

OKVIS: Open Keyframe-based Visual-Inertial SLAM.

Tracking failure with mono visual-inertial data captured from low-cost sensors #12

Open agnivsen opened 8 years ago

agnivsen commented 8 years ago

Trying to run OKVIS on a dataset collected from a hand-held mobile device. Tracking fails and the pose estimate drifts very rapidly.

The camera-IMU calibration was done with the Kalibr toolkit (the camera reprojection error and the squared gyroscope and accelerometer errors all converged to values below 2.5). The IMU bias and random-walk parameters were derived from Allan variance plots.
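For reference, this is the kind of Allan variance computation we mean (a minimal non-overlapping sketch; the Swift code and names are illustrative, not taken from Kalibr):

```swift
// Illustrative sketch: non-overlapping Allan variance of one gyroscope or
// accelerometer axis at cluster time tau = m * dt. The noise density and
// bias random walk are read off the characteristic slopes of the
// resulting log-log AVAR(tau) curve.
func allanVariance(_ x: [Double], binSize m: Int) -> Double {
    let nBins = x.count / m
    precondition(nBins >= 2, "need at least two bins at this cluster time")
    // Average the signal over consecutive bins of m samples each.
    let means = (0..<nBins).map { k in
        x[(k * m)..<((k + 1) * m)].reduce(0, +) / Double(m)
    }
    // AVAR(tau) = 1/2 * mean squared difference of consecutive bin means.
    let diffs = zip(means.dropFirst(), means).map { $0 - $1 }
    return diffs.reduce(0) { $0 + $1 * $1 } / (2.0 * Double(diffs.count))
}
```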

However, when we run OKVIS on this dataset, tracking almost always fails. The error message reads:

W0531 15:28:03.059871 1646 Frontend.cpp:192] Tracking failure. Number of 3d2d-matches: 0.

The published state (pose) drifts off to very large values. The camera runs at 30 fps and the IMU at 100 Hz.

Please find attached the config.yaml file: config_fpga_p2_euroc_iPhone.yaml.txt

To our understanding, most mobile devices come with hideous IMUs. However, I would like to know whether the tracking failure can be attributed solely to the poor quality of the IMU, or whether there is something else we are missing here. (The lack of a global shutter should not be much of a concern: the camera motion is quite slow and the data was captured in a well-lit environment.)

v0n0 commented 8 years ago

@agnivsen Curious: are you using the latest iPhone? How did you use Kalibr with the device? Did you just record the dataset on the device for calibration/testing, or did you run the whole thing on it?

I am not the author, but it seems that the PnP algorithm is failing to find any correspondences. I would first check that initialization is not failing.

agnivsen commented 8 years ago

1. Using an iPhone 6. Also tried this on an iPad Air 2, with the same results.
2. Yes, recorded the data and ran Kalibr offline.
3. It does not look like initialization is failing. It will typically run for the first 4-5 seconds without complaining or drifting; after that, it suddenly starts drifting and throwing these errors.

v0n0 commented 8 years ago

I suggest first trying one of the datasets here: http://projects.asl.ethz.ch/datasets/doku.php?id=kmavvisualinertialdatasets

agnivsen commented 8 years ago

Tried OKVIS on the ETHZ MAV datasets, of course; they work fine. Sorry for neglecting to mention that in the OP.

v0n0 commented 8 years ago

What code do you use to run OKVIS? Did you fix exposure and focus on the iPhone camera? How accurate are your timestamps?

agnivsen commented 8 years ago

We are using the OKVIS source code from the master branch. What code do you want to know about?

Exposure and focus are fixed, yes. So is white balance.

The accelerometer and gyroscope data are logged with microsecond precision. When we tried this on Android, we managed to log IMU data with nanosecond-precise timestamps. The data is stored in a large buffer and written to file at the end of the data-collection routine.

The images, however, come from a video file. Only the start time of this video is known with micro/nanosecond precision; timestamps for the remaining frames are extrapolated from the nominal frame rate.
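For concreteness, this is the kind of timestamp derivation we mean (a hypothetical sketch with illustrative constants, not our actual code):

```swift
// Hypothetical sketch: deriving frame timestamps from the video start time
// and a nominal frame rate. Any mismatch between the nominal and the true
// frame interval accumulates linearly with the frame index.
let videoStartSeconds = 0.0   // actually known to ~1 microsecond from the recorder
let nominalFps = 30.0

func derivedTimestamp(frameIndex: Int) -> Double {
    videoStartSeconds + Double(frameIndex) / nominalFps
}

// If the camera really runs at 29.97 fps, the derived timestamps are already
// ~10 ms off after 10 s: 300 frames * (1/29.97 - 1/30) s ≈ 0.010 s.
```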

v0n0 commented 8 years ago

Are you using the example app? Can you post the full log on a gist?

liyinnb commented 7 years ago

I have not verified this, but could it be that the IMU readings and camera frames from mobile devices are not time-synchronized? You may need to interpolate the IMU readings in time to match the camera frame times, as in the sketch below.
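Something like this (a minimal sketch, assuming simple linear interpolation between the two IMU samples bracketing each frame; the types and names are illustrative):

```swift
import simd

struct ImuSample {
    let t: Double               // timestamp in seconds
    let gyro: SIMD3<Double>     // angular rate, rad/s
    let accel: SIMD3<Double>    // specific force, m/s^2
}

// Linearly interpolate two bracketing IMU samples a (a.t <= tc) and
// b (b.t >= tc) to the camera frame timestamp tc.
func interpolate(_ a: ImuSample, _ b: ImuSample, at tc: Double) -> ImuSample {
    let w = (tc - a.t) / (b.t - a.t)
    return ImuSample(t: tc,
                     gyro: a.gyro + (b.gyro - a.gyro) * w,
                     accel: a.accel + (b.accel - a.accel) * w)
}
```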

agnivsen commented 7 years ago

We tried interpolation, down to millisecond-level precision. It extends the time before tracking is lost, but only by a small margin: without interpolation, most of the datasets lose tracking after roughly 4-5 seconds; with interpolation, after 6-7 seconds. That does suggest that better interpolation or time synchronization would lead to better results.

But here's the catch: we are on a mobile device, so we can either

- dump every frame as an image, which incurs additional I/O overhead and makes the system lag;
- keep the images in a memory buffer, which causes severe memory problems on most devices; or
- store the frames as a video, in which case we get the start time of the video with microsecond precision but nothing thereafter: every subsequent frame gets a timestamp derived from the nominal frame rate, which is never very accurate after a couple of seconds.

Nevertheless, we tried all three approaches, and none of them works robustly.

higerra commented 7 years ago

Hi @agnivsen, I'm also working on OKVIS on an iPhone (6s). I developed a custom camera app to collect data: I process each video frame individually and tag it with a precise timestamp in nanoseconds (technically, I use AVCaptureVideoDataOutput instead of AVCaptureMovieFileOutput; frames are then written to storage by AVAssetWriter), roughly as in the sketch below.
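A minimal sketch of the timestamp extraction (session setup and the AVAssetWriter pipeline are omitted; the class name is illustrative):

```swift
import AVFoundation

// Minimal sketch: per-frame timestamps from AVCaptureVideoDataOutput.
// Only the timestamp extraction in the delegate callback is shown.
final class FrameTagger: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        // Presentation timestamp of this frame on the device clock,
        // converted to integer nanoseconds.
        let pts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)
        let nanoseconds = Int64(CMTimeGetSeconds(pts) * 1e9)
        // ... log `nanoseconds` and hand the frame to AVAssetWriter ...
        _ = nanoseconds
    }
}
```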

But I haven't made it work either; the tracking makes no sense. To me the problem seems to be gravity.

What data did you record from the IMU: the raw sensor readings, or processed readings with gravity subtracted? I have checked the source code, and it seems the algorithm subtracts gravity internally (there is also a field for gravity in the configuration file), but the IMU data does not contain the real-time gravity direction. I'm curious how the algorithm knows the gravity direction w.r.t. the sensor. Does it assume the device always starts strictly horizontal?
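For illustration, this is the kind of bootstrap I would expect (a hedged sketch of one common approach, averaging the accelerometer while the device is roughly static; I do not know whether OKVIS actually does this):

```swift
import simd

// Hedged sketch: if the device is (approximately) static at startup, the
// mean raw accelerometer reading measures the specific force opposing
// gravity, so the gravity direction in the sensor frame is its negation.
// This assumes raw readings, i.e. gravity has NOT been subtracted.
func initialGravityDirection(_ accelSamples: [SIMD3<Double>]) -> SIMD3<Double> {
    let sum = accelSamples.reduce(SIMD3<Double>(repeating: 0), +)
    let mean = sum / Double(accelSamples.count)
    return -simd_normalize(mean)   // unit vector of gravity in the sensor frame
}
```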

agnivsen commented 7 years ago

Which field of the configuration file indicates the direction of gravity? (Sorry if I missed it.)

If there is an arbitrary subtraction of gravity internally, that is worrying; it could very well be why it hasn't been working for many of us. Can you please point to the section of the code where this happens?

We have previously tried both processed and raw IMU data; neither worked.