OSVR / OSVR-Core

The core libraries, applications, and plugins of the OSVR software platform.
Apache License 2.0

IR Camera Tracking Drift/Jumps #364

Open marquistan opened 8 years ago

marquistan commented 8 years ago

I found the following issues in positional tracking; addressing them could make the experience better.

When the HMD turns so that the camera's view moves from the front LEDs to the side LEDs, and vice versa, there is a small jump in the tracked position, as if the tracker resets. I assume the algorithm can be tweaked to make this smoother - at the moment this seems to be the major cause of the positional tracking jumps.

Second is the IR camera tracking drift. It happens when the HMD moves out of the camera's range: I think the algorithm extrapolates the movement via velocity, which makes the HMD keep moving. It also happens when you look left really quickly (with respect to the camera) and enter the area where the camera doesn't see the LEDs.

rpavlik commented 8 years ago

Thanks for the feedback!

I'm cc'ing some other folks who've touched tracking that might be interested in my response below, which turned out to be longer and more of a write-up than strictly a response to an issue report - perhaps it is a whitepaper or blog post in the making. @VRguy @russell-taylor

Some info that might be helpful:

The camera does not reset at any point in the algorithm. The closest the tracking system gets to "resetting" is when it reverts to RANSAC from its normal SCAAT Kalman operating mode due to a number of possible reasons (losing sight of all beacons for some time, noticing that it's been making really bad predictions for half a second straight, excluding all visible beacons as "implausible" for an extensive period of time, etc.) - indicated by a message about "losing fix".

You shouldn't be seeing such a message when going from front to side LEDs - if you are, then your unit is really, really out of spec, or something else is different between your units and my unit that we probably need to discuss through other channels.

There also isn't a specific "switching" from front to side LEDs explicitly handled - when in stable tracking, each recognized beacon that meets the criteria (emission direction sufficiently toward the camera, etc.) is filtered in to correct the state prediction individually. So, ideally there are several frames where both front and side beacons meet the criteria.

Due to manufacturing variances, beacon light transmission through the faceplate, presumably the fact that the faceplate and body of the HMD are not permanently fixed together (hackability is good!), and other unknown issues (the side LEDs have always seemed to be a bit of a thorn in my side, which would be a logical place :stuck_out_tongue_winking_eye:), the side LEDs particularly benefit from pre-calibration beyond the typical autocalibration that might happen during gameplay. Make sure you've followed the steps at https://github.com/OSVR/OSVR-Docs/blob/master/Getting-Started/HDK/Video-Based-Tracking-Calibration.md for beacon pre-calibration, especially being sure to watch the embedded video. With calibration, style is almost everything: you want smooth moves that keep lots of beacons recognized and contributing to the pose as you bring beacons in and out of sight, so the algorithm can (implicitly) register their locations with respect to each other.

Any specific algorithmic suggestions you might have would be appreciated - comments in the code have citations to the relevant academic literature to help you get your footing :wink: And, seriously, if you do have any background in tracking/filtering algorithms, or know somebody who does, do feel free to make the connections, since this is a real open source community. (I'm sure there are improvements that can be made, especially to the heuristics for choosing which beacons are eligible to use, which beacons are used but receive penalties (in the form of variance multipliers), and the nature of those penalties. Those are definitely going "off-book" in terms of moving from the academic literature/theory to a production implementation.)

Re: so-called "camera tracking drift" - rest assured, the camera is not drifting. I prefer to refer to this phenomenon as "coasting", and reserve "drift" for things like "yaw drift" (in IMUs/gyroscopes) where direct or reasonably direct measurements that are supposed to be constant end up having a change as a function of time. What you're seeing is expected (in some sense) behavior, but let me explain. When in normal SCAAT Kalman operating mode, the process model ("what the tracking algorithm assumes your movement will behave like") is a constant-velocity model with exponential damping (that assumes white noise/random walk for your random accelerations, to speak imprecisely). Constant velocity models (even without damping, the presence of which reduces coasting) are pretty typical for this type of tracking, and something like them is required to make the whole system actually work.

Elaboration: Why are they required? Well...

  • You have to be able to predict where the tracked device will be,
  • in order to predict what the beacon measurement will be,
  • to compare (delta/measurement innovation) to what the beacon measurement actually was,
  • and to be able to propagate that delta or innovation, along with the measurement's covariance, back from measurement space,
  • through a bunch of linear algebra (including Jacobian matrices, since we're modeling a non-linear system - hooray for 3d rotation being nonlinear) that I'll spare you the details of, into a state-space correction and an update to the state error-covariance matrix,
  • finally giving us a new state (pose) estimate complete with an updated estimate of its uncertainty (as well as the updated velocity estimates required both to do it again and to perform predictive tracking outside of the main filter). :tada:

And that, my friends, is a condensed version of how an extended Kalman filter (EKF), and specifically the SCAAT Kalman filter in our tracker (excluding the part that does beacon autocalibration, which augments the state vector with beacon position as well), works, in a single exceedingly-long sentence. :grin:
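To make that cycle concrete, here's a toy one-dimensional sketch of the predict/innovate/correct loop. Everything here is deliberately simplified for illustration (scalar state, diagonal variances, position-only correction, no Jacobians); it is *not* the actual OSVR SCAAT implementation, and all the names are made up:

```cpp
#include <cassert>
#include <cmath>

// Toy 1-D constant-velocity Kalman filter illustrating the
// predict -> innovate -> correct cycle described above.
// Didactic sketch only: the real filter tracks full 6-DOF pose
// (plus beacon positions when autocalibrating) and needs Jacobians
// for the nonlinear rotation parts.
struct ToyFilter {
    double pos = 0.0, vel = 0.0;   // state estimate
    double pVar = 1.0, vVar = 1.0; // simplified (diagonal) state variances
    double processNoise = 0.01;    // random-acceleration ("random walk") noise
    double decay = 0.9;            // per-second exponential velocity damping

    // Process model: constant velocity with exponential damping.
    void predict(double dt) {
        pos += vel * dt;
        vel *= std::pow(decay, dt);
        // Uncertainty grows while we only predict (cross-terms omitted).
        pVar += vVar * dt * dt + processNoise * dt;
        vVar += processNoise * dt;
    }

    // Measurement update: compare the prediction to what we actually
    // measured, and propagate that innovation back into the state.
    void correct(double measuredPos, double measVar) {
        double innovation = measuredPos - pos; // the "delta"
        double gain = pVar / (pVar + measVar); // scalar Kalman gain
        pos += gain * innovation;              // state correction
        pVar *= (1.0 - gain);                  // uncertainty shrinks
    }
};
```

After a `predict()`, a `correct()` pulls the estimate partway toward the measurement, weighted by the relative uncertainties, and leaves the filter with an updated velocity estimate to predict with next frame.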

So yes, it does extrapolate the movement via velocity. There is an exponential decay/damping factor applied so you don't go flying forever: the default is 0.9 for both linear and angular velocity, which means that the velocity after $t$ seconds is $0.9^t$ times what it was at the start (so 0.9 times after one second, 0.9^2 after two, etc.). We did initially have these much lower (as low as 0.1 or 0.01, iirc), but that negatively impacted active tracking, since it basically turned it into a constant-position model: assuming your head was still.

You can easily tweak these from the config file (and without any academic background in tracking or filters - these were just guesses between 0 and 1 that seemed to work well): if you find a better default, please let us know and we can incorporate it. (There's a lot of parameter tuning in these filters, and the parameters interact, which is part of what makes them a challenge.) The two fields you'll want to edit are linearVelocityDecayCoefficient and angularVelocityDecayCoefficient; their descriptions are at https://github.com/OSVR/OSVR-Core/blob/master/plugins/videobasedtracker/Types.h#L117. Note that you probably do want to keep the linear and angular ones equal (or nearly equal) to each other, since they interact, but there's no harm in experimenting. I'd just try each new attempt with Tracker Viewer before a game in case unexpected instability occurs. Here's a sample chunk of the config file; hopefully you can see where it fits into an existing config file:

        "params": {
            "showDebug": false,
            "includeRearPanel": true,
            "headCircumference": 55.75,
            "linearVelocityDecayCoefficient": 0.9,
            "angularVelocityDecayCoefficient": 0.9,
            "calibrationFile": "videotrackerCombinedCalibrationFile13.json"
        }
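To make the decay behavior concrete, here's a minimal sketch of how a coefficient of 0.9 scales a coasting velocity over time. The helper function is purely illustrative, not an OSVR-Core API:

```cpp
#include <cmath>

// Illustration of the exponential velocity damping described above:
// with linearVelocityDecayCoefficient = 0.9, a coasting velocity is
// scaled by 0.9^t after t seconds without measurements.
// (Hypothetical helper, not a function from OSVR-Core.)
double coastedSpeed(double initialSpeed, double decayCoefficient,
                    double seconds) {
    return initialSpeed * std::pow(decayCoefficient, seconds);
}
```

So a 1 m/s coast is down to 0.9 m/s after one second and 0.81 m/s after two - a gentle brake, not a hard stop, which is why the separate "lost fix" trigger below still matters.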

You'll note that you do stop coasting before the exponential decay would bring your velocity to zero - that's because one of the aforementioned "reset to RANSAC" triggers kicks in after some time of not seeing any beacons and declares that the Kalman algorithm is no longer to be trusted to generate pose estimates. Figuring out an improved algorithm/heuristic for how/when to increase velocity decay and/or stop prediction altogether is an open question, and probably deserves its own issue. With a unified IMU and video-based single-filter implementation as a probable future direction, there will be a single process model incorporating both the LED beacons and the IMU data, so we won't necessarily want to apply something to the entire model when just one source of data has gone absent.

And, of course, as soon as the camera sees a beacon again, it should stop coasting: either the Kalman filter is sufficiently confident in its ability to regain stable tracking that it goes right back into steady-state, or the system was waiting for a beacon to appear before resetting to RANSAC to acquire a new pose estimate entirely from scratch. You should only coast if the camera can't see any (enabled) beacons. (See below for more on that little "enabled" caveat.)
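That "stop trusting the filter" trigger could be sketched roughly like this; the struct, names, and the particular threshold are hypothetical stand-ins for illustration, not OSVR-Core's actual heuristics:

```cpp
#include <cassert>

// Illustrative sketch of a "revert to RANSAC" trigger as described
// above: after some period with no usable beacon sightings, stop
// trusting the Kalman pose (i.e., stop coasting) and re-acquire the
// pose from scratch. Names and threshold are hypothetical.
struct FixMonitor {
    double secondsWithoutBeacons = 0.0;
    double maxCoastSeconds = 0.5; // illustrative threshold, not the real value

    // Called once per camera frame with the frame interval dt.
    bool shouldResetToRansac(bool sawAnyBeacon, double dt) {
        if (sawAnyBeacon) {
            secondsWithoutBeacons = 0.0; // fresh data: keep filtering
            return false;
        }
        secondsWithoutBeacons += dt;
        return secondsWithoutBeacons > maxCoastSeconds;
    }
};
```

A real implementation would fold in the other triggers mentioned above (persistent bad predictions, all beacons rejected as implausible), but the shape is the same: a timer that a sighting resets.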

Beyond tweaking the decay parameters or some potential algorithmic contributions :wink: there are some improvements recently merged to master that will improve the behavior and maintain positional tracking longer when looking left. We had been running with the right-side LEDs disabled for pose-estimation purposes up until now, because they were too flawed and using them introduced substantial tracking artifacts. (For details, ask @VRguy on a video call, and be prepared with a screen capture tool - they threatened to turn my demonstration of its behavior over Skype into a GIF.) However, improvements in the math and tracking code minimized/changed the nature of the artifacts, allowing us to turn those back on, so turning your head left won't cause you to "coast" as easily (since you'll still have active beacons in sight). It may also improve positional tracking smoothness when turning the other direction - I haven't compared the new code to the pre-merge code head-to-head recently. Though these updates have been merged, problems with the CI server have prevented an updated snapshot from coming out of the pipeline. A version number at or above roughly 0.6-1056 will include the updates, so look for that.

jcp219 commented 8 years ago

I'm on Snapshot-v0.6-1068 with an HDK 1.3 and I've noticed behavior similar to what marquistan reported. I've tested with and without pre-calibration using the pre-calibration utility. If I enable debug mode in the server config JSON, I can see the tracking drop in the console whenever I switch from the front LED array to either the sides or back. It may also be distance-related: I've found that if the HMD is more than 0.6-0.9 m away from the IR camera, the side and back LED beacons are not detected at all, but the front ones still are. If you get closer than 0.3 m, you also lose the side and back LEDs. That leaves a narrow range of functional positional tracking without constant drops.

I see the changes made to the video positioning code a few days back, but the pre-calibration utility doesn't appear to have been updated: it still ignores one side's LEDs and the rear LEDs, perhaps because they are still in the skip list:

    /// these are beacon IDs we disabled (1-based)
    static const auto IDS_TO_SKIP = std::unordered_set<std::size_t>{
        1,  2,  10, 12, 13,
        14, 21, 35, 38, /* plus the rest of the back */ 36,
        37, 39, 40};
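(For what it's worth, that skip list is just an `unordered_set` membership test; a minimal usage sketch, with a shortened, illustrative set rather than the full list above:)

```cpp
#include <cstddef>
#include <unordered_set>

// Usage sketch of a skip list like the one quoted above: beacon IDs
// (1-based) present in the set are excluded. The set contents here are
// a shortened illustration, not the full list from the utility.
static const std::unordered_set<std::size_t> idsToSkip{1, 2, 10, 12, 13};

bool isBeaconSkipped(std::size_t oneBasedId) {
    return idsToSkip.count(oneBasedId) != 0;
}
```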

Will the pre-calibration utility be updated as well to enable the side/back LEDs during calibration? Not sure if that would help at all with this issue, but figured I'd mention it.

In the meantime I'll try toying with the decay coefficients to see if I can get some better results with those.

marquistan commented 8 years ago

@jcp219 thanks for the find! I've been too busy with other things to test this out. Is it possible to actually use the LEDs on the side again? If using all of them is too much (I recall this was an issue), can we use two per side instead of three? See how that helps.

rpavlik commented 8 years ago

@marquistan In the recent update, as I noted earlier, we did turn on all the side LEDs again.

@jcp219 Hmm, I'm surprised you're getting such short range. The decay coefficients wouldn't affect that - that would be a lighting or image-processing issue. I had been meaning to update that list in the pre-calibration utility, though I believe those beacons are just skipped from display, not actually skipped from calibration (the utility is just running the tracker code with a different display on it; I don't believe it's actually turning off calibration of those LEDs - doing that requires work at a different level - though I'll check it out this morning when I fix the display issue).

You don't actually want to pre-calibrate the rear LEDs because they're not rigidly fixed to the front of the HMD: their location relative to the other LEDs can change substantially every run, so pre-calibrating them would actually make tracking worse (by moving them into a less-neutral starting location), which is why they're not shown on the utility, to try to keep you from doing that. :)

marquistan commented 8 years ago

@rpavlik ah, I was basing that on the calibration utility, which as you clarified hasn't been updated to display the LEDs on the other side. Thanks for the clarification!