pushrax / OpenVR-SpaceCalibrator

Use tracked VR devices from one company with any other.
MIT License

Perceivable "lag" when using controllers #6

Open deinlandel opened 5 years ago

deinlandel commented 5 years ago

There's a slight perceivable "lag" when using Vive Wands with a WMR headset. It's almost unnoticeable in usual situations, but very noticeable in games requiring fast and precise controller movement, for example Beat Saber (constant note "misses"). It looks like there's some kind of position smoothing happening when offsets are applied to controllers.

NoahWL commented 5 years ago

I have this too, same situation: Vive wands with WMR (Odyssey+), except mine's very noticeable. My wands have a consistent judder to them, as if interleaved reprojection were turned on, but nothing is reported as being reprojected. This occurs even when frame times are under 2 ms. I've tried a lot to fix this, including completely different rooms (so running through each device's room setup), different PCs with CPUs and GPUs from different vendors, a fresh install of Windows with only SteamVR and SpaceCalibrator installed, rolling back controller firmware, installing old WMR drivers, etc.

My theory is WMR screws with something and causes some... things?... to run at ~45FPS. I believe this because in Mixed Reality home, using WMR controllers, the blue teleport laser pointer beam can be observed jumping around just like the Vive wands do. But the controllers themselves are fine; it's just the laser pointers jumping around. Also, in SteamVR with the headset on, if I move a window via the SteamVR overlay's desktop view with a Vive wand, WMR controller, or even my mouse, the window "skips" around. It is as if the desktop is also being rendered at only 45FPS. The window skipping looks incredibly similar to the Vive wand and laser pointer skipping/judder.

Interestingly enough, in Pavlov VR this judder is unnoticeable. To me, it looks like Pavlov applies some sort of smoothing to the controllers, at the cost of increased latency. I know for a fact it has a latency penalty of some sort because on the Vive (my controllers do not judder at all using the Vive HMD) I always noticed a slight amount of input lag only in Pavlov VR.

If anyone has any idea on what steps could be taken to remedy this problem I'd be willing to try anything to help get it fixed. If any more information is needed, like logs or something, just let me know and I can provide them.

pushrax commented 5 years ago

This definitely seems to be a systemic issue. I'm planning to get the HP Reverb and should be able to start working on a fix then. It's possible there's an issue with motion prediction.

UeflaVR commented 5 years ago

Same problem here. The lag in Beat Saber is hugely noticeable, especially in fast maps. Hope we can find a solution for this.

Lomkex1 commented 4 years ago

Any update on this?

monstermac77 commented 4 years ago

Another +1 from me. Reverb G2 is coming out and a lot of people are looking to pair it with the Index controllers for the ultimate setup. https://www.youtube.com/watch?v=r_SepHooREo&feature=youtu.be

pushrax commented 4 years ago

The G2 does not appear to have this issue with the latest WinMR drivers. Beat Saber is working great.

In the previously linked YouTube video, the author also seemed to think the tracking is working correctly on their review unit.

BlueCyro commented 4 years ago

@pushrax Apologies for the ping, but are you saying that the drift between playspaces - at least on WMR - is no longer an issue for the most part?

pushrax commented 4 years ago

@RileyGuy the input latency / judder issue is what seems to be fixed somewhere in the WinMR stack.

Can you elaborate on the issue you're asking about? Drift over a long period of time? Or something like #16?

BlueCyro commented 4 years ago

@pushrax I have a friend who used to use this software and would experience drift over a couple of hours until the Lighthouse playspace was offset by about 6-12 cm. This was combining a Quest with Vive pucks. I'm curious whether the same would happen on WMR and if it's fixable without having to recalibrate after an hour or three. I saw MRTV do this with the new Reverb to show that you can pair Index controllers with it, but was surprised to see that nobody mentioned this drift issue.

pushrax commented 4 years ago

That particular large drift issue seems to be Quest-specific. The biggest problems seem to happen when using wireless streaming. https://github.com/pushrax/OpenVR-SpaceCalibrator/issues/15

ghost commented 4 years ago

I use a Rift S with Vive pucks, as do a couple of my friends. The software does work very well, but all of us experience an issue where the SteamVR playspace "drifts" in relation to the Oculus playspace after playing for many hours (usually after removing and replacing the headset multiple times). The drift is usually only an inch or two but becomes noticeable when using something like a hip tracker in VRChat, as your hips are no longer centered in relation to your head. It might be fine facing one direction, but then you turn 90 degrees and your hips end up a couple of inches to the left relative to your head. Turn 180 from that and then they're offset to the right. I check this right after I first calibrate and it's fine then, so it's something that seems to crop up over time. Playspace drift, at least in that amount, doesn't seem to be isolated only to the Quest.

monstermac77 commented 4 years ago

OpenVR Space Calibrator got some love at the Microsoft Virtual Reality developer event with regard to the HP Reverb G2 with the Valve Index knuckle controllers: https://youtu.be/0Ik6fItkWEI?t=17061

They talked a bit about this issue specifically, though not in much detail; they just mentioned that drift was a possibility. It was great to see that the developers of these devices are very aware of and interested in the success of software like this.

pushrax commented 4 years ago

Thanks for the link and time stamp @monstermac77, he makes a good point in that video that any SLAM system does have a bit of drift over time. The Rift CV1 doesn’t use SLAM and with a good setup can be stable for long periods of time. It would make sense that the Rift S does have some drift over time as it does use SLAM.

I will note that a developer friend of mine played Beat Saber with the G2 for a few hours and the drift definitely wasn't enough to be noticeable, or they would have told me (I asked). It may also be a YMMV thing, with differing drift rates depending on how much you're moving around the space. You stay relatively stationary in Beat Saber.

It’s off topic for this issue, but we’re having a good discussion on drift here. There are three main ways to approach this that I can see.

  1. Make recalibration trivially easy. I think it's already quite easy and can be done without taking off the HMD, but you do have to pull out your original controllers. It would be possible to make a quick-recalibration feature that works by having the user record a reference pose (for the hip tracking case, just stand up straight; for the hand tracking case, maybe a T pose) and then, to recalibrate, asking the user to enter the same pose. I'm not sure how precise this would be; most likely it will take some practice to use it effectively. (A rough sketch of this idea follows at the end of this comment.)

  2. Use an extra Vive tracker mounted to the HMD or somewhere else on the head to provide a constant reference. I don’t like this solution but it would definitely work.

  3. Use a computer vision algorithm on the HMD camera feed to locate other devices. It wouldn’t need to be extremely robust since it’s just needed occasionally to correct drift. This would be pretty neat but the most work to maintain.

If anyone has other ideas I’d love to hear them.
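
To make point 1 a bit more concrete, here's a very rough sketch of what the quick-recalibration step could look like, assuming the drift is purely translational. None of these types or functions exist in the codebase; it's just an illustration.

```cpp
// Hypothetical sketch of the "reference pose" quick-recalibration idea from
// point 1: record the tracked device's position while the user holds a known
// pose, then later measure how far that position has drifted and shift the
// calibration offset by the difference. Assumes the drift is purely
// translational; none of these names come from the actual codebase.
#include <cstdio>

struct Vec3 {
    double x, y, z;
};

static Vec3 Sub(const Vec3 &a, const Vec3 &b) {
    return {a.x - b.x, a.y - b.y, a.z - b.z};
}

struct Calibration {
    Vec3 translation;  // translation applied to the secondary tracking space
};

// Stored when the user first records the reference pose (e.g. standing up
// straight with a hip tracker on).
struct ReferencePose {
    Vec3 devicePosition;  // device position in the calibrated space
};

// Later, when the user re-enters the same pose, measure the apparent drift
// and fold it back into the calibration translation.
void QuickRecalibrate(Calibration &calib, const ReferencePose &ref,
                      const Vec3 &currentDevicePosition) {
    Vec3 drift = Sub(currentDevicePosition, ref.devicePosition);
    calib.translation = Sub(calib.translation, drift);
}

int main() {
    Calibration calib{{0.0, 0.0, 0.0}};
    ReferencePose ref{{0.10, 1.00, -0.25}};  // recorded at calibration time
    Vec3 now{0.14, 1.00, -0.22};             // same pose, hours later

    QuickRecalibrate(calib, ref, now);
    std::printf("correction: %.2f %.2f %.2f\n",
                calib.translation.x, calib.translation.y, calib.translation.z);
    return 0;
}
```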

monstermac77 commented 4 years ago

@pushrax no problem. Curious if we could append something about "drift" to this issue, but it may make sense to open a new one and port our comments over, because this is definitely off-topic.

Very unsure whether this would help by providing a stationary frame of reference, but could a webcam be used in any way? I'd imagine most webcams on desktop PCs are stationary, or could be made to be if end users know that it's being used to prevent drift. I also imagine the computer vision object recognition for a person with a giant thing strapped on their forehead would be easier than recognizing objects using a low resolution camera that has 6 degrees of freedom (in your point 3.). That said, I would think depth determination would be difficult using a webcam, so perhaps this would only allow for calibration on two axes, but I think your point 1. may suffer from the same issue if I'm not mistaken. You'd really need to have a spot marked on the floor that you can stand on in the exact same way each time to get everything lined up.

One thing I've found helpful when lining up two virtual spaces is finding a reference point that is easily reproducible, like a corner of the room. When I want to sync multiple spaces, I go into a corner, stand up straight, and put my back flat against one of the walls. This is easily reproducible and would potentially offer three-axis calibration? You'd just need the end user to do this once with their native controllers (e.g. WMR) with their hands resting right next to their hips, and then each time a recalibration is needed they would go to that same corner and rest their hands in the same place with the non-native controllers (e.g. Index knuckles).

pushrax commented 4 years ago

That last point is a good one. Even just setting the controller down at a particular spot on the floor would be a decent option, assuming most of the drift is translational and not rotational (I’m not sure if that holds or not). Need some data there.

What I was thinking for the HMD camera algorithm would involve trying to cheat as much as possible by making assumptions about the nature of the drift, and the positioning of the camera. I’d rather pursue a simple and easily maintainable solution though.
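
As a sketch of how that data could be gathered, something like the following would log how much of the error at a marked reference spot is translational versus rotational. Hypothetical names only, not code from this repository.

```cpp
// Return a controller to a marked reference spot, compare the recorded pose
// against the current one, and log the translational error separately from
// the rotational error.
#include <cmath>
#include <cstdio>

struct Vec3 { double x, y, z; };
struct Quat { double w, x, y, z; };  // unit quaternion

static double PositionErrorMeters(const Vec3 &ref, const Vec3 &cur) {
    const double dx = cur.x - ref.x, dy = cur.y - ref.y, dz = cur.z - ref.z;
    return std::sqrt(dx * dx + dy * dy + dz * dz);
}

static double RotationErrorRadians(const Quat &ref, const Quat &cur) {
    // Angle of the relative rotation between the two orientations.
    double dot = ref.w * cur.w + ref.x * cur.x + ref.y * cur.y + ref.z * cur.z;
    dot = std::fabs(dot);      // q and -q represent the same rotation
    if (dot > 1.0) dot = 1.0;  // guard against rounding error
    return 2.0 * std::acos(dot);
}

int main() {
    const double kRadToDeg = 180.0 / 3.14159265358979323846;

    // Reference pose recorded right after calibration (made-up numbers).
    Vec3 refPos{0.20, 0.05, -0.40};
    Quat refRot{1.0, 0.0, 0.0, 0.0};

    // Pose reported hours later, with the controller back on the same mark.
    Vec3 curPos{0.23, 0.05, -0.38};
    Quat curRot{0.9998, 0.0, 0.0175, 0.0};  // roughly 2 degrees of yaw drift

    std::printf("translation drift: %.3f m, rotation drift: %.2f deg\n",
                PositionErrorMeters(refPos, curPos),
                RotationErrorRadians(refRot, curRot) * kRadToDeg);
    return 0;
}
```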

monstermac77 commented 4 years ago

> Even just setting the controller down at a particular spot on the floor would be a decent option, assuming most of the drift is translational and not rotational (I'm not sure if that holds or not). Need some data there.

Interesting, I didn't think about rotational drift. I do think that the "corner of the room" approach may be the way to go. Assuming arm length/height doesn't vary day to day (a very safe assumption), you get the y-axis. Assuming the user's shoulder width doesn't vary much (also a very safe assumption), you get the x-axis and z-axis because they're pressed up against a corner of the room. I think it's pretty easy to line one's back up against a wall and rest one's hands naturally by one's side (rotation/orientation vector). I think this also has an added benefit over putting the controller at a fixed location: you don't have to exit the VR experience to do this form of calibration, and you don't even have to take the controllers out of your hands (not the most trivial thing to do with the Index controllers). Some users may want to take the headset off and position the controller exactly in the right position with the right orientation, which would be difficult to do via passthrough mode (at least until the cameras on these HMDs improve), further breaking immersion and disrupting the VR experience.

That said, if you only need one controller to calibrate sufficiently, not two, I think putting one controller in a corner of the room and asking the user to orient it in a certain way each time may work well too (especially if people are too fidgety when standing in a corner, resulting in some noise). Hell, you could even have them put each controller in opposite corners of the room to get a bit more data.

Either way, I think calibration feels a bit easier when you don't have to keep two sets of controllers lying around all the time for when you need to do it. I would be interested to know if some offset between the two virtual spaces already exists naturally, because you simply can't colocate two controllers in the same hand with the same orientation etc. when doing the original calibration as it currently exists. Perhaps this is something that could also be fixed by the "standing in a corner" or "placing your controller down at the same fixed point" calibration method.

pushrax commented 3 years ago

@monstermac77

> interested to know if some offset between the two virtual spaces already exists naturally

Can you clarify this paragraph? I'm not quite following.

monstermac77 commented 3 years ago

> @monstermac77
>
> interested to know if some offset between the two virtual spaces already exists naturally
>
> Can you clarify this paragraph? I'm not quite following.

Sure. Although, to preface this, I could just be misunderstanding the nature of the calibration that OpenVR Space Calibrator does, in which case the answer may be that there is zero offset.

I think what I meant here was that because two physical objects obviously can't exist in the same place, you have to have, say, your WMR controller slightly to the right of your Knuckles controller when you hold both in your hand during calibration (or slightly above, to the left, etc.). Therefore, when OpenVR Space Calibrator is taking all of those samples, there's going to be an inherent offset between the two controllers, because you couldn't get them to be perfectly colocated during calibration. This would presumably mean an inherent offset between the two coordinate systems (WMR and Lighthouse), albeit on the order of an inch or so, even immediately after calibration.

The rest of that paragraph discusses how this tiny offset could be remediated with my "standing in a corner" method: using that method you'd be taking turns, first holding the WMR controller for sample 1 and then the Knuckles controller for sample 2, hopefully achieving a near-zero (or at least smaller) offset between the two coordinate systems. If it's important that more than a single point be gathered during sampling, you could have the user run their hand up and down the wall in an arc with their arm fully extended.

All that said, I think the benefit of the "standing in a corner" method for calibration really wouldn't be the elimination of any inherent offset (since nobody is complaining about this); its strength is really just that the user would only have to do this process once with their WMR controllers. Then they could put them away in a closet somewhere, and every time a recalibration was needed they would just go back to that corner of the room and rerun it with just their Knuckles controller in their hand.

pushrax commented 3 years ago

The relative pose between the two devices being calibrated is also calculated as part of the calibration process. The calibration algorithm is described in https://raw.githubusercontent.com/pushrax/OpenVR-SpaceCalibrator/master/math.pdf. This approach has proven to be robust when the tracking systems aren't drifting and have similar latency. (Latency correction is possible in theory too, as are many other improvements to the algorithm.)
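
For a rough feel of the general idea (fitting a rigid transform between two spaces from paired samples), here's a heavily simplified sketch. It is not the algorithm from math.pdf and none of these names come from the codebase; it only solves for a yaw rotation plus a horizontal translation, assuming both tracking spaces share the same gravity-aligned up axis.

```cpp
// Closed-form 2D least-squares (Procrustes) fit of a yaw rotation and
// horizontal translation mapping space B samples onto space A samples.
#include <cmath>
#include <cstdio>
#include <vector>

struct Vec2 { double x, z; };  // horizontal coordinates of one sample

struct YawTranslation {
    double yaw;   // radians; rotation applied to space B to align it with A
    Vec2 offset;  // translation applied after the rotation
};

YawTranslation FitYawTranslation(const std::vector<Vec2> &a,
                                 const std::vector<Vec2> &b) {
    const size_t n = a.size();
    Vec2 ca{0, 0}, cb{0, 0};
    for (size_t i = 0; i < n; ++i) {
        ca.x += a[i].x; ca.z += a[i].z;
        cb.x += b[i].x; cb.z += b[i].z;
    }
    ca.x /= n; ca.z /= n; cb.x /= n; cb.z /= n;

    // The optimal yaw is atan2 of the summed cross- and dot-products of the
    // centered sample pairs.
    double sinSum = 0, cosSum = 0;
    for (size_t i = 0; i < n; ++i) {
        const double ax = a[i].x - ca.x, az = a[i].z - ca.z;
        const double bx = b[i].x - cb.x, bz = b[i].z - cb.z;
        cosSum += ax * bx + az * bz;
        sinSum += az * bx - ax * bz;
    }
    YawTranslation fit;
    fit.yaw = std::atan2(sinSum, cosSum);

    // Translation maps the rotated centroid of B onto the centroid of A,
    // using R(yaw) = [[cos, -sin], [sin, cos]] in the (x, z) plane.
    const double c = std::cos(fit.yaw), s = std::sin(fit.yaw);
    fit.offset.x = ca.x - (c * cb.x - s * cb.z);
    fit.offset.z = ca.z - (s * cb.x + c * cb.z);
    return fit;
}

int main() {
    // Paired horizontal positions of the same motion as reported by the two
    // tracking systems (made-up numbers: B is A shifted by (0.5, 0.2)).
    std::vector<Vec2> a = {{0.0, 0.0}, {1.0, 0.0}, {1.0, 1.0}, {0.0, 1.0}};
    std::vector<Vec2> b = {{0.5, 0.2}, {1.5, 0.2}, {1.5, 1.2}, {0.5, 1.2}};

    YawTranslation fit = FitYawTranslation(a, b);
    std::printf("yaw = %.3f rad, offset = (%.2f, %.2f)\n",
                fit.yaw, fit.offset.x, fit.offset.z);
    return 0;
}
```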

monstermac77 commented 3 years ago

> The relative pose between the two devices being calibrated is also calculated as part of the calibration process. The calibration algorithm is described in https://raw.githubusercontent.com/pushrax/OpenVR-SpaceCalibrator/master/math.pdf. This approach has proven to be robust when the tracking systems aren't drifting and have similar latency. (Latency correction is possible in theory too, as are many other improvements to the algorithm.)

Wow, fantastic stuff. As someone who studied pure math, I'm only slightly allergic to applied linear algebra, but am forced to recognize it underpins much of my favorite hardware and software (like Space Calibrator) :)

Given this is accounted for, I concede that my calibration suggestion really doesn't offer much other than not needing to keep your WMR controllers around at all anymore. But honestly, since I'm going into week two of using Space Calibrator daily and haven't had to do a single recalibration, I'm not sure how important improving the recalibration process even is! It could be a different story for Oculus, though, which may induce more drift.

I think really the only thing I'd look for is #30, since it'd be nice to be able to ignore lighting conditions altogether, never worry about drift, and simply use a full Lighthouse setup. Independent of Space Calibrator, it is rather annoying (and nauseating) to put on my WMR headset before it has located itself in the room, causing my surroundings to move with me for sometimes several seconds. This wouldn't happen with #30 closed.