noorbeast opened this issue 7 years ago
It's on my to-do list and will be included in the next version.
Awesome news...instant hero status with the VR motion sim community!
I am currently implementing that feature. While doing so, some questions came up. I suppose motion cancellation should affect both rotation and translation? What is the supposed origin I should use to apply any corrective rotation? How is this usually supposed to work with motion platforms?
I am excited to hear that and very much look forward to trying your work out.
SimTools presumes a default for axis/force settings: https://www.xsimulator.net/community/faq/which-way-to-set-simtool-axis-movements.230/
How those are used for actual motion control depends on the simulator design.
A simple 2DOF has a universal joint under the seat which allows movement for roll and pitch, but that is also used for surge, sway and heave by manipulating the settings. Traction loss is a popular addition which allows sideways movement at the rear of 100-200mm each way from center. Actual surge is a bit rarer but allows a similar amount of forward and back movement.
Other designs of 3-6DOF tend to have a higher point for movement around the chest. The amount of movement depends on the actual design but includes pitch, roll, sway, surge, yaw and heave.
360 degree sims are pretty rare but do exist and can include pitch, roll and yaw axis movements.
So yes motion cancellation should affect both rotation and translation.
I released the first version with motion compensation support. It works by attaching a controller/tracker to the motion platform, and then setting the center of the motion platform in the dashboard overlay. It's largely untested because I don't have a motion platform, so it may not work at all or be buggy.
I deeply appreciate your efforts.
I have asked for VR motion sim owners with a diverse range of rigs to test and report back here: https://www.xsimulator.net/community/threads/vr-motion-cancellation-time-to-test.10241/
Hi there matzman666 !!
I have the motion cancellation up and running and it seems to work fine when sat on my chair moving the Touch controller around :)
I have now attached the controller to my 6DOF motion platform and when I start a game the motion cancellation stops working, I presume because I am using a CV1 ?
edit: Sorry I didn't even thank you for your awesome work :D I have not been able to use my sim for 2 months because of Oculus' stupid runtime updates
when I start a game the motion cancellation stops working
Does the game you are starting directly use the Oculus SDK, or is it a SteamVR game? Due to the nature of the hack it only works with SteamVR games, not with Oculus SDK games. I tried several SteamVR games (some using the standing universe, some using the seated universe, to cover all cases) and it's basically working in all of them on my system.
I presume because I am using a CV1
As long as you are running SteamVR games it should not matter whether you are using the Vive or the Rift.
Ahh I only tried Dirt Rally, and although it's through Steam it uses the Oculus SDK, as does Assetto Corsa :( I will give pCars a try and see if that works.
Is there any way you can port it into the Oculus side of it? I have no idea how it works or even if you could.
I have tried running pCars from Steam in SteamVR, and once the game starts the motion cancellation stops; when I exit the game the motion cancellation starts working again.
Is there something I should be doing because I'm using the CV1? It seems it's just cancelling it out.
I suppose Project Cars uses the Oculus SDK on your system. Does the SteamVR dashboard show up when you are in Project Cars and press the Home button? I had a look at the start options that pop up when you right-click on Project Cars in Steam, and apparently there is no noticeable difference between the SteamVR and Oculus SDK start options; both start the game with the same parameters. They seem to have no purpose.
On my system motion compensation is working in Project Cars, but I'm using it with a Vive, so it has no other option than to run with SteamVR.
Is there any way you can port it into the Oculus side of it?
Not possible for several reasons: (1) I don't own a Rift, (2) there is no documentation so I would need to reengineer the runtime, and (3) the Oculus terms of service do not allow it.
No steam dashboard whilst in game unfortunately :(
Looks like Oculus is a no-go then, bit of a downer as it does work really well in the dashboard. Those Vive users are gonna be happy though.
Thanks again for your efforts.
You could try Revive. It maps all Oculus SDK calls to SteamVR calls regardless of what headset is used, so theoretically it should also work with the Rift.
Hi, I've been following the discussion regarding motion cancellation and am working on my own motion rig. If this project is successful I will be using my Vive system. THANK YOU for doing this. Without this driver the VR motion rig would not be workable. Now to my question: When determining the offset to use why not grab the coordinates of the headset and determine the offset? Basically have the driver in the seated position, facing forward, and calibrate the offset based on the distance between the headset and the rig mounted controller. This is much like what games use now to calibrate the seating position. This method should also allow the controller to be mounted anywhere on the rig.
Using the distance from the headset to the mounted controller/tracker to calculate the offset used for motion compensation is not a good idea. The headset does not represent the center point around which the motion platform rotates (is "center point" actually the right name, or is "pivot point" a better term? If anyone knows, please tell me; it would be good to know so that I get the terminology right in the documentation), therefore any calculated offsets would be wrong most of the time.
It's better to use the center point (or pivot point), which is exactly what I am doing (you need to enter it in the dashboard overlay). You can already mount the controller/tracker wherever you want on the motion platform, and it should (theoretically) calculate the correct offset. However, due to not owning a motion platform I wasn't able to verify whether my calculations do what they are supposed to do. That's why I need people with motion platforms to test it for me.
It may also be a good idea that I document the math behind my motion compensation solution somewhere public so that other people can check it for errors or come up with optimizations. I will do that in the next few days.
This is the beauty of using the separate tracker as opposed to making calculations based on a pivot point. You don’t need to know anything about the rig when you have a tracker. In fact, using a pivot point as part of your calculations is a really bad idea, as some motion rigs don’t have a single pivot point (the one I’m buying included). As long as you know the initial x,y,z position in space of the rig tracker, its initial roll/pitch/yaw angles, and the initial x,y,z position in space of the headset, you can calculate the offset needed to reposition the headset to compensate for the rig movement. All you need to know is the distance and 3D vector to the HMD when you calibrate. The only reason to know the pivot point is if you're NOT using a tracker but compensating based on known rig movement. This is exactly how the software for the Next Level Racing rig works... you have to input the measurement between your head and the ball joint under the seat... no tracker.
Lol, you are absolutely right. Here I am solving complicated equations, while the real solution is simple and beautiful. Sigh, sometimes I just cannot see the obvious.
And you don't even need the initial position of the headset. You just need the rotation offset of the tracker relative to its initial orientation and the distance vector from the current tracker position to the current headset position. Then you can rotate the distance vector to compensate for the rotation offset, add the compensated distance vector to the initial tracker position, and voilà, you have the compensated headset position. Simple and beautiful.
It's not that my solution produces a wrong result (assuming a single rotation center), but it's unnecessarily complicated and requires more a-priori knowledge than your solution.
Haha... yeah I do that kind of stuff all the time. You will definitely need the initial distance to the headset during calibration and use that for your offset. You can't use the "current" distance between the tracker and HMD for the compensation or you will cancel out the actual headset movement of the user as well as the rig movement. You only want to compensate using the initial distance.
Ok got it working in revive and initial test is frikkin good !!!!
Just need to secure my controller better as it keeps moving and throwing the screen off a little.
Report back in a bit, with a video as well hopefully
Ok got it working in revive and initial test is frikkin good !!!!
That's good to hear!
You will definitely need the initial distance to the headset during calibration and use that for your offset.
What am I missing here?
We know the current tracker position T and its orientation QT, the current headset position H and its orientation QH, and the initial tracker position T0 and its orientation QT0. And what we want to get is the motion compensated headset position H' and its orientation Q'H.
To get H' I just need to rotate the distance vector between T and H by the rotation difference between QT0 and QT (i.e. QT0 · QT⁻¹), and then add the resulting vector to T0. Then I should be exactly at H'. And to get the motion compensated headset orientation Q'H, I just need to rotate QH by the same rotation difference.
I don't understand where I need the initial headset position here?
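For concreteness, that computation can be sketched in a few lines (a hypothetical Python/NumPy illustration of the math being discussed, not the actual driver code; quaternions are `[w, x, y, z]` and the "difference" is taken as the quaternion product QT0 · QT⁻¹):

```python
import numpy as np

def q_mult(a, b):
    # Hamilton product of quaternions [w, x, y, z]
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    ])

def q_conj(q):
    # Conjugate = inverse for unit quaternions
    return np.array([q[0], -q[1], -q[2], -q[3]])

def q_rotate(q, v):
    # Rotate vector v by unit quaternion q: q * (0, v) * q^-1
    p = np.concatenate(([0.0], v))
    return q_mult(q_mult(q, p), q_conj(q))[1:]

def compensate(T, QT, H, QH, T0, QT0):
    # Rotation that undoes the platform motion: QT0 * QT^-1
    dq = q_mult(QT0, q_conj(QT))
    H_comp = T0 + q_rotate(dq, H - T)  # compensated headset position H'
    QH_comp = q_mult(dq, QH)           # compensated headset orientation Q'H
    return H_comp, QH_comp
```

If the headset is rigidly carried along by the platform, the compensated pose comes out equal to the initial pose, which is exactly the desired behavior.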
Hi back again it works brilliantly dude :) No tracking jumps, judder or anything and I was getting thrown around quite a bit.
The only little observation that was a bit off-putting is the sides of the screens flickering where the compensation is a bit too much and the edges of the screen come into view. I presume that's because I have not got the Touch controller attached exactly where my head is; it's attached behind my head at the same level as the Rift on my face.
Is there a way to lower the compensation to stop that happening? Or do I put an offset in from the controller to my Rift?
Looky here https://youtu.be/G46rTN69IVM
Ah hah! Eureka... yes, I believe I understand. Thanks very much for the diagram. You're right, you shouldn't need any calibration as long as it always holds true that there is never a need to compensate for the distance between T and H (the tracker and HMD). Now that I think further about it, I can't come up with a reason this distance would need to be compensated. Perfect, simple, and elegant. Let me know when you're able to code it up this way and I can run some tests on the Vive, even though I don't have a true motion rig yet. It is on wheels so I can do some "manual" testing. So I'm curious how you're accomplishing this generically. Are you wedging yourself between the incoming sensor stream and the application? Thanks again!
Chris
Hi again,
So I got the coordinates of where my HMD is and I have added an offset from where the HMD is to my controller to the "DriverfromHead" offset. Is this correct or am I getting it all wrong? Should I be setting an offset for "DriverOffset"? This is just to compensate for the extra movement that the controller is getting because it's further back, so it pitches slightly more and it yaws slightly more. Surge and sway seem spot on.
@SilentChill
Looky here https://youtu.be/G46rTN69IVM
Wow, that's quite a beast of a motion platform. What are the G-forces it can generate?
The only little observation that was a bit off-putting is the sides of the screens flickering where the compensation is a bit too much
The motion compensation algorithm is not perfect yet, there's still room for improvements. E.g. I currently only adjust the headset position, but not the reported velocities and accelerations. This can mess up the pose prediction algorithm in the OpenVR runtime. The next step would be to also adjust velocity and acceleration, and see whether this fixes the problem.
I presume that's because I have not got the Touch Controller attached exactly where my head is
The placement of the controller should not matter at all, it should work nonetheless.
So I got the coordinates of where my HMD is and I have added an offset from where the HMD is to my controller to the "DriverfromHead" offset. Is this correct or am I getting it all wrong?
The "device offsets" have nothing to do with motion compensation, you should not set any of them. And there is also no need to enter any coordinates, the "motion compensation settings" do nothing at all, they are there because of a logical fallacy and will be removed in the next version.
@imagebuff
Let me know when you're able to code it up this way.
I will try to release a small update later today or tomorrow with the simpler motion compensation algorithm, and over the weekend I will work on also applying motion compensation to the velocities and acceleration values.
So I'm curious how you're accomplishing this generically. Are you wedging yourself between the incoming sensor stream and the application?
I intercept the pose updates coming from the driver before they reach the OpenVR runtime. I'm basically wedging between the device drivers and the OpenVR runtime.
Thanks for everything matzman666. I look forward to the update.
Just reading through this to understand the mechanics of it all. @SilentChill i think the edge flickering you are experiencing might be due to the Revive performance, not this input emulator. What you are referring to sounds like the Oculus SDK timewarp/spacewarp reprojection. Use a performance monitor and see if you are ending up at 45fps with reprojection instead of 90fps. This might help to do that: https://www.reddit.com/r/oculus/comments/4ovn8t/psa_how_to_see_fps_counter_in_cv1_and_other/
I released a small update with the improved motion compensation algorithm
@matzman666
Wow, that's quite a beast of a motion platform. What are the G-forces it can generate?
Enough to give me whiplash twice lol, the video really doesn't do it justice as you feel so much and it does fling you side to side.
The placement of the controller should not matter at all, it should work nonetheless.
It does work, but the Touch will be travelling a greater distance than my head will be. But if you say it's all good I'll go with you, because it's all black magic to me haha.
Good to know about the offsets I'll stop messing with them. I'll install the update tomorrow and let you know how it is.
@SmartCarrion
Just reading through this to understand the mechanics of it all. @SilentChill i think the edge flickering you are experiencing might be due to the Revive performance, not this input emulator. What you are referring to sounds like the Oculus SDK timewarp/spacewarp reprojection. Use a performance monitor and see if you are ending up at 45fps with reprojection instead of 90fps. This might help to do that: https://www.reddit.com/r/oculus/comments/4ovn8t/psa_how_to_see_fps_counter_in_cv1_and_other/
It doesn't do it all the time only when my rig is making the big movements or stuck at a big angle. I will try what you say though just to be sure, thanks
I'd also like to express my appreciation for this code! Simply fantastic.
Although just using a controller to work out the offset is a good solution I (and I suspect others) would also appreciate a method for using a simple gyroscope with a pivot point we can set on the simulator. This allows both hands in VR games (oculus doesn't have a tracking puck remember) and is cheaper than a puck or third controller. This only works for 2/3DOF sims, but these are the most common since not everyone is as lucky as SilentChill :)
Also I would appreciate if you could add the ability to change the offsets via the commandline. My sim tilts the user backwards about 30 degrees into the seat as the 'zero position', but it would be great to slowly increase the offset so the user doesn't notice this as much... this can be automated if it works via the CLI
I just released version 1.0.3 with velocity/acceleration compensation modes. They are largely untested due to me not owning a motion platform.
@SilentChill
Enough to give me whiplash twice lol
It doesn't do it all the time only when my rig is making the big movements or stuck at a big angle.
This could be related. Please try the velocity/acceleration compensation modes to see if they fix the problem.
@traveltrousers
I (and I suspect others) would also appreciate a method for using a simple gyroscope with a pivot point we can set on the simulator.
The problem is how to get the gyroscope data into the driver. For me this only makes sense when there is a standardized interface I can use to get the gyroscope data.
Also I would appreciate if you could add the ability to change the offsets via the commandline.
Device offsets are currently not applied to the motion compensation reference tracker/controller, but it is trivial to add. I read your post after releasing v1.0.3, so I can only add it in the next version. However, correctly adding offsets is not that simple. E.g. when the reference controller/tracker is not mounted at the pivot point, any rotation also causes a translation of the reference, which you need to calculate correctly to get the desired effect.
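To illustrate the effect (a toy example, not taken from the driver): if the rig rotates by a matrix R about the pivot, a reference device mounted at some other point is translated by R·(mount − pivot) − (mount − pivot) in addition to being rotated:

```python
import numpy as np

def induced_translation(R, mount, pivot):
    """Toy example: translation picked up by a device mounted at `mount`
    when the rig rotates by rotation matrix R about `pivot`."""
    offset = mount - pivot
    return R @ offset - offset
```

E.g. with a 90° yaw and a mount 1 m behind the pivot, the mount point sweeps through a large arc even though "only" a rotation happened, which is exactly why naively applied offsets go wrong.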
Hi @matzman666, thank you for implementing this feature. I'm currently working on a Unity project that requires motion cancellation, and I had been trying to implement it within the Unity project itself until I found your driver (thanks to the xsimulator guys), which works great. My problem now is that once I assign the motion compensation mode to a tracker/controller, it appears as disconnected to SteamVR, and so it does to Unity. Therefore I can't attach any object to it or use it for anything other than motion compensation. Is it possible to make the tracker/controller still visible while in this mode? Or at least to have the option to activate it?
The problem is how to get the gyroscope data into the driver. For me this only makes sense when there is a standardized interface I can use to get the gyroscope data.
If you're adding FreePIE support this would be perfect.... I can wait. It's worth pointing out that not every rift owner has touch...
Device offsets are currently not applied to the motion compensation reference tracker/controller...
Well when I alter the driverfromhead offset my sim moves in relation to it... still figuring it out but I think it might already be mostly there for my needs...
Once again, great work and timely updates are always appreciated!
I've been using my Vive controller as part of the feedback loop in my simulator, so the rotation of the controller tells my sim where it is. The control software runs in Unity3d but with the current implementation this data is no longer accessible by the native Unity.VR API or via the SteamVR CameraRig prefab.
Is it possible to make the normal data still available somehow?
I also have a suggestion to change either the icon or the colour of the controllers/trackers while they are in different device modes.
If someone wants a controller mount I uploaded mine to thingiverse : https://www.thingiverse.com/thing:2322016
Hey, just chiming in to say thank you for your development efforts.
I am just beginning my research into building a 2DOF motion platform for my Vive and flight sims, and it occurred to me that the actual motion would be a huge problem. I was almost defeated until I remembered hearing something about an OpenVR Input Emulator on reddit. Then I came here to find out it appears to work, and now I'm all excited.
I'll be providing feedback as soon as I can get the ball rolling.
Quick question: It isn't clear to me from the docs here: is motion compensation possible via two (or more) Vive Trackers instead of using the controllers?
@matzman666, you the man!
Quick question: It isn't clear to me from the docs here: is motion compensation possible via two (or more) Vive Trackers instead of using the controllers?
Yes.... well you only need one.
Hi,
is it possible to edit some ini file manually? The dashboard menu stops working if I change some values; I get the Windows message that it "has stopped working". In the program folder I found "startdesktopmode.bat", but whatever button I press in that tool's window, it closes instantly (this time without an error message). It would be nice to know where I can find an ini file. Or are those the conf files in the same folder?
I ask because, in my opinion, the problem is my PC, because I have the same issue with newer versions of Advanced Settings. Only an old version runs.
It would be nice if you could help me. Many thanks! :)
@lluisgl7 @traveltrousers
I disabled the tracker/controller used as motion reference in SteamVR because I found it annoying having the tracker/controller floating around during testing, and I didn't think that someone might find the data useful. But it's no problem to change this in the next version.
@bastiuscha
What SteamVR version are you running? The newest SteamVR Beta (the one introducing SteamVR Home) changed some things internally which causes a crash in the overlay. I added a special installer (called OpenVR-InputEmulator-v1.0.3-SteamVRHomeVersion.exe) which should fix the crash.
Oh man, for a second I was so happy. Yes, I use the beta with VR Home. I installed your new version, and at the first start it worked perfectly. Then I turned off VR in Steam, and since then I have the same issue again. I reinstalled this tool 3 times, but the issue stays. I disabled Revive and Advanced Settings... maybe they interfere... but that didn't help.
@bastiuscha
So it was working once, and then it started crashing again? That's strange. I have added a hotfix that completely disables sound, to see whether your crashes are caused by the sound issues introduced with the SteamVR Home update, or if it is something completely unrelated.
There is also a log file that may contain some relevant information. You can find it in C:\Users\
Now it works! I tested 5 times (VR mode in Steam on/off + restarting Windows). It always works! But I have to say, I also tested earlier without the beta (VR Home); that also did not work... But now all is ok. What does "disabled sound" mean? Your tool's sound? If yes: no problem... :)
Do you want a Logfile anyway?
Edit Here it is. :) https://www.dropbox.com/s/prfcixe1litoop9/VRInputEmulator.log?dl=0
@bastiuscha
Good to hear that it works now.
The overlay plays some sounds when you interact with GUI elements (e.g. clicking on a button). It's purely cosmetic and not in any way mission-critical. I did this to more seamlessly integrate the overlay into the dashboard, and also used the same sounds Valve is using in their overlays. The problem was that Valve decided to move the sound files in the newest updates, and it seems that they were not moved for all users, or were moved to different folders? With the first update I modified the sound file path to where they are now on my computer, but on some users' systems it was still pointing to the wrong path, so the safest option was to just completely disable sounds. According to your log file, the sound issue was indeed the cause of your crashes.
Ahh ok, I understand. Now I have tested it in practice... That is really awesome and brings VR + motion seat to a new level. But I have a small concern. :) Is it possible for you to implement a slider with which the user can choose an exact level, or something similar? Because I have a problem with vibrations. It would be nice if there were a slider with which the user can adjust the "blindness". I hope you can understand what I mean.
Because I have a problem with vibrations.
Yeah, vibrations are not good. They mess up the IMU data, and when they are too strong then you have also visible vibrations on the tracker/controller. Did you try the Vel/Acc Compensation modes on the motion compensation settings page? They should help to reduce the effects of vibrations on the IMU data. You could also try using vibration dampers/shock absorbers for the tracker/controller mount.
It would be nice if there were a slider with which the user can adjust the "blindness". I hope you can understand what I mean.
No, I don't quite understand what the slider should do.
@matzman666
I think he means having a sort of "threshold" slider that can be adjusted to prevent minute movements from being picked up.
Maybe a better option is a "smoothing" slider that interpolates the jittery movement into smooth, but less one-to-one, movement. Much like a mouse smoothing setting in games that is primarily meant to smooth out the movement of old low-sensitivity mice
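As a sketch of what such a smoothing slider could do (a generic one-pole low-pass filter, not anything from the actual driver):

```python
class ExponentialSmoother:
    """Simple one-pole low-pass filter.

    Hypothetical sketch of a "smoothing slider": alpha = 1 is raw
    one-to-one tracking; smaller alpha trades latency for vibration
    rejection, much like mouse smoothing in games.
    """

    def __init__(self, alpha):
        self.alpha = alpha
        self.state = None

    def update(self, sample):
        if self.state is None:
            self.state = sample  # initialize on the first sample
        else:
            self.state = self.alpha * sample + (1 - self.alpha) * self.state
        return self.state
```

Applied per axis of the reference tracker's pose, this would suppress high-frequency jitter at the cost of a slight lag, which is exactly the trade-off the slider would expose.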
@matzman666 I have tested all three modes. Set Zero and Linear Approximation are relatively immune to vibrations (not enough anyway), but I can't drive for more than a minute. The driving feeling is, for me, like being drunk. It's too hard for me. "Use reference Tracker" is good, but here the vibration sensitivity is very strong. For me maybe a "disable roomtracking" button could be enough. I searched for a tool like that, for testing, but I didn't find one.
@sandivuk
I think he means having a sort of "threshold" slider that can be adjusted to prevent minute movements from being picked up.
I didn't really have an idea... this is why I said "something similar" (ok, the word was misspelled) :) Your smoothing idea is a good lead. In the case of a vibration, the movements have fast direction changes over short distances. Maybe with this criterion there is a way to filter out the vibrations. If this introduces too big a delay, then I think we have the "drunken" problem again.
Sorry for my bad English, it is not easy for me to translate my thoughts. :)
bastiuscha look at how well you've mounted your controller and strengthen that.... You could also move it closer to the pivot point since then you'll have less movement (while keeping it near your head).
This might be a mechanical issue more than software; show us your setup....
I think he means having a sort of "threshold" slider that can be adjusted to prevent minute movements from being picked up.
This is not easily possible in the mathematical model I am using.
Set Zero and Linear Approximation are relatively immune to vibrations
This means the vibration problem is mainly caused by messed up IMU data.
(not enough anyway)
But to a certain extent the reference controller seems to vibrate itself. I can try to fix the IMU data, but when the reference controller itself vibrates, this is way harder to fix. You may need to reduce the vibration.
The driving feeling is, for me, like being drunk
So the compensation modes are messing too much with pose prediction. I hoped that the influence of the compensation modes on pose prediction would be hardly noticeable, but it seems I was wrong.
"Use reference Tracker" is good, but here the vibration sensitivity is very strong.
Is there any noticeable difference between not using any compensation mode and this mode?
For me maybe a "disable roomtracking" button could be enough.
You want only rotational tracking? This is very likely to cause motion sickness.
It seems there is no way around implementing proper filtering. Let's see if a Kalman filter can solve the problem.
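For reference, the core of a scalar Kalman filter is only a few lines (a generic textbook sketch with a constant-position model; whatever filter ends up in the driver may look different):

```python
class Kalman1D:
    """Minimal scalar Kalman filter with a constant-position model.

    Hypothetical sketch: `process_var` models how fast the true pose can
    change between samples, `measurement_var` models sensor noise
    (e.g. vibration picked up by the reference tracker).
    """

    def __init__(self, process_var, measurement_var, x0=0.0, p0=1.0):
        self.q = process_var
        self.r = measurement_var
        self.x = x0  # state estimate
        self.p = p0  # estimate variance

    def update(self, z):
        # Predict: uncertainty grows by the process noise
        self.p += self.q
        # Correct: blend prediction and measurement by the Kalman gain
        k = self.p / (self.p + self.r)
        self.x += k * (z - self.x)
        self.p *= 1.0 - k
        return self.x
```

Run per axis (and analogously for rotations), the ratio of `process_var` to `measurement_var` plays the role of the "smoothing slider" discussed above: a noisier measurement model means stronger vibration rejection but more lag.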
@matzman666 is smoothing or movement interpolation possible in your model? I'm not sure what the polling rate is like, but perhaps polling less frequently and 'guessing' the movement between one snapshot and the next could help reduce micro movement.
@matzman666
You want only rotational tracking? This is very likely to cause motion sickness.
Probably you're right, but I would be glad to test it. For me it could be a compromise, instead of having nothing. But of course, a working motion cancellation is better.
Is there any noticeable difference between not using any compensation mode and this mode?
I will check and tell you the result. I tried from the beginning only with active modes.
It seems there is no way around implementing proper filtering.
I understand that it is not easy to filter that out. I'm a developer too (only games, hehe)... and I have my problems finding a good solution in my mind.
Let's see if a kalman filter can solve the problem.
Sounds interesting! I will wait for it.
@traveltrousers
Of course it is a mechanical issue, but matzman has to expect that; I won't be the last user with vibration problems. The problem is my hardware... that is a fact. I tried different mounting places... at times I wore the controller like a necklace. :D I will try other positions and vibration absorbers.
Motion and VR complement each other perfectly, and with quality peripherals, transducers and sound it is about the ultimate seated experience.
Unfortunately the lack of a generic motion cancellation solution, subtracting the rig movement from the HMD, is the real gap, particularly for large axis movement sims: https://www.xsimulator.net/community/threads/latest-oculus-update-totally-screwed-my-tracking.9917/
It would be easy to mount VR controllers to a motion rig and have instant tracking, as they are not really needed for anything else. But there needs to be a generic way to offset the rig motion from the HMD.
The motion community would be most interested in and really grateful for a motion cancellation solution and extending the OpenVR-InputEmulator project would seem a logical possibility.
You would find plenty of willing testers with VR and 2-6+DOF motion simulators at xsimulator.net.