Closed: hollowworldgames closed this issue 4 years ago
How large exactly? When I upgraded the plugin the issue went away. It was rendering the skybox at about 50k for some reason.
@hollowworldgames Look at my example. You don't need the SteamVR plugin to be installed, only the OpenVR package from the Unity package manager. This is the cleanest way to do it IMO... even if you use the new Input API.
This is a massive pain in the ass. Wasn't input working in an earlier version too?
Yes, input was working in an earlier version and they took it away from under us.
I've had it. I'm dropping SteamVR/Vive support. I'm now recommending that customers buy Oculus or WMR headsets. Bye.
That's pretty dramatic when multiple ways of easily supporting SteamVR have been shown in this thread.
Idk, I think the two points he brought up came to the minds of many developers. Despite the effort that I and others in this thread have made, it was still in my game's best interest to 1) drop future SteamVR/Vive controller support and 2) recommend using an Oculus.
Aside from this development issue, the Vive Cosmos headsets have lacked controller support in games such as Skyrim VR (where a workaround is needed) for years. We will probably go the same route, requiring Vive users to use custom bindings.
Additionally, most of our players are using Oculus headsets anyway. So while we will keep existing content compatible for that population, new features are being developed using the XR Interaction system, and I hope that OpenVR is fully supported in the future. I'm not abandoning Vive, just not prioritizing development for it.
Or you can just get the input from the legacy input system, which works like every other input API on the face of the planet and never has these issues (for good reason). Unless I'm missing something, the OpenXR action system is over-engineering at its finest, and OpenVR for some reason thought they should copy it.
How do you reference this in another script? Using a Valve Index, I am referencing XRInput in another script and trying to pull controller values into bools as a test. None of these return true despite me having the right controller on and pressing the trigger, etc.
```
rightTriggerPressed = XRInput.ControllerState(XRController.Right).buttonTrigger.down;
test = XRInput.ControllerState(XRController.Right).buttonGrip.down;
test1 = XRInput.ControllerState(XRController.Right).connected;
```
It only works with legacy VR and 2019.4. It doesn't work with XR Management and the SteamVR store asset. I would use it, but with legacy VR I have an issue with the skydome being rendered too close, and I understand the buttons on the Knuckles are not all there or correct in legacy either.
It should work with anything as long as you remove the SteamVR plugin from Unity and restart the Editor. OpenVR only allows one input system at a time for no good reason. Look at the updated comment. I'll probably make a Git repo for this, add support for the new Input System, abstract away OpenVR's abstractions, and maximize compatibility.
Legacy VR was removed in 2020.1.
This has nothing to do with Unity. I'm using OpenVR directly, so it doesn't matter what Unity does for the most part. Unless OpenVR removes the legacy API (which hasn't happened) and Unity updates/links to that version's .lib, it will keep working.
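For anyone curious what that looks like in practice, here is a minimal sketch of polling OpenVR's legacy input directly, assuming the openvr_api.cs C# bindings (the Valve.VR namespace) are present in the project and OpenVR has already been initialized by the XR plugin:

```
using UnityEngine;
using Valve.VR; // openvr_api.cs bindings

public class LegacyOpenVRInputExample : MonoBehaviour
{
    void Update()
    {
        CVRSystem system = OpenVR.System;
        if (system == null) return; // OpenVR not initialized yet

        // Device index currently assigned the right-hand controller role.
        uint index = system.GetTrackedDeviceIndexForControllerRole(ETrackedControllerRole.RightHand);
        if (index == OpenVR.k_unTrackedDeviceIndexInvalid) return;

        // Poll the legacy button/axis state for that device.
        VRControllerState_t state = new VRControllerState_t();
        uint size = (uint)System.Runtime.InteropServices.Marshal.SizeOf(typeof(VRControllerState_t));
        if (!system.GetControllerState(index, ref state, size)) return;

        // Button presses are reported as bits in ulButtonPressed.
        bool triggerPressed = (state.ulButtonPressed & (1ul << (int)EVRButtonId.k_EButton_SteamVR_Trigger)) != 0;
        if (triggerPressed) Debug.Log("Trigger pressed (legacy OpenVR input)");
    }
}
```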
I managed somewhat to make things work in Unity 2019.4 LTS and SteamVR Plugin 2.6.0b4. I modified the code of the Interaction Toolkit so that the XR Controller takes the input from the InputDevice on all platforms, with the exception of SteamVR, where it takes the input from SteamVR_Action. I can't share the code here because I'm using it for a commercial project, but if someone wants more details, I'm open to sharing them.
I mean, I made a hack that works for the usual cases (interactor/interactables and detection of grip/trigger/etc.), not in all cases; it is still a hack and it is quite dirty... but it lets me move forward with my job. I tested builds for PC (OpenVR) and also for Quest (Oculus-Android), and input works in all cases, but you must build for x64 on PC.
Any tips for modifying the XR Interaction toolkit to take other inputs?
I hacked the XR Interaction Toolkit to take SteamVR inputs by adding a SetInputOverride function to InputHelpers.cs. Here's my modified file: https://pastebin.com/6WD9eN6D
So anything that uses InputHelpers in the toolkit should work. You just grab controller inputs from the SteamVR plugin and pass them to InputHelpers through that added function. When InputHelpers fails to get the input from XR, it falls back to the values you provide from SteamVR.
I'm using a different implementation for input. When calling public static void SetInputOverride(this InputDevice device, string inputName, object value), what do I pass for 'object value', and is there any dictionary for the 'value', 'inputDevice', and 'inputName' entries so I can be sure I am passing the right things? For inputDevice I am passing the UnityEngine.XR.InputDevice, but inputName and value are a bit vague to me!
Thanks for the help.
Look at the s_ButtonData array for the input name strings it's looking for. The value is simply the actual current value of the input. It can be either a bool, a float, or a Vector2, which is why it's of type object, so it can take all three.
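To make that concrete, here is a rough sketch of the feeding side. SetInputOverride is the extension added in the modified InputHelpers.cs above; the SteamVR actions used (GrabGrip and Squeeze from the default action set), the namespace, and the name strings passed in are placeholders you would swap for your own bindings and for whatever s_ButtonData actually contains.

```
using UnityEngine;
using UnityEngine.XR;
using UnityEngine.XR.Interaction.Toolkit; // assumed namespace of the modified InputHelpers.cs
using Valve.VR;

public class SteamVRInputFeeder : MonoBehaviour
{
    InputDevice m_rightHand;

    void Start()
    {
        m_rightHand = InputDevices.GetDeviceAtXRNode(XRNode.RightHand);
    }

    void Update()
    {
        // Read current values from the SteamVR plugin's action system...
        bool gripPressed  = SteamVR_Actions._default.GrabGrip[SteamVR_Input_Sources.RightHand].state;
        float triggerAxis = SteamVR_Actions._default.Squeeze[SteamVR_Input_Sources.RightHand].axis;

        // ...and push them into the modified InputHelpers every frame.
        // The name strings must match entries in s_ButtonData; the ones used
        // here are placeholders, so check your copy of the file for the real names.
        m_rightHand.SetInputOverride("GripButton", gripPressed);
        m_rightHand.SetInputOverride("Trigger", triggerAxis);
    }
}
```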
Hello all, in the end I stripped away only the code of the hack and made this guide on how I solved it. I wanted to make the XR Interaction Toolkit + Unity XR Plug-in Management work with SteamVR, so that I could make a program cross-platform between Vive, Rift, and Quest.
You can find the guide on my blog: https://skarredghost.com/2020/09/25/steamvr-unity-xr-interaction-toolkit-input/ I hope it can be useful to some of you!
@TonyViT
First of all, thank you for this wonderful article. I struggled with the SteamVR implementation and now it's working. I have done every step of your guide; normal object grab is working fine, plus pointer grab and UI element interaction. But something is not working correctly. I'm using the HTC Vive Pro Eye. (Quest is working perfectly with your "hack".) For example:
```
private void Start()
{
    rightController = InputDevices.GetDeviceAtXRNode(XRNode.RightHand);
}

public bool CheckIfActivated(XRController controller)
{
    controller.inputDevice.TryGetFeatureValue(CommonUsages.grip, out pressTreshold);
    Debug.Log("Grip: " + pressTreshold);
    controller.inputDevice.TryGetFeatureValue(CommonUsages.trigger, out pressTreshold);
    Debug.Log("Trigger: " + pressTreshold);
    if (pressTreshold > activationThreshold)
        return true;
    else
        return false;
}
```
Could you help me figure out what I'm doing wrong? Is it also possible to get the MenuButton and SecondaryButton (below the touchpad) to work? Big thanks in advance!!!
Hey @Schumsta, glad that the hack worked out for you... at least in part. :)
Most probably, to make things work completely, you have to modify the code of my hack.
First of all, with the hack, don't use InputDevices.GetDeviceAtXRNode(XRNode.RightHand) anymore; instead take a direct reference to the XRController and access the InputDeviceWrapper through it. The problem is that it is exactly the inputDevice class that can't access SteamVR, so you should go through my hack and use the InputDeviceWrapper via XRController.inputDevice to make things work on SteamVR as well.
Then, you see all those problems because I implemented a naive hack, with only the basic features. The basic features are the ones provided out of the box by the standard SteamVR Input System. At a certain point in the hack, I made you do "save and generate" on the input system without changing anything. But those default settings have just a dull trigger that only has on/off states, for instance, and so does the thumbstick. What you should do is:
```
public bool TryGetFeatureValue(InputFeatureUsage<Vector2> usage, out Vector2 value)
{
#if UNITY_STANDALONE
    if (m_isSteamVR && m_deviceNode.IsHands())
    {
        if (usage == CommonUsages.primary2DAxis)
        {
            value = SteamVR_Actions._default.Thumbstick[m_deviceNode.ToSteamVrSource()].axis2D;
            return true;
        }
    }
#endif
    return m_inputDevice.TryGetFeatureValue(usage, out value);
}
```
This way you don't get only discrete on/off values; you get the continuous values from the action that you have created.
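For the trigger, an analogous overload for continuous values could look like the sketch below, to be added next to the Vector2 one in the same wrapper class. It assumes you created a single-axis action, here called Squeeze as in SteamVR's default action set.

```
public bool TryGetFeatureValue(InputFeatureUsage<float> usage, out float value)
{
#if UNITY_STANDALONE
    // Sketch: assumes a single-axis "Squeeze" action (as in SteamVR's default
    // action set) was added alongside the Thumbstick action used above.
    if (m_isSteamVR && m_deviceNode.IsHands())
    {
        if (usage == CommonUsages.trigger)
        {
            value = SteamVR_Actions._default.Squeeze[m_deviceNode.ToSteamVrSource()].axis;
            return true;
        }
    }
#endif
    return m_inputDevice.TryGetFeatureValue(usage, out value);
}
```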
I hope it is clear, otherwise tell me.
Thanks for your fast response. I got it to work with the touchpad input as a Vector2, but I don't know what I'm doing wrong with the SystemButton (below the touchpad). It's a normal click event (SteamVR Input) mapped to a bool:
```
public bool TryGetFeatureValue(InputFeatureUsage<bool> usage, out bool value)
```
But I don't get any values from the SystemButton... the button is false the whole time. Big thanks again in advance!
Tony!!!! You're awesome, man. Thank you very much. It works great for me and the test project I'm doing. I'm really glad I anticipated potential issues by doing a pre-prod test project, as this is a REALLY big issue.
Hello, any news on this issue? I'm developing cross-platform VR games and I don't want to use the SteamVR Input plugin. The previous versions of the plugin were OK with the new Input System; why have you removed its support? I understand OpenXR is great, but it's not in Unity yet.
@zite so you closed https://github.com/ValveSoftware/unity-xr-plugin/issues/9#issuecomment-700149945 because it is a duplicate of this issue. But why on earth is this issue closed then?
A single input system is definitely the goal. We'll either support or work on a plugin that enables VR developers to target one cross-platform API. But it won't be in this plugin.
@zite Hi, is this still the case? If so any word on when this might be available? Thanks!
There has finally been some official communication regarding OpenXR support in Unity: https://forum.unity.com/threads/unitys-plans-for-openxr.993225/ TL;DR: "We plan to have early previews of Unity’s support of OpenXR on some of these platforms as early as the end of this year (Unity 2020 release cycle)."
About time!! Thanks for sharing!
I'm probably late to the party and just echoing what others have said, but if the aim is all about cross-platform support, then please just support the Unity XR input APIs (https://docs.unity3d.com/Manual/xr_input.html). This is what cross-platform support looks like. Right now I have to grab another 3rd-party plugin (not even served by the package manager, bloating the repository) and pipe that input into another layer so that my app has a common API. The whole purpose of the Unity XR plugin ecosystem was to avoid doing exactly that. The common API should be at the Unity engine level, not at the OpenVR plugin level.
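For reference, the cross-platform pattern being asked for looks roughly like this minimal sketch, using only the documented UnityEngine.XR API; whether a given usage returns data still depends on the active XR provider.

```
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR;

public class XRInputApiExample : MonoBehaviour
{
    void Update()
    {
        var rightHandDevices = new List<InputDevice>();
        InputDevices.GetDevicesWithCharacteristics(
            InputDeviceCharacteristics.Right | InputDeviceCharacteristics.Controller,
            rightHandDevices);

        foreach (var device in rightHandDevices)
        {
            // These usages are defined by Unity, not by any vendor plugin;
            // whether they return data depends on the active XR provider.
            if (device.TryGetFeatureValue(CommonUsages.triggerButton, out bool pressed) && pressed)
                Debug.Log(device.name + ": trigger pressed");

            if (device.TryGetFeatureValue(CommonUsages.primary2DAxis, out Vector2 stick))
                Debug.Log(device.name + ": primary 2D axis " + stick);
        }
    }
}
```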
Just chiming in that support for the new Input System would be great. Since I think previous versions of this plugin did support it, having it enabled for a stable release would be awesome, as newer plugins could be pretty far off.
The other solution seems to be writing a wrapper for SteamVR's input, which doesn't sound super appealing.
It's 2021 and I still need this system before I switch to Unity 2020. Do you guys need any help on implementation?
Use Unity OpenXR (not OpenVR) plugin
This is just ridiculous; both you and Unity are just too proud to do a quick solution. It was working before, and it's just detecting the clicks of simple buttons. You could add a custom Unity input device wrapper that reads from your old input. This is taking more than a year already, and where's your beautiful OpenXR solution? Anyway, I know you won't even read this comment. I hope you solve it this year.
Are you referring to this? https://docs.unity3d.com/Packages/com.unity.xr.openxr@0.1/manual/index.html
In the documentation it says that it supports WMR and HoloLens 2 only...
It seems to still be in early beta, but it's something. I tried it, and the controller models have an offset compared to the old system.
This has more info about the current state: https://forum.unity.com/threads/unity-support-for-openxr-in-preview.1023613/
It does support both SteamVR and Oculus as OpenXR runtimes, but is in preview and still has a bit to go.
Here is an agnostic solution for others running into this: https://github.com/VRStudios/Unity3D-XRInput
To clarify, is this to be used together with this repo (Valve's OpenVR plugin) and also the Oculus XR plugin at the same time?
Yes, it's to be used with this repo. Because Valve doesn't support Unity's generic/agnostic input system like Oculus does, it exposes either OpenVR's existing input system (which is modeled correctly and has been around since day one) or Unity's input system through its own generic/agnostic layer.
In short, enable SteamVR/OpenVR input when publishing to the Steam store, or disable it and it falls back to Unity's input system, which supports Oculus, etc.
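As a rough idea of what the calling side looks like with that layer, here is a small sketch based only on the XRInput surface quoted earlier in this thread; the member names and the namespace are assumptions taken from that snippet and may differ in the actual package.

```
using UnityEngine;
using VRstudios; // assumed namespace of the Unity3D-XRInput package

public class AgnosticInputExample : MonoBehaviour
{
    void Update()
    {
        // The same call works whether the package routes input through
        // OpenVR (Steam builds) or through Unity's own XR input backend.
        var right = XRInput.ControllerState(XRController.Right);
        if (!right.connected) return;

        if (right.buttonTrigger.down) Debug.Log("Right trigger pressed");
        if (right.buttonGrip.down) Debug.Log("Right grip pressed");
    }
}
```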
I see, so that means I have to manually enable/disable it based on the platform (Oculus or Vive) I'm running it on, correct? The same build won't work on both platforms automatically until Valve gets around to fixing inputs in this plugin and we no longer need to use your repo. Is my understanding of the situation right?
Thanks loads for sharing your solution, by the way. I'm desperately trying to figure out what I'm supposed to use right now to support both Oculus and Vive, since none of the OpenXR stuff seems to be ready.
Yes, that's correct. While I could update it and add support for auto-detecting which platform Unity has initialized, it's actually a bad idea in general. If, for example, you have the Oculus store/services running but try to play a game on an Oculus headset through SteamVR, Unity gets very confused and something in OpenVR's input layer stops working. This situation is actually rather common, so it's best to make builds and publish for a specific platform. Compile-once-run-everywhere is a failed concept in general for a lot of situations, IMO.
If you're looking for cross-platform support I would take a look at Unity's OpenXR plugin. That supports both Oculus and SteamVR in one plugin. It is still in development, but there's a preview linked above: https://forum.unity.com/threads/unity-support-for-openxr-in-preview.1023613/
Also, I will add OpenXR support to the agnostic input layer later, which avoids needing to change game code yet again for a new input system. Native OpenXR input solutions are going to be far more verbose anyway.
The point of OpenXR is to be a singular platform-agnostic layer. Otherwise people like you (and me, at a previous company) have to continuously update their abstraction layer whenever any of the supported plugins changes. If you happen to stop updating it for some reason, it may stop working (like mine did). All the major VR companies are moving towards OpenXR to solve these problems.
Also, if you utilize legacy input as you suggest, finger tracking data will not be available to you, as those functions require SteamVR Input.
But of course, you're welcome to make your own thing if our options aren't working for you.
Naturally this would be the ideal plugin to use. But it's been released for a month and is still in preview only (Vive and Oculus aren't even considered officially supported devices, you can't deploy to the Quest, and users are still reporting all sorts of bugs on the forums), so it's barely usable. Unfortunately, I need to start a project right now and don't have the luxury of waiting for OpenXR to mature.
Ah, that is unfortunate. Hopefully those issues will be resolved soon.
Are there still plans to support controller inputs with this Unity XR Plugin from Valve? Or are we just going to have to wait for Unity's OpenXR Plugin to mature?
The point of OpenXR is to be a singular platform agnostic layer.
This is the argument people made for OpenGL or Vulkan. It simply doesn't hold true; of course, it depends on your targets.
Not everyone uses OpenXR, Pico for example, and if you're on Unity LTS, OpenXR isn't a solution either. Also, OpenXR is not structured in a simple way from what I've seen (unless I'm mistaken, it's over-engineered to solve problems people don't have, such as rigging structures, etc., which are better suited to frameworks). Input APIs have been done correctly since the 80s (button bit-fields and axis values, with extra metadata if needed), so I don't know what people are thinking with this new stuff, to be frank.
Also, I made this specifically because people at work were complaining so much about how bad the input options are, so I figured I'd just share what helped them.
I'm not sure how far along they are, but it looks like Pico is at least working on OpenXR support: https://sdk.picovr.com/docs/OpenXRMobileSDK/en/index.html
There is certainly UX work to be done around making OpenXR input easy to use, but we believe it solves a lot of issues that developers have had. Namely: new controllers with different layouts coming out, reduced development time from developing against a single API, and input actions that let users control how they interact with your game (accessibility, preference, a half-broken controller, etc.). That said, it is quite different from what we've been doing for decades, and changing something like that takes a bit of iteration. Hopefully you'll take a look again once Unity's implementation has matured.
Does that mean we shouldn't expect any updates to this particular repo to support inputs in the near future (next 6 months)? Should I just stick to looking out for Unity's implementation to mature?
While I do get hand positions and controllers, all the buttons are false or 0. I am using Unity 2020.1 16b and beta 7. I am testing with an HTC Vive with wand controllers, and SteamVR 1.13.10.