Closed: BenWoodford closed this issue 7 years ago.
> It also brings up another interesting point - does the SDK system allow the individual SDK to announce what it provides so that it only shows up under the relevant SDK dropdowns? I haven't thoroughly looked into @bddckr's discovery update yet so perhaps that handles it by default, but if not then it may be something to look at.
That's already handled. We have Base SDK classes you inherit from, so if you subclass SDK_BaseController, that only means you're writing a controller SDK backend.
The SDK manager handles that. Go and remove/comment out something like the SteamVR Headset for example, then look at the SDK manager. It won't show SteamVR in the Headset dropdown and also won't show it in the Quick Select dropdown. (It will also not change your existing usage of that SDK if you happen to have specified it earlier.) It even explains why it does that:
Therefore supporting Locomotion Providers is as easy as defining a new type, SDK_LocomotionProvider or something, then updating the SDK manager and associated editor. 😄
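Purely as a sketch of that shape (the type and member names below are made up for illustration, not existing VRTK API):

```csharp
using UnityEngine;

// Hypothetical base class a locomotion SDK backend would inherit from,
// mirroring the existing SDK_Base* pattern. Names are illustrative only.
public abstract class SDK_BaseLocomotion
{
    // Direction and intensity of the player's intended movement.
    // Whether this should be normalised or not is an open question.
    public abstract Vector2 GetMovement();

    // Whether the hardware is currently reporting any locomotion input.
    public abstract bool IsActive();
}
```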
Oh very nice, thinking ahead :D
In which case, the discussion is simply - what should a locomotion provider actually provide? I'm still thinking this:
Hey, very cool that you're so open to this. What you propose sounds pretty good to me; it's basically what we'd need.
Many games that support gamepads don't actually use the analog value for speed, but have a dedicated button for sprinting. From a conceptual standpoint it IMHO makes a lot more sense to go with the analog value, but I just wanted to point this out. It's easy enough to implement in a translation layer if people actually want that.
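For illustration, a translation layer like that could be as small as this (the type, method, and parameter names are made up, not anything in VRTK):

```csharp
using UnityEngine;

public static class SprintButtonTranslation
{
    // Collapse an analog movement vector into two speeds (walk or sprint),
    // for games that prefer a dedicated sprint button over analog speed.
    public static Vector2 ToButtonStyle(Vector2 analogMovement, bool sprintPressed, float walkFraction = 0.5f)
    {
        if (analogMovement == Vector2.zero)
        {
            return Vector2.zero;
        }
        return analogMovement.normalized * (sprintPressed ? 1f : walkFraction);
    }
}
```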
In a second step, it would also be interesting for us to get the player orientation, i.e. have some kind of communication stream in the other direction. This way we can adjust the direction if e.g. the player looks to the right while walking straight ahead. I'm not really familiar with the SDK, but I assume it's somehow possible to access the headset from the potential LocomotionProvider?
Just as a side note, there are other run-in-place (RIP) solutions to look at, like Pocket Strafe.
There will probably be some variance in how SDKs work. Some may handle their own direction relative to where you look while others don't. Some may have variable magnitudes depending on how fast/high/hard you step; some may only have a single magnitude.
So we should make sure whatever SDK locomotion API we add has enough methods to support the different possible types of implementations.
Actually, a side topic I just realized: we're talking about run in place style motion hardware right now, but omni-directional treadmills are also a good candidate for a locomotion SDK. Those, rather than using an HMD relative direction, will have their own coordinate space and an absolute movement direction vector.
Bi-directional communication is an interesting point. I would assume it wouldn't be too much bother, but it may go against some ethos regarding VRTK's SDK layer, as I don't think anything does that currently.
> Some may handle their own direction relative to where you look while others don't. Some may have variable magnitudes depending on how fast/high/hard you step; some may only have a single magnitude.
Yeah, that's fortunately easy to sort with the Vector2, as those binary standing/moving ones can just use 0 and whatever value (we were debating in Slack about whether it should be normalised or not).
I'll pose it here: should the direction vector be normalised? And if so, should we have a second realDirection that provides the direction/speed in m/s?
> Bi-directional communication is an interesting point. I would assume it wouldn't be too much bother, but it may go against some ethos regarding VRTK's SDK layer, as I don't think anything does that currently.
I think there's no problem with that. There's no general design decision forbidding the SDKs from calling back or anything like that, AFAIK.
> I'll pose it here: should the direction vector be normalised? And if so, should we have a second realDirection that provides the direction/speed in m/s?
I think normalizing would be a bad idea. I think it's totally possible for a LocomotionProvider to figure out a way to distinguish between walking and running, for example. Instead of splitting the direction vector like you suggested, we can just normalize it wherever we need to.
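To make that concrete, consumers would split the raw vector themselves wherever they need a pure heading; a small sketch (nothing here is existing VRTK API):

```csharp
using UnityEngine;

public static class LocomotionSampling
{
    // Split an un-normalised provider vector into a heading and an intensity,
    // so providers never have to normalise on their side.
    public static void Split(Vector2 movement, out Vector2 heading, out float intensity)
    {
        intensity = movement.magnitude;
        heading = intensity > 0f ? movement / intensity : Vector2.zero;
    }
}
```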
Where does that leave the binary stand/moving locomotion providers though? I guess they'd just have to provide a reasonable "moving" magnitude? And then if a game wants to speed those up they'll have to deal with it there and then.
How about some kind of LocomotionAdjuster that allows adjusting the things the various LocomotionProviders offer? I think you'll want to adjust things anyway, regardless of whether the provider is binary or analog.
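Something like this is what I'd picture for it, just as a sketch (LocomotionAdjuster and its fields are hypothetical):

```csharp
using UnityEngine;

// Hypothetical layer between a locomotion provider and the movement code
// that rescales and caps whatever the SDK reports.
public class LocomotionAdjuster : MonoBehaviour
{
    [Tooltip("Multiplier applied to the raw provider output; 1 means unchanged.")]
    public float multiplier = 1f;

    [Tooltip("Cap on the adjusted magnitude; binary providers get a sane 'moving' value from this.")]
    public float maxMagnitude = 1f;

    public Vector2 Adjust(Vector2 rawMovement)
    {
        return Vector2.ClampMagnitude(rawMovement * multiplier, maxMagnitude);
    }
}
```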
That could work yeah
> I'll pose it here: should the direction vector be normalised? And if so, should we have a second realDirection that provides the direction/speed in m/s?
I'm starting to think we may want 2 separate modes/vectors. Move in Place based SDKs and treadmills actually provide fairly different locomotion data.
Treadmills report an absolute movement vector in their own coordinate space (Vector3.zero when the player isn't moving), while Move in Place hardware maps more naturally onto VRTK_MoveInPlace style locomotion. To that end I believe locomotion SDKs should be capable of providing data for 2 modes with variable information, and I believe that Move in Place hardware locomotion should be integrated with a sister class to our existing VRTK_MoveInPlace class.

For treadmill locomotion I'd want a maxSpeed field to cap the speed you can sprint at, and perhaps a multiplier that always defaults to 1f but can be used to magnify motion in a way similar to the room extender. The handling class, like the VRTK_MoveInPlace class (both will inherit from a base class where things they have in common will be moved to), will be responsible for actually moving the player.
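Just to illustrate the treadmill handling side (class and member names invented here, not VRTK API):

```csharp
using UnityEngine;

// Hypothetical handler for treadmill mode: caps sprint speed with maxSpeed and
// magnifies motion with a multiplier that defaults to 1f, like the room extender idea.
public class TreadmillLocomotionHandler : MonoBehaviour
{
    public float maxSpeed = 3f;      // cap, in metres per second
    public float multiplier = 1f;    // 1f = no change

    public Transform playArea;       // the play area / camera rig to move

    // Would be fed the treadmill SDK's absolute movement vector each frame.
    public void Move(Vector3 absoluteMovementPerSecond)
    {
        Vector3 velocity = Vector3.ClampMagnitude(absoluteMovementPerSecond * multiplier, maxSpeed);
        playArea.position += velocity * Time.deltaTime;
    }
}
```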
For MiP mode, the SDK would provide a direction using the existing DirectionalMethod handling options that VRTK_MoveInPlace has (direction based on gaze, controller average, or a decoupling method) and a magnitude between 0f (full stop) and 1f (running at top speed). The MiP sibling class will use that data to move the player based on the existing speedScale, maxSpeed, and deceleration values.

I suggested a sibling class to VRTK_MoveInPlace because there is heavy overlap between hardware based Move in Place and our software based Move in Place implementation. Hardware SDKs calculate it with hardware outside the VR SDKs (sensors measuring physical movement of the feet or legs), while our software based implementation works with data it gets from the headset/controller SDKs (HMD headbob or controller armswing). However, there are also differences: hardware SDKs don't need to bother with the activation buttons that the software MiP needs, and the software always requires a direction method and a selection of control options. So it doesn't make direct sense for the locomotion SDK to directly affect VRTK_MoveInPlace, or to require VRTK_MoveInPlace, which would demand a game give up an activation button even if it didn't want the headbob/armswing based locomotion.
However, there is one slightly awkward way we could implement this with a single VRTK_MoveInPlace class. ;) Making our headbob/armswing locomotion code a locomotion SDK itself:

- VRTK_MoveInPlace would implement Move in Place handling for MiP mode locomotion SDKs.
- The SDK would define an engageButton (we'll find a way to say this doesn't have to be defined if you don't want to support SDKs that require one).
- VRTK_MoveInPlace would remain responsible for the speedScale, maxSpeed, deceleration, fallingDeceleration, and smartDecoupleThreshold options.
- The SDK would expose methods that VRTK_MoveInPlace should call when the activation button is (de)pressed.
- There would be a sensitivity option for this SDK.

This might be a little awkward because, to my knowledge, we don't have any SDK implementations that are themselves directly using the other SDKs, and because, to my understanding, we haven't had to provide any global VRTK options for the other SDKs. We only have the SDK selector.
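To sketch what "2 modes with variable information" could look like (all names invented for illustration, not existing VRTK API):

```csharp
using UnityEngine;

public enum LocomotionMode
{
    MoveInPlace,   // MiP hardware: relative intensity, direction resolved by the handler
    Treadmill      // omni-directional treadmill: absolute vector in its own space
}

// Hypothetical two-mode locomotion SDK surface.
public abstract class SDK_BaseLocomotionTwoMode
{
    public abstract LocomotionMode Mode { get; }

    // MiP mode: 0f = standing still, 1f = running at top speed.
    public virtual float GetIntensity() { return 0f; }

    // Treadmill mode: absolute movement, Vector3.zero when the player isn't moving.
    public virtual Vector3 GetAbsoluteMovement() { return Vector3.zero; }
}
```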
Personally I don't really see much point in conflating WIP (walk in place) with treadmill locomotion; everything will be different other than the one line that applies the motion, and potentially your max-speed settings.
A WIP script should only have 3 dependencies: the positions of the head and hands, and a reference to the playspace. All of those are already available through the SDK_Bridge, so I'm not sure what kind of direct support is really required.
We have our own solution called "FreeRunVr". It has 3 modes (Couch, Hybrid, and Full Walk), and we just stick it on the VRTK root and let it do its thing. It measures the hands/head each frame and applies motion to the playspace.
I would just allow the user to attach a WIP script if they want, and if they want to support some treadmill in the future, that would be a separate script that they toggle off/on depending on detected hardware.
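A sketch of that toggle approach (the detection call is a placeholder; each vendor SDK would have its own):

```csharp
using UnityEngine;

// Enables either the WIP script or a treadmill script based on detected hardware.
public class LocomotionSelector : MonoBehaviour
{
    public MonoBehaviour walkInPlaceScript;   // e.g. a FreeRunVr-style WIP component
    public MonoBehaviour treadmillScript;     // a future treadmill-backed component

    private void Awake()
    {
        bool treadmillDetected = DetectTreadmill();
        walkInPlaceScript.enabled = !treadmillDetected;
        treadmillScript.enabled = treadmillDetected;
    }

    private bool DetectTreadmill()
    {
        // Placeholder: real detection would query the treadmill vendor's SDK.
        return false;
    }
}
```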
Candidate for the InputAction/Action<> stuff
VAY VR popped into Slack General to enquire about integrating their locomotion system into VRTK, which brought up an interesting point.
Alongside controller, boundary, etc., the SDK system should probably support locomotion SDK providers as well.
What that would provide is open to interpretation (probably just a vector for movement, perhaps jump and crouch 'buttons', as some systems may incorporate those at some point).
It also brings up another interesting point - does the SDK system allow the individual SDK to announce what it provides so that it only shows up under the relevant SDK dropdowns? I haven't thoroughly looked into @bddckr's discovery update yet so perhaps that handles it by default, but if not then it may be something to look at.
SteamVR, for instance, would be everything but locomotion; VAY VR would only be under the locomotion dropdown. The Simulator would be all of them, as you could then add a locomotion provider for an analogue stick for basic movement.
Thoughts?
CCing VAY VR's Sven for input - @svenstucki