ExtendRealityLtd / VRTK

An example of how to use the Tilia packages to create great content with VRTK v4.
https://www.vrtk.io/
MIT License

Allow Controller to directly interact with Unity UI elements #639

Closed: getwilde closed this issue 7 years ago

getwilde commented 7 years ago

Allow users to touch, press, and interact with GUI elements (such as buttons) via controller meshes... as an alternative to laser pointer.

Background info: As discussed on Slack, the emerging trend seems to be that if elements are within arm's length, the user ought to be able to poke at and otherwise interact with them using their finger/hand (or similar custom controller mesh), rather than a twitchy laser pointer. (I've seen the laser pointer cause confusion in my own usability testing.) Also, it was mentioned that hover events perhaps ought to be triggered when the controller is within a couple of centimeters, and @thestonefox suggested that a spherecast or capsulecast be used around the controller.

wildstyle007 commented 7 years ago

This would be great -- I've been considering this the last few days also, as I'd like to offer both options with my main UI.

So for example, could a small collider be attached to the index finger of the hand model and used to trigger Unity UI on collision? Would that be possible, do you think? I suppose it's like being in permanent click mode; there would be no hover state... which is possible with the laser pointer.
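For illustration, here's a minimal sketch of that idea, assuming the UI element also carries a trigger collider so physics can detect the overlap (the class name and setup are made up, not existing VRTK API):

```csharp
using UnityEngine;
using UnityEngine.UI;

// Hypothetical sketch: a small trigger collider on the fingertip that "clicks"
// any Unity UI Button it touches. The Button's GameObject needs a collider too,
// and at least one side of the pair needs a Rigidbody for trigger events to fire.
public class FingertipUIPoker : MonoBehaviour
{
    private void OnTriggerEnter(Collider other)
    {
        // Look for a Button on the object we poked (or one of its parents).
        Button button = other.GetComponentInParent<Button>();
        if (button != null && button.interactable)
        {
            // Fire the same event a mouse click would.
            button.onClick.Invoke();
        }
    }
}
```

As noted, this is effectively permanent click mode: there's no hover state, and any accidental brush would fire the button.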

getwilde commented 7 years ago

Hover state is more-or-less a freebie in Unity GUI, so it might be worth leveraging? I guess it depends on how we think button presses should work...

My personal thought is that it's probably desirable to have the GUI button press occur on trigger press. I know that's contrary to real world, but it essentially eliminates any chance of accidental GUI button presses. In the real world, you can brush against an elevator button without actually pressing it. But what's an equivalent in VR? I really don't want my user moving her hand and accidentally triggering everything she passes through (and then becoming terrified to move her hands lest she hit something).

So... maybe the user puts her index finger on/in/through the GUI button -- as though she's touching it -- and then presses the trigger to press the button. In my mind's eye, that feels appropriate. Incidentally that's also the same essential action as Grabbing and Using in VRTK... And in fact, once that's working, all of those actions could be set to the Trigger button, even further reducing the barrier to entry for non-gamer folks.

Just thinking out loud....

wildstyle007 commented 7 years ago

@getwilde -- your comments make sense. Agree that it may be better to use the controller trigger press to actually initiate the interaction.

Where things start getting a little confusing (for me at least) is when, say, you are holding something in your right hand and need to interact with an object you're holding in your left hand. I have this use case in my project now: the user is spraying on a wall and wants to change the color, for example. In this instance, perhaps the laser pointer is still the best option.

getwilde commented 7 years ago

I've been thinking more about this interaction. Today Nathie released a review of London Heist, showing some interactions with buttons (GUIs?). Two things I noticed: 1) the hand model is aware that it's close to a button/GUI element and changes pose to a finger point, 2) the user presses the trigger to actually interact with the button... the same way as grabbing an object.

I think this concept is brilliant. It's straightforward and elegant, and requires only one button to point/poke/jab UI elements as well as grab objects. I think we'll start to see this mechanic more often.

https://gfycat.com/ResponsibleUnacceptableAmericanbadger https://gfycat.com/CreativeEmbellishedAstrangiacoral

Blueteak commented 7 years ago

+1 for this type of interaction system.

Would really allow for some more VR-centric interfaces while still leveraging all the strengths of the Unity UI system.

getwilde commented 7 years ago

Just found "Thread Studio" by Shopify. They did such a good job with it. Same idea... the mesh hand pose changes based on proximity to the type of control (ie "neutral" versus "ready to grab" versus "ready to poke"). There's a "hover" or "highlight" state when the controller is colliding, but before the trigger is pressed. And then to actually grab or poke, the trigger is used.

https://gfycat.com/BareAnyImperialeagle

(Not shown in this GFY, but you can hold the card deck in one hand while you point at it with the other hand. It all feels very natural.)

thestonefox commented 7 years ago

So essentially, we can say:

getwilde commented 7 years ago

I think it's like this:


Incidentally... for those last two, InteractableObject already behaves similarly:

cameronoltmann commented 7 years ago

Yes, this would be very useful. @getwilde's last comment makes sense to me as far as functionality is concerned. Using a customizable-radius OverlapSphere (with the option to set a source transform) for Adjacent, or something similar, would probably work well.

Would probably also make sense to add the Adjacent functionality for VRTK_InteractableObject as well.
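A rough sketch of what that Adjacent check could look like, assuming a configurable radius and an optional source transform (the class and field names are hypothetical, not existing VRTK API):

```csharp
using UnityEngine;

// Rough sketch of the suggested "Adjacent" check: an overlap sphere with a
// configurable radius and an optional source transform (e.g. the fingertip).
public class AdjacentDetector : MonoBehaviour
{
    public Transform sourceTransform;    // falls back to this object's transform if unset
    public float radius = 0.02f;         // a couple of centimetres
    public LayerMask interactableLayers = ~0;

    public bool IsAdjacentToAnything()
    {
        Vector3 origin = (sourceTransform != null ? sourceTransform : transform).position;
        // OverlapSphere returns every collider within the radius on the given layers.
        Collider[] hits = Physics.OverlapSphere(origin, radius, interactableLayers);
        return hits.Length > 0;
    }
}
```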

cameronoltmann commented 7 years ago

PS: OMG YES DO IT NAO!!!!

wildstyle007 commented 7 years ago

@thestonefox -- are you planning to tackle this feature, any idea if / when you'll get started? Let me know if you need any support... I will if I can.

thestonefox commented 7 years ago

I am planning on doing it, but I can't say when. Hopefully soon.

The problem with it being a hobby project is I never can guarantee time working on things.

wildstyle007 commented 7 years ago

OK, thanks for confirming -- in terms of difficulty, how hard do you think this feature is? Reason I'm asking is that I'm getting close to needing this feature in my current project as well, so am wondering if / how I can help.

tntfoz commented 7 years ago

This sounds fantastic and pretty much ideal for my anticipated usage case.

If possible could the trigger and hover features (mentioned by @getwilde) be optional? If the hover isn't required, could just touching an object actually action the click event (rather than pressing the trigger as well)? For push button interfaces (think a calculator) that would be a more natural fit for VR first-timers than having to click the trigger too.

One last request is the ability to be able to drag & drop interface elements onto others (with a snap to position mechanic) that fires events too. Is that do-able in this PR?

thestonefox commented 7 years ago

@tntfoz raise a different issue for dragging and dropping as it's a separate thing.

tntfoz commented 7 years ago

Okay bud shall try that now...

thestonefox commented 7 years ago

I'm not entirely sure the best way of doing it yet. I need to put my thinking cap on

thestonefox commented 7 years ago

I'm going to try and put some time aside to look at this next week

getwilde commented 7 years ago

Thanks @thestonefox. Can't wait!

thestonefox commented 7 years ago

I've had a bit of a crazy idea. May work, may not...

Currently the canvases get a Collider added to them so they can stop the pointers going through the canvas.

What if two new trigger colliders were added that just listened for the controller entering them and, upon entry, turned on the UI pointer raycast (set it to Always On mode)?

Then the direction of the ray would naturally select the button, as if you had pressed the activation button down.

Then a second collider, closer to the button's z level, would listen for the collision of the controller, and that collision would act the same as pressing the UI Click button (in fact the second collider could be the existing collider).

So something like this:

[image]
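Something loosely along these lines, with the wiring to the actual UI pointer left to the dev (nothing here is existing VRTK API; the tag and event hookups are assumptions):

```csharp
using UnityEngine;
using UnityEngine.Events;

// Loose sketch of the two-collider idea: the outer trigger zone turns the
// UI pointer raycast on/off, the inner zone acts like the UI Click button.
// What to hook the events up to is left to the dev.
public class CanvasActivationZone : MonoBehaviour
{
    public string controllerTag = "Controller";                   // hypothetical tag on the controller collider
    public UnityEvent OnControllerEnter = new UnityEvent();       // e.g. enable the UI pointer / simulate click
    public UnityEvent OnControllerExit = new UnityEvent();        // e.g. disable the UI pointer / release click

    private void OnTriggerEnter(Collider other)
    {
        if (other.CompareTag(controllerTag))
        {
            OnControllerEnter.Invoke();
        }
    }

    private void OnTriggerExit(Collider other)
    {
        if (other.CompareTag(controllerTag))
        {
            OnControllerExit.Invoke();
        }
    }
}
```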

wildstyle007 commented 7 years ago

Interesting idea @thestonefox -- definitely worth a try! Do you think this could work with more complex interaction on UI components, like sliders for example?

Also perhaps worth noting - I've had some pretty weird behaviors / quirks with object colliders and UI recently... so I'm not sure if adding a stack of them would cause Unity to go mental... guess we'll have to try and see. :)

thestonefox commented 7 years ago

I'm going to change how the UI canvases register for the UI Pointer as well.

At the moment the UI Pointer searches for all valid UI canvases and converts them. This is really limiting, as all UI canvases end up with the same options that are set on the UI Pointer.

I'm going to have it so you have to apply a new VRTK_UICanvas script to any canvas that you want to interact with; this will then set up the canvas as is done already.

This way you'll only need to add the VRTK_UICanvas script to a canvas to make it compatible (at runtime too), and to ignore it, you'll just remove the VRTK_UICanvas and it will no longer be a valid canvas.

This also means that you can choose which canvases are activated by collision, and the distance to turn on the pointer can be different per canvas.

It also means the Ignore Canvas with Tag or Class option can be removed, because you're explicitly saying which canvases you want to be on or off.
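Toggling a canvas at runtime under that approach might look something like this sketch (it assumes the VRTK_UICanvas script described above; whether it sits in a namespace, and exactly what it does on add/remove, depends on the PR):

```csharp
using UnityEngine;
// If VRTK_UICanvas ends up inside a namespace, add the matching using directive here.

// Sketch: add VRTK_UICanvas to make a canvas interactable, remove it to ignore
// the canvas again.
public class CanvasToggler : MonoBehaviour
{
    public Canvas targetCanvas;

    public void MakeInteractable()
    {
        if (targetCanvas.GetComponent<VRTK_UICanvas>() == null)
        {
            targetCanvas.gameObject.AddComponent<VRTK_UICanvas>();
        }
    }

    public void MakeIgnored()
    {
        VRTK_UICanvas uiCanvas = targetCanvas.GetComponent<VRTK_UICanvas>();
        if (uiCanvas != null)
        {
            Destroy(uiCanvas);
        }
    }
}
```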

getwilde commented 7 years ago

This sounds great.

A couple of months ago, I ran into issues where a canvas wasn't reacting to UI_Pointer. Turned out to be because it wasn't activated at runtime. So I had to manually register it with UI Pointer. But then there was a timing issue where it couldn't be done on Start and had to be done in a coroutine.

Anyway... it did work, but this sounds much cleaner. :)
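Roughly what that workaround looked like (RegisterCanvas here is just a placeholder for the manual registration call, not a real VRTK method; the point is the one-frame delay):

```csharp
using System.Collections;
using UnityEngine;

// Sketch of the timing workaround: wait a frame before registering a canvas
// that wasn't active at startup.
public class LateCanvasRegistration : MonoBehaviour
{
    public Canvas canvas;

    private IEnumerator Start()
    {
        // Registering directly in Start() was too early, so yield one frame first.
        yield return null;
        RegisterCanvas(canvas);
    }

    private void RegisterCanvas(Canvas target)
    {
        // Placeholder: call the UI Pointer's manual canvas registration here.
        Debug.Log("Registering canvas: " + target.name);
    }
}
```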

thestonefox commented 7 years ago

Yeah it's a much better idea. I've implemented it now, I'm just going to push it up on the PR with all the UI stuff.

thestonefox commented 7 years ago

Updated PR with new UICanvas script

https://github.com/thestonefox/VRTK/pull/680

getwilde commented 7 years ago

On a related note... there was also an issue where that script added a collider to the canvas, but it was way too thick. (My canvas scale was 1,1,1 so a 10-unit deep collider was enormous). I got around it by adding my own collider so the script didn't have to. Anyway, I wonder if a better approach would be to use one of those "required component" directives and just throw an exception if the developer hadn't added a collider?

Would this be a good approach generally? There have been a few times where I've been surprised by in-game behavior and had to step through code to discover that components were being added automatically. I dunno... I can see pros and cons to both approaches.

thestonefox commented 7 years ago

Yeah, using [RequireComponent] can work better. The problem with the collider is that it needs specific positioning, and the way it works now is basically: if you haven't added your own, it adds one and auto-positions it for you.

If you use RequireComponent you can't set defaults that the dev can then override.

Which means you'd set a default in the script that would always override the dev's settings, because you don't know whether the collider was added automatically via RequireComponent or configured by the dev.
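To illustrate the trade-off (both snippets are just sketches, not the actual VRTK_UICanvas code; in a real project each class would live in its own file):

```csharp
using UnityEngine;

// Option A: [RequireComponent] guarantees a BoxCollider exists, but the script
// can't tell whether it was auto-added or hand-configured by the dev.
[RequireComponent(typeof(BoxCollider))]
public class CanvasColliderRequired : MonoBehaviour
{
    private void Awake()
    {
        // GetComponent<BoxCollider>() will always succeed here, but there's no way
        // to tell whether it was auto-added or hand-configured by the dev, so any
        // defaults applied here risk stomping the dev's values.
    }
}

// Option B: only add (and auto-configure) a collider when the dev hasn't
// provided one, which is roughly what the existing behaviour does.
public class CanvasColliderOptional : MonoBehaviour
{
    private void Awake()
    {
        if (GetComponent<BoxCollider>() == null)
        {
            BoxCollider addedCollider = gameObject.AddComponent<BoxCollider>();
            addedCollider.isTrigger = true;
            addedCollider.size = new Vector3(1f, 1f, 0.01f); // thin default depth
        }
    }
}
```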

getwilde commented 7 years ago

Yes, good points re Required Component versus AddComponent. Thanks.

So I've been testing this PR. A few things I've hit:

  1. When using a custom controller model with the VRTK_UIPointer component, there's a NullReferenceException thrown at SDK_SteamVR.cs line 216. To get around it, I commented out VRTK_UIPointer.cs, line 300. Not sure if that's bad?
  2. To orient the in-game hand to match my own in real life, the raycast angle is not ideal. I consistently miss the target. Here's a drawing. (Incidentally, I had two epiphanies while drawing this: 1) why Valve chose that angle for the controller head, and why gun makers use that same angle; 2) why numerous VR experiences have a ray that runs parallel to the index finger, not the thumb... it's an "extended use, wrist comfort" usability convention. And can you imagine how messed up the hand model would have to look to point at the red angle? haha) [image]
  3. For horizontal surfaces above waist level, users may expect to use the face of their index finger to press. So a capsulecast might make even more sense. [images]

thestonefox commented 7 years ago

What line did you comment out? (Line 300 doesn't exist for me anymore.)

Also, you can now add a custom transform to the UI pointer (and world pointers, e.g. the simple pointer) with which you can determine the position and rotation of the beams coming from the controller.

I probably need to try a capsule cast, but my feeling is it will be very chunky and probably cause many mis-presses.

My current thinking is some way of rotating that custom transform in real time to point in the direction you care about (you can do it; it's just a question of how you do it generically so it suits all use cases), but if you wanted to program it yourself then you could.
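A rough sketch of programming it yourself, i.e. keeping a pointer-origin child transform aligned with the index finger (the angle and names here are assumptions; hooking the transform up as the pointer's custom origin is left out):

```csharp
using UnityEngine;

// Sketch: rotate a custom pointer-origin transform at runtime so the beam runs
// along the index finger rather than out of the controller head.
public class FingerAlignedPointerOrigin : MonoBehaviour
{
    public Transform pointerOrigin;                               // child object the pointer uses as its origin
    public Vector3 localEulerOffset = new Vector3(60f, 0f, 0f);  // tilt to match the finger (example value)

    private void LateUpdate()
    {
        if (pointerOrigin != null)
        {
            // Re-apply the offset every frame in case the hand pose animates.
            pointerOrigin.localRotation = Quaternion.Euler(localEulerOffset);
        }
    }
}
```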

getwilde commented 7 years ago

Here's what I commented out: controllerRenderModel = VRTK_SDK_Bridge.GetControllerRenderModel(controller.gameObject);

Re CapsuleCast: I'm not too concerned about mis-presses, but that's because I was very worried about mis-presses (haha) and have chosen to require a trigger pull to Click. (For the same reason, my controllers pass through InteractableObjects, and only Grab or Use on trigger pull.) But yes, I can definitely understand the concern for devs who don't take that approach.

thestonefox commented 7 years ago

I'm guessing the reason that line failed is that controller isn't a thing and therefore can't have a gameObject.

Strange, it shouldn't fail unless you don't have a ControllerEvents script on the same controller that the UI Pointer is on.

getwilde commented 7 years ago

You can replicate the issue in Scene 32. Just add an EventSystem to your hierarchy, and add a VRTK_UIPointer to Controller (right).

BTW, that custom transform on the raycast is slick.

getwilde commented 7 years ago

I have this working. Overall it's really cool, and feels more immersive.

A couple other observations. I don't know how important they are.

  1. It would be nice if UIPointer exposed a Click event so that it's easy to have a "Poke" animation.
  2. Right now, hand animations begin playing at the same time as elements highlight. As a result, the user feels like his hands are slightly sluggish. It's a weird sensation. I wonder if there's a way around it? My thought was to introduce a HighlightDelay setting on UI_Pointer and InteractableObject, so that the user's hands appear to "predict" that they're about to hit an object (even though they just did)... and start to move into the Ready pose just slightly before the GUI/object responds?

Not sure if there are any other use cases for either of these items but I wanted to throw them out there.

Really nice work, @thestonefox.

thestonefox commented 7 years ago
  1. Yeah, perhaps additional events thrown by UI Pointer could help.
  2. You can't really do a highlight delay because it's the Unity UI element highlighting based on what Unity does. However, I could probably do an enable-raycast delay that turns the UI raycast on a set time after you collide with the front raycast activator trigger; that "may" work? (Rough sketch below.)
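Something like this, assuming the front activator is a trigger collider and leaving the actual "turn the UI raycast on" call as an event to be wired up (none of these names are existing VRTK API):

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.Events;

// Sketch of the "enable raycast delay" idea: when the controller hits the front
// activator trigger, wait a short, configurable time before turning the UI
// raycast on, so the hand's "ready" animation can lead slightly.
public class DelayedRaycastActivator : MonoBehaviour
{
    public string controllerTag = "Controller";                  // hypothetical tag
    public float enableDelay = 0.1f;                              // seconds
    public UnityEvent OnRaycastEnable = new UnityEvent();         // stand-in for enabling the UI pointer

    private Coroutine pending;

    private void OnTriggerEnter(Collider other)
    {
        if (other.CompareTag(controllerTag) && pending == null)
        {
            pending = StartCoroutine(EnableAfterDelay());
        }
    }

    private void OnTriggerExit(Collider other)
    {
        if (other.CompareTag(controllerTag) && pending != null)
        {
            StopCoroutine(pending);
            pending = null;
        }
    }

    private IEnumerator EnableAfterDelay()
    {
        yield return new WaitForSeconds(enableDelay);
        OnRaycastEnable.Invoke();
        pending = null;
    }
}
```
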
getwilde commented 7 years ago

Item 1 would be great. Item 2 went over my head but that's because I haven't figured out how you implemented this yet. You mean, it's not just pure magic?

Another gotcha I discovered today: Sometimes InteractableObjects need a Poke pose (ie a 3D mesh button, or small object you want to Use but not grab). Likewise, GUI elements need a Grab pose (ie an image to be dragged and dropped). Suggestions? Maybe a small script on objects that essentially says "I'm an InteractableObject but poke-able" or "I'm a GUI element but grabbable"?

UPDATE: Maybe I just check for (IsUsable==true && IsGrabbable==false)? And leverage that new VRTK_UIDraggableItem you created.

UPDATE 2: On InteractableObjects, I suppose it's not safe to assume IsUsable will always equate to a poke. So perhaps a "InteractableObjectPokeableItem" script is best. (And there's probably a better term than Poke and Pokeable, haha.)

thestonefox commented 7 years ago

You could find out if a UI element is draggable because the on-pointer-enter event tells you the element you've entered; check that game object for a draggable component.

Perhaps that info could also be put into the event payload
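Roughly like this, with the event and payload names being my best guess at what the PR exposes (treat them as assumptions and check the actual scripts):

```csharp
using UnityEngine;
using VRTK; // assumes the VRTK_UIPointer / VRTK_UIDraggableItem scripts from this PR

// Sketch: when the UI pointer reports which element it has entered, check that
// GameObject for a draggable component and switch the hand pose accordingly.
public class PoseSelector : MonoBehaviour
{
    public VRTK_UIPointer uiPointer;

    private void OnEnable()
    {
        uiPointer.UIPointerElementEnter += OnElementEnter;
    }

    private void OnDisable()
    {
        uiPointer.UIPointerElementEnter -= OnElementEnter;
    }

    private void OnElementEnter(object sender, UIPointerEventArgs e)
    {
        bool draggable = e.currentTarget != null &&
                         e.currentTarget.GetComponent<VRTK_UIDraggableItem>() != null;
        // Swap to a "grab" pose for draggable elements, "poke" otherwise.
        Debug.Log(draggable ? "Use grab pose" : "Use poke pose");
    }
}
```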

getwilde commented 7 years ago

I mostly have this working now. As mentioned in Slack channel, it would be super-helpful if UIPointer exposed a Click event, which custom controllers could listen for in order to fire "Poke" or "Grab" animations as appropriate. To support both 'Grab' and 'Release' animations, it would be ideal if UIPointer also exposed ClickDown and ClickUp events (which happen regardless of ClickMethods enum value).

Thanks for all your work on this @thestonefox.

getwilde commented 7 years ago

I've been through this. Tried to test most everything. I think it's good?

One gotcha that devs might hit with GUI elements:

If a dev wants to do both, it will require extra work: listening to AliasUIClickOn (or Off), determining if a UI element was beneath it, etc.

Maybe devs can use Draggable instead, which does offer Start and End events. Or maybe UIClickDown and UIClickUp events will be exposed as part of enhancement 686.

At any rate, good work @thestonefox. 👍

thestonefox commented 7 years ago

@getwilde Would you say the PR is good enough to merge now?

getwilde commented 7 years ago

Yep, I think so!

nasirrehan commented 6 years ago

Wow, a bit late (2 yrs) to the party; just getting on board with the VR development effort. I'd like to be able to touch, press and interact with GUI elements (sliders, scrollbars, etc). Example 34 in the VRTK kit just showcases the UI Pointer in action, but not the scenario you guys discussed in this thread. I was able to get the button working by throwing in the interactable object script and adding a collider to the button, but I've had no luck getting the slider/scrollbar to work. I'd appreciate any help with a working sample. Thanks

bddckr commented 6 years ago

@nasirrehan Discussions like these are better in Slack because you can instantly get answers instead of waiting on here. It's also way easier for troubleshooting in general. GitHub issues are only useful for proper bug reports with steps to reproduce in an example scene like the Issue Template requires.

Thanks!