Open baemoth opened 8 years ago
Are you talking about gestures? If so, @drifio has been working on something like this. Maybe he'll do a PR with some of his stuff? ;)
That's exactly it. One can hope; I wouldn't even begin to know how to go about doing this, but I can imagine a billion scenarios where it would be useful.
Something like this: https://www.youtube.com/watch?v=_7Oof3IHGzc ?
I'm not sure how easy it would be to cleanly integrate my system with the toolkit, but I'll certainly look into it. If I were to do it, I'd like to implement it in such a way that the gesture algorithm is interchangeable; at the moment mine is very hard-coded with one particular algorithm in mind.
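To make the "interchangeable algorithm" idea concrete, here is a minimal language-agnostic sketch (in Python rather than Unity C#, purely for brevity). Everything here is hypothetical: the class names, the `recognize` signature, and the naive dominant-axis swipe classifier are illustrative assumptions, not VRTK or Unity API.

```python
from abc import ABC, abstractmethod
from typing import List, Optional, Tuple

# A sampled controller position (x, y, z) in world space -- an assumption
# about what the input data would look like, not VRTK's actual types.
Point = Tuple[float, float, float]

class GestureRecognizer(ABC):
    """Interchangeable algorithm: feed it a path of controller
    positions, get back a gesture name (or None if nothing matched)."""

    @abstractmethod
    def recognize(self, path: List[Point]) -> Optional[str]:
        ...

class SwipeRecognizer(GestureRecognizer):
    """Deliberately naive example algorithm: classify by the dominant
    axis of net movement. A $1-recognizer or DTW-based matcher could be
    dropped in behind the same interface."""

    def __init__(self, min_distance: float = 0.2):
        self.min_distance = min_distance  # metres of travel before it counts

    def recognize(self, path: List[Point]) -> Optional[str]:
        if len(path) < 2:
            return None
        dx = path[-1][0] - path[0][0]
        dy = path[-1][1] - path[0][1]
        if max(abs(dx), abs(dy)) < self.min_distance:
            return None  # too small to be deliberate
        if abs(dy) >= abs(dx):
            return "swipe_up" if dy > 0 else "swipe_down"
        return "swipe_right" if dx > 0 else "swipe_left"

class GestureSource:
    """The toolkit-facing side: owns a recognizer, but any
    GestureRecognizer implementation can be swapped in."""

    def __init__(self, recognizer: GestureRecognizer):
        self.recognizer = recognizer

    def end_gesture(self, path: List[Point]) -> Optional[str]:
        return self.recognizer.recognize(path)
```

The point of the split is that `GestureSource` (the part wired into the toolkit) never changes when the recognition algorithm does.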
Very much so, but if abstracted enough, it could be used to call UI, start particle effects, summon game objects, etc. Doing this would allow for anything you could imagine, and would be a springboard (platform? framework?) that is very easy to extend for your own purposes.
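The "abstracted enough to do anything" part is essentially a gesture-to-action registry. A minimal sketch, again in Python and entirely hypothetical (the names `GestureActionMap`, `bind`, and `dispatch` are made up for illustration, not toolkit API):

```python
from typing import Callable, Dict, List

class GestureActionMap:
    """Bind recognized gesture names to arbitrary callbacks, so the same
    recognizer can open UI, spawn particles, summon objects, etc."""

    def __init__(self) -> None:
        self._actions: Dict[str, List[Callable[[], None]]] = {}

    def bind(self, gesture: str, action: Callable[[], None]) -> None:
        """Register a callback to run whenever `gesture` is recognized."""
        self._actions.setdefault(gesture, []).append(action)

    def dispatch(self, gesture: str) -> int:
        """Fire every action bound to `gesture`; returns how many ran."""
        actions = self._actions.get(gesture, [])
        for action in actions:
            action()
        return len(actions)
```

Usage would be along the lines of `action_map.bind("swipe_up", open_inventory)`, with the recognizer's output fed into `dispatch`. In Unity this role would more idiomatically be played by `UnityEvent`s exposed in the inspector, but the shape of the abstraction is the same.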
I don't see this going anywhere in VRTK core right now, close?
I would also suggest closing this at this point, since it's a very old issue and has not received much activity for a long time, so it does not seem to be a widely demanded feature. It's possible that a future Unity XRTK/labs team will provide some kind of gesture support out of the box.
Let's leave gestures open for now, because people still request this and it would be "nice" if someone picked it up.
Does anybody have any gestures built (swiping up, down, ...)?
VR is a very physical medium. I was thinking about the best way to go about integrating hand movements for actions. phroot's game 5089 has something like this: it's used to open the inventory, I think, and the left hand path does this for magic casting. There are, however, a lot more use cases for this (Minority Report-style interfaces, brawler-style movement combos, etc). Is this in the scope of the toolkit, and how could it be simplified in both code and approach to make a good example?