microsoft / MixedRealityToolkit-Unity

This repository is for the legacy Mixed Reality Toolkit (MRTK) v2. For the latest version of the MRTK please visit https://github.com/MixedRealityToolkit/MixedRealityToolkit-Unity
https://aka.ms/mrtkdocs
MIT License

AirTap and Manipulations and Shaking #499

Closed: mgraszek closed this issue 7 years ago

mgraszek commented 7 years ago

Hello

I want to use this Toolkit for my Unity project, but I just upgraded everything to Unity 5.5, and some of the changes in the code conflict with my previous experience.

1) How do I do an AirTap? Before, I could create a GestureRecognizer and just add a listener to it. Right now I can only attach triggers to a GameObject, so as a result I cannot tap "nothing".

2) How do I do "Manipulations"? For example, with Kinect I could zoom in and out using both hands (not tapping, just recognizing the pose and the hand movement). Is something similar possible with the current Toolkit?

3) New question: what about the huge FPS drop when looking at an object from a very close distance (20 cm or less)? I get results similar to this thread: https://forums.hololens.com/discussion/3045/drop-framerate-ridiculously-unity. I was able to reduce the shaking a little by switching to the Mobile shaders (except Standard), but that still doesn't fix the problem.

Thanks in advance for your answers.

StephenHodgson commented 7 years ago

A. 1 & 2: Please be sure to review the Input Documentation and take a look at some of the test scenes like InputTapTest or InputManagerTest.

Essentially, any MonoBehaviour you place on a GameObject you'd like to interact with needs to implement one of the input interfaces.
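
For illustration, a minimal sketch of such a component, assuming the HoloToolkit.Unity.InputModule IFocusable interface of this era (method names may vary slightly between toolkit versions):

```csharp
// Minimal sketch: a focus-aware component built on one of the toolkit's input interfaces.
using UnityEngine;
using HoloToolkit.Unity.InputModule;

public class FocusHighlight : MonoBehaviour, IFocusable
{
    private Renderer cachedRenderer;

    private void Awake()
    {
        cachedRenderer = GetComponent<Renderer>();
    }

    // Called by the InputManager when the gaze cursor lands on this object.
    public void OnFocusEnter()
    {
        cachedRenderer.material.color = Color.yellow;
    }

    // Called when the gaze cursor leaves this object.
    public void OnFocusExit()
    {
        cachedRenderer.material.color = Color.white;
    }
}
```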

A. 3: The issue you're talking about comes from the fill rate.

There's a great thread discussing this topic in the forum:

On the HoloLens, fillrate represents the greatest danger to performance. Since the HoloLens has to render everything twice (once per lens), it's a good practice to keep the number of holograms in view to a minimum. Make sure that you're not trying to render every single pixel on the screen. If the user gets too close to a hologram (where it starts to fill the screen), then fade it out before this can happen. Post-camera effects (a lot of the 'standard assets' in Unity) will also tank the system, so avoid this.

Remember we're developing for a mobile device.
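
As a rough illustration of the fade-out advice above, here's a minimal sketch; the distance thresholds are illustrative values of mine, and it assumes the renderer uses a transparent shader with a color property:

```csharp
// Minimal sketch: fade a hologram out as the camera gets too close to it,
// so it never fills the screen. Thresholds are illustrative, not toolkit defaults.
using UnityEngine;

public class ProximityFade : MonoBehaviour
{
    [SerializeField] private float fadeStartDistance = 1.0f; // start fading below this distance
    [SerializeField] private float fadeEndDistance = 0.5f;   // fully transparent at this distance

    private Material cachedMaterial;

    private void Awake()
    {
        cachedMaterial = GetComponent<Renderer>().material;
    }

    private void Update()
    {
        float distance = Vector3.Distance(Camera.main.transform.position, transform.position);

        // 1 at fadeStartDistance or farther, 0 at fadeEndDistance or closer.
        float alpha = Mathf.InverseLerp(fadeEndDistance, fadeStartDistance, distance);

        Color color = cachedMaterial.color;
        color.a = alpha;
        cachedMaterial.color = color;
    }
}
```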

mgraszek commented 7 years ago

Hi, thanks for the answers.

About 1): I've read the documentation many times since it changed, and the only ways I've found to handle an AirTap are to put an invisible object in front of me (far away) and "tap" it, or to add a method inside a library class (InputManager, or something similarly named, I don't remember exactly) and listen for when the application recognizes a tap.

About 2): I didn't understand how IManipulationHandler works before. Now I know :)

About 3): I understand the issue, and the same content works fine on mobile devices (as a normal Android APK). I ran a lot of tests with the same 3D model and got the results below:

I know the documentation says we should stay 1 m or more away from the object, but that's not always what we need in reality. What's the best practice to avoid shaking? Is it possible the shaking is caused by post-camera effects?

StephenHodgson commented 7 years ago

1) All you need for a tappable object is a collider and a custom MonoBehaviour attached to it that implements IInputClickHandler, IFocusable, and/or IInputHandler (see the sketch below).

2) Awesome! What you've learned there should apply to the above as well.

3) Really the only way to avoid judder and shaking is to keep the frame rate at 60 fps and the graphics workload low: https://developer.microsoft.com/en-us/windows/holographic/performance_recommendations_for_unity
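
Here's a minimal sketch of item 1, assuming the HoloToolkit.Unity.InputModule interfaces of this era (the exact event-data parameter types may differ between toolkit versions):

```csharp
// Minimal sketch of a tappable hologram. The GameObject also needs a Collider
// so the gaze raycast can hit it. Interface and parameter names follow the
// HoloToolkit.Unity.InputModule of this era and may differ between versions.
using UnityEngine;
using HoloToolkit.Unity.InputModule;

public class TappableObject : MonoBehaviour, IInputClickHandler, IFocusable
{
    private bool hasFocus;

    // Called by the InputManager when the gaze cursor lands on this object.
    public void OnFocusEnter()
    {
        hasFocus = true;
    }

    // Called when the gaze cursor leaves this object.
    public void OnFocusExit()
    {
        hasFocus = false;
    }

    // Fired when the user air-taps while this object has focus.
    public void OnInputClicked(InputEventData eventData)
    {
        Debug.Log(gameObject.name + " was tapped (had focus: " + hasFocus + ")");
    }
}
```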

On a side note, the problem could also be the rate at which we calculate the new object positions, rather than the rendering of the object itself. Maybe our values aren't being updated at a high enough frequency. I wouldn't know where to look for that, though.

mgraszek commented 7 years ago

About 1): It looks like you misunderstood what I wanted to achieve. The thing is, I want to tap the AIR. Air has no collider, no MonoBehaviour, nothing. I add the cursor, the managers, and the camera, and I want to recognize the user tapping.

About 3): In my opinion the main problem is the number of cameras. Every other device I know of has the same number of cameras as views (for example, Vuforia has a camera for the left eye and one for the right eye, and it works fine on AR devices). The HoloToolkit overrides any other cameras in the scene and provides only one camera, even though it is rendered to two views (left and right eye). When we get too close to an object, one camera cannot provide correct data for two views, and that's why it shakes. It's similar to your own eyes: when you close one of them, you see your nose getting in the way and covering a little of the world, but with both eyes open you don't see your nose covering your view. I hope someone understands what I mean :)

StephenHodgson commented 7 years ago

1) My apologies, and thanks for the clarification. In that case I think you'll just need to create a class that subscribes to the TappedEvent of the GestureRecognizer. Take a look at GestureInput.cs for an example.

This will fire whenever there's any tap.
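
For illustration, a minimal sketch of such a global tap listener, using the Unity 5.5-era UnityEngine.VR.WSA.Input API (later Unity versions moved this to UnityEngine.XR.WSA.Input and replaced TappedEvent with Tapped):

```csharp
// Minimal sketch: listen for air-taps globally, with no collider or gazed-at object required.
using UnityEngine;
using UnityEngine.VR.WSA.Input;

public class GlobalTapListener : MonoBehaviour
{
    private GestureRecognizer recognizer;

    private void Start()
    {
        recognizer = new GestureRecognizer();
        recognizer.SetRecognizableGestures(GestureSettings.Tap);
        recognizer.TappedEvent += OnTapped;
        recognizer.StartCapturingGestures();
    }

    // Fires for every air-tap, regardless of what (if anything) the user is gazing at.
    private void OnTapped(InteractionSourceKind source, int tapCount, Ray headRay)
    {
        Debug.Log("Air-tap detected.");
    }

    private void OnDestroy()
    {
        recognizer.StopCapturingGestures();
        recognizer.TappedEvent -= OnTapped;
        recognizer.Dispose();
    }
}
```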

mgraszek commented 7 years ago

Yes, you are right. I was searching for it today and found it in a HoloLens tutorial (Fitbox), which had a good example of how to use it. Thanks for all the help.