microsoft / MixedRealityToolkit-Unity

This repository is for the legacy Mixed Reality Toolkit (MRTK) v2. For the latest version of the MRTK please visit https://github.com/MixedRealityToolkit/MixedRealityToolkit-Unity
https://aka.ms/mrtkdocs
MIT License

vNext Proposal: Default Mouse service settings should not capture the mouse in the play-in-editor window #3093

Closed Ecnassianer closed 3 years ago

Ecnassianer commented 5 years ago

Does this affect the legacy HoloToolkit (master) or the Mixed Reality Toolkit (mrtk_release)? Mixed Reality Toolkit

Is your feature request related to a problem? Please describe. I updated my copy of MRTK and now it captures my mouse when I hit play. I'm not developing a mouse input app, why does adding a Mixed Reality package make the mouse stick in my window?

Not having to fight to keep control of my mouse is really like the only thing I like about Unity over Unreal. Let's not bring the one thing I dislike about Unreal into MRTK! XD

Describe the solution you'd like The mouse service should be opt-in. The majority of people will just see this as a hassle, since mouse isn't the default mode of input for nearly all MR apps.

Describe alternatives you've considered

Additional context

Ecnassianer commented 5 years ago

Also, it's too hard to track down where in the profiles this is happening and turn it off.

david-c-kline commented 5 years ago

There is a proposal (not yet added in GitHub) to add a wizard-style configuration menu item. This would allow developers to choose to select specific features or to specify scenarios and have the toolkit configured appropriately.

The current configure option would then contain the most commonly used / key options.

cre8ivepark commented 5 years ago

+1 on this. Had to spend some time figuring out where this mouse cursor was coming from. (screenshot: mrtk-cursor)

StephenHodgson commented 5 years ago

We're a cross platform development tool. It's expected to have mouse support.

StephenHodgson commented 5 years ago

There is a proposal (not yet added in GitHub) to add a wizard-style configuration menu item.

I'm also going to say no to this as well, as we specifically designed it to be a one-click configuration. Now, having different SDKs to import with custom-made profiles tailored to your needs? Yes. That's the route we will take.

keveleigh commented 5 years ago

It's expected to have mouse support.

We can ship support for platforms and features without making them the default.

I'm also going to say no to this as well, as we specifically designed it to be a one click configuration.

No part of these proposals will say we should remove the one click configuration option. However, we shouldn't restrict developers from having options if they choose. We want to enable as many developers as possible, and some developers want more clear control over configuration and setup.

StephenHodgson commented 5 years ago

We're using computers. They need mouse support. End of story.

We want to enable as many developers as possible, and some developers want more clear control over configuration and setup.

1000% agree. That's why giving them options like tailor-made SDKs based on their needs will keep the one-click install without burdening the user in a maze of wizards just to get started. Let's shelve this topic for now and keep focused on the OP thread.

david-c-kline commented 5 years ago

without burdening the user in a maze of wizards just to get started.

The wizards are not intended as a burden. Customers who wish to use one click configuration will continue to have that available. The intent is to extend the toolkit menu to provide customers who prefer a guided path to have that option.

The proposal should be completed soon. Hopefully its complementary design will then become apparent.

david-c-kline commented 5 years ago

We're using computers. They need mouse support.

Mouse support is fantastic. The availability is absolutely critical. What this proposal appears to be saying is that many customers do not require the mouse in their experience and that having focus taken by default is confusing.

Regardless of which approach is taken (mouse in the profile by default, or needing to be added), there is a discoverability challenge. The documentation is going to need to be very explicit on what extensions are defaulted in and how to scope them for the customer's specific goals.

This proposal should remain open while we gather feedback from the beta. Customers and data should guide the decision making process.

david-c-kline commented 5 years ago

keep focused on the OP thread.

OP thread?

StephenHodgson commented 5 years ago

Btw, another reason we need mouse input on by default is for when we utilize the editor's Holographic Emulation. I just added a task. Without mouse input on by default, we won't get this support.

SimonDarksideJ commented 5 years ago

OK, to take a different tangent here, based on the experience we've built with the Toolkit to date.

By "Default", we always ship the toolkit with every service, feature, and capability "ON by Default"; that is the story we've been pitching from day 1. That means all the SDKs we support are on, all inputs are on, all "features" are on by default. So with that in mind, yes, Mouse should be on by default.

Now, we continue the story with the next action a developer should take when building a project: building their own configuration to tailor it to their needs. Again, this is something we started to really drive home when we locked the default profiles (to ensure a developer always has an "always on" default position).

So, exactly as with all the other "on by default" features, we train the user to activate only what they need. I will follow up the quick start with another "how to configure" guide, which should alleviate the concern here. I'll even use the mouse as one of the examples of things to turn on/off (which simply involves removing the mouse Input Manager registered service, as shown below).

(screenshot: removing the mouse input manager service from the profile)
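To illustrate the "turn it off yourself" workflow described above, here is a minimal, hypothetical workaround sketch (the `EditorCursorUnlock` class name is invented and this is not an MRTK API) that keeps the editor cursor free during play by undoing any lock each frame:

```csharp
#if UNITY_EDITOR
using UnityEngine;

// Hypothetical helper: keeps the OS cursor free while playing in the editor,
// even if another system (e.g. a mouse input service) locks it each frame.
public class EditorCursorUnlock : MonoBehaviour
{
    void LateUpdate()
    {
        // LateUpdate runs after other scripts' Update, so this undoes
        // any lock requested earlier in the same frame.
        if (Cursor.lockState != CursorLockMode.None)
        {
            Cursor.lockState = CursorLockMode.None;
            Cursor.visible = true;
        }
    }
}
#endif
```

Note that this defeats delta-based mouse input while it runs, so it is only useful for developers who, like the OP, do not need the mouse pointer in their experience.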

So by this, yes, Mouse should be "on by default", as with everything else, and it's up to the developer to pick and choose the features they want for their project.

Else we go back to a really old discussion and have everything "off by default", with the net result that feature discovery becomes a lot harder and nothing works when they install the project. Which I feel would be the wrong direction.

In short, I think we should close this issue and focus on more and better developer/introductory documentation to help guide developers down the path we want them to take.

cre8ivepark commented 5 years ago

@StephenHodgson I think the mouse in this thread is about the mouse 'cursor', not the input simulation using mouse input (which we definitely need for easier development iteration). Eric already described it well: now there is a mouse cursor in the editor and input is trapped. I need to press ESC to take control back every time I iterate through the headset.

StephenHodgson commented 5 years ago

Ah, I see. Well, if we don't capture the mouse in the editor, then mouse input will not work properly.

Part of the mouse implementation details is that we use the mouse delta movement to understand how the mouse is moving, not its location on the screen, so that we can continue to move the mouse while it's "off screen". In reality the mouse is locked to the center of the window, and the only way to get delta movement is if we perform this action. Likely a limitation of Unity, I'm sure.

Most developers who have been using Unity for many years, like myself, are pretty well aware of this limitation, and yes, I agree that it is somewhat annoying. Would I like to get mouse input without having to go through Unity's input system? Yes! But that also means we would need to import some cross-platform third-party plugin or something (and why do that when Unity takes care of it for us?).
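The delta-based approach described above can be sketched in plain Unity terms (a minimal illustration, not the toolkit's actual implementation; the class name and sensitivity value are assumptions):

```csharp
using UnityEngine;

// Sketch: why cursor locking matters for delta-based mouse input.
// With the cursor locked to the window center, Unity keeps reporting
// per-frame movement through the "Mouse X"/"Mouse Y" axes, so a pointer
// can keep turning even where the OS cursor would have left the window.
public class MouseDeltaPointer : MonoBehaviour
{
    public float sensitivity = 2f;
    Vector3 rotation;

    void Start()
    {
        // Lock the cursor so deltas remain available "off screen".
        // This is the capture behavior being discussed in this thread.
        Cursor.lockState = CursorLockMode.Locked;
    }

    void Update()
    {
        // Read per-frame deltas, not an absolute screen position.
        float dx = Input.GetAxis("Mouse X") * sensitivity;
        float dy = Input.GetAxis("Mouse Y") * sensitivity;
        rotation += new Vector3(-dy, dx, 0f);
        transform.localEulerAngles = rotation;
    }
}
```

Without the `Cursor.lockState` line, the deltas stop the moment the OS cursor exits the game view, which is the trade-off at the heart of this issue.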

cre8ivepark commented 5 years ago

@StephenHodgson I remember that HoloToolkit's in-editor mouse simulation didn't have the mouse-capturing behavior, though.

StephenHodgson commented 5 years ago

Because it wasn't a true mouse input. It was faking the emulation. We want to utilize the Holographic Emulation as much as possible, but we also want to support the need for enabling standalone applications with the toolkit, and falling back to something useful if the person who installs the app doesn't have a VR headset.

We designed the toolkit to let users build their app once, and deploy to any platform without having to reconfigure everything. That means if a developer wants to build an app that is both VR and traditional 2d they can. That way they have a bigger market to sell to without having to rebuild their whole app.

cre8ivepark commented 5 years ago

I guess the main point is, the current experience does not feel the best/optimized for 'mixed reality' developers. I definitely love the spirit of a true cross-platform solution; however, I believe mixed reality (AR/VR) development experiences should be optimized first. If there is a way to bring the simple fake hand emulation experience from HoloToolkit, it would be super helpful both for existing developers migrating from HTK and for new developers.

StephenHodgson commented 5 years ago

I guess the main point is, current experience does not feel the best/optimized for the 'mixed reality' developers.

I agree, it could be better, and that's why we're having this discussion :)

mixed reality (AR/VR) development experiences should be optimized first.

Agreed, but we also need to consider the fact that not everyone will be targeting HL. So if we do bring hand emulation into the picture, that's not something we should prioritize either, which is why I suggested using Unity's built in Holographic Emulation as a way to facilitate that, which brings me to my next point.

If there is a way to bring the simple fake hand emulation experience from HoloToolkit

I'm def not in favor of just copying and pasting things from the HTK. There are lots of things that need to be considered before we just move things over, like the fact that most systems and services no longer derive from Mono.

cre8ivepark commented 5 years ago

Clarifying the two topics that got mixed in this thread.

  1. Hand input simulation with mouse: Just went through #3096. Now I understand: for the simulated hands, leveraging Holographic Emulation makes sense. The problem is, currently it only supports Xbox controller. If we can make it support hand simulation with mouse input, that would be a great option.

  2. Mouse cursor display & input capture issue: For the development iteration experiences for the AR/VR devices, it would be great if we can make this turned off by default.

StephenHodgson commented 5 years ago

the problem is, currently it only supports Xbox controller.

Not sure that's true. Pretty certain it also utilizes mouse input.

StephenHodgson commented 5 years ago

it would be great if we can make this turned off by default.

If we turn it off by default then Holographic Emulation event forwarding will not work properly.

cre8ivepark commented 5 years ago

@StephenHodgson No, the mouse is not supported in Holographic Emulation. That's the biggest reason why I used HTK's in-editor hand input simulation. https://docs.unity3d.com/550/Documentation/Manual/windowsholographic-emulation.html

In Simulate in Editor mode, you need to use a game controller (such as an Xbox 360 or Xbox One controller) to control the virtual human player. If you don’t have a controller, the simulation still works, but you cannot move the virtual human player around.

StephenHodgson commented 5 years ago

If you don’t have a controller, the simulation still works, but you cannot move the virtual human player around.

That last part is key. That's why we need the mouse input by default. So we can forward those events.

chbecker-ms commented 5 years ago

@davidkline-ms - Let's add this to the agenda for shiproom tomorrow.

It sucks if we have to choose absolutely between enabling great mouse support and enabling productivity in the editor. Let's find a middle ground.

@cre8ivepark @Ecnassianer - It would help if you could come with a laptop setup that we could project to the Surface Hub that demonstrates your concerns.

cre8ivepark commented 5 years ago

@StephenHodgson Holographic Emulation works without MRTK. To add mouse input support in Holographic Emulation, it should be done in the Unity side.

StephenHodgson commented 5 years ago

I don't plan to override any plans on Unity's side (and they likely won't add it). But I want to use the flag to know how to handle mouse input if it's enabled.

david-c-kline commented 5 years ago

@davidkline-ms - Let's add this to the agenda for shiproom tomorrow.

We ran out of time in shiproom. We should have a separate call to follow up on this discussion.

david-c-kline commented 5 years ago

not everyone will be targeting HL. So if we do bring hand emulation into the picture, that's not something we should prioritize either, which is why I suggested using Unity's built in Holographic Emulation

Since other platforms can support controller-based gestures, wouldn't building upon Holographic Emulation exclude those platforms? One of the great things about the new MRTK is the cross-platform support. It would be fantastic to extend this type of feature beyond HoloLens.

polar-kev commented 3 years ago

Closing since this is no longer relevant.