Closed: GKersten closed this issue 7 years ago.
For anyone else searching for this, rotating an object is quite easy with the IManipulationHandler:

```csharp
public void OnManipulationUpdated(ManipulationEventData eventData)
{
    // Invert the rotation direction when the camera is facing the object's front
    // (camera local Y rotation in the 270–360 or 0–90 degree range).
    float multiplier = 1.0f;
    float cameraLocalYRotation = Camera.main.transform.localRotation.eulerAngles.y;
    if (cameraLocalYRotation > 270 || cameraLocalYRotation < 90)
    {
        multiplier = -1.0f;
    }

    // Map vertical hand movement to X rotation and horizontal movement to Y rotation.
    // Speed is a float field on this component.
    var rotation = new Vector3(eventData.CumulativeDelta.y * -multiplier, eventData.CumulativeDelta.x * multiplier);
    transform.Rotate(rotation * Speed, Space.World);
}
```
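Note that a snippet like the one above only covers one callback; in the HoloToolkit updated input system, IManipulationHandler also declares Started/Completed/Canceled methods, so a compilable component needs stubs for those as well. A hedged sketch (the class name and `Speed` value are illustrative; verify the interface method names against your toolkit version):

```csharp
using HoloToolkit.Unity.InputModule;
using UnityEngine;

// Sketch of a complete component implementing IManipulationHandler.
public class RotateWithManipulation : MonoBehaviour, IManipulationHandler
{
    public float Speed = 5.0f; // degrees per meter of hand travel (illustrative)

    public void OnManipulationStarted(ManipulationEventData eventData) { }

    public void OnManipulationUpdated(ManipulationEventData eventData)
    {
        // Horizontal hand movement -> yaw, vertical -> pitch.
        var delta = eventData.CumulativeDelta;
        transform.Rotate(new Vector3(-delta.y, delta.x) * Speed, Space.World);
    }

    public void OnManipulationCompleted(ManipulationEventData eventData) { }

    public void OnManipulationCanceled(ManipulationEventData eventData) { }
}
```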
Still wondering how to interact with it in Unity Editor.
I believe you need to also hold the left shift key to activate your hand while in the editor using the updated input system.
That's correct, I can see the small blue hand indicator moving, and I tried both the left and right hand (with the space bar). But I cannot seem to trigger the events for the ManipulationHandler when clicking, although I can see the cursor animating.
Make sure your scene is set up properly and take a look at HandDraggable.cs.
You might be able to take that class as a base example of hand manipulation.
The scene is set up correctly; proof is that I get it working successfully on the HoloLens itself. The problem only occurs in the Unity Editor.
HandDraggable has its own implementation, using the InputHandler directly, which successfully calls the events when running in the Unity Editor.
I have the same issue and it applies to INavigationHandler as well. I cannot get Unity Editor to work with manipulation and navigation using those interfaces. HandDraggable.cs doesn't use those interfaces.
I understand that HandDraggable doesn't use those interfaces, but I thought it would provide a good example of an implementation that could solve the problem.
@GKersten Would you be willing to share the code so we can take a better look at your complete implementation?
Have you tried setting a break point and stepping through it?
[Update] I tried recreating the above snippet and can confirm that neither INavigationHandler nor IManipulationHandler gets called in the Editor. The main reason is that in InputManager.cs we only register input events that get activated when we're on device.
Repro steps: the InputManagerTest scene does not log the expected debug text from InputTests.cs when manipulating the cube and sphere in the test scene (ensure Log Gestures is enabled in the inspector). I remember this working correctly when the updated input system was merged into main, but I must have been mistaken.
Maybe there's a way for us to pipe in mouse events when we're in the editor to handle these events appropriately? I'd suggest looking in EditorHandsInput.cs to get started.
After more investigation the only two options seem to point to calculating the delta of your hand movement in the editor manually in your custom script or to update the EditorHandsInput to handle the navigation and manipulation events properly. @maxouellet any thoughts on this?
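The first workaround mentioned above (computing the hand-movement delta manually in a custom script) can be sketched roughly like this. The polling call `IInputSource.TryGetPosition` is what HandDraggable relies on, as far as I can tell; the class and field names here are illustrative, so verify the signatures against your HoloToolkit version:

```csharp
using HoloToolkit.Unity.InputModule;
using UnityEngine;

// Sketch: compute the hand delta per frame instead of relying on
// the Manipulation gesture events (which are not faked in the editor).
public class ManualDeltaRotator : MonoBehaviour, IInputHandler
{
    public float Speed = 10.0f;           // illustrative
    private IInputSource currentSource;
    private uint currentSourceId;
    private Vector3 lastHandPosition;
    private bool isDragging;

    public void OnInputDown(InputEventData eventData)
    {
        currentSource = eventData.InputSource;
        currentSourceId = eventData.SourceId;
        if (currentSource.TryGetPosition(currentSourceId, out lastHandPosition))
        {
            isDragging = true;
        }
    }

    public void OnInputUp(InputEventData eventData)
    {
        isDragging = false;
    }

    private void Update()
    {
        if (!isDragging) { return; }

        Vector3 handPosition;
        if (currentSource.TryGetPosition(currentSourceId, out handPosition))
        {
            // Per-frame delta of the (real or simulated) hand position.
            Vector3 delta = handPosition - lastHandPosition;
            transform.Rotate(new Vector3(-delta.y, delta.x) * Speed, Space.World);
            lastHandPosition = handPosition;
        }
    }
}
```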
All the editor input stuff is faked by the EditorHandsInput input source. This input source currently provides fake hand states, positions, and airtaps, but does not implement the Windows gestures such as Navigation and Manipulation.
If we wanted to support them, we'd have to update EditorHandsInput so that it properly fakes them, as @HodgsonSDAS suggested. Or it could be a new EditorGesturesInput whose role it is to fake gestures. The reason I didn't do it initially is because I didn't have the time to write a proper fake implementation for them. It shouldn't be too hard to do though.
FWIW, I have successfully used the manipulation gesture to move and rotate objects in some prototypes. It works relatively well, with the upside that it makes it easier to differentiate between a tap and a tap + drag. The only downside is that the refresh rate on these events is a little lower than the raw hands input: as long as you smooth the input, it should still work fine.
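Since the gesture events arrive at a lower rate than the raw hands input, a simple exponential filter over the incoming delta is one way to do the smoothing mentioned above. This is a sketch of my own; the class name and factor value are arbitrary:

```csharp
using UnityEngine;

// Minimal exponential smoothing for low-rate gesture deltas.
public class DeltaSmoother
{
    private Vector3 smoothed = Vector3.zero;
    private readonly float factor; // 0 = raw input, ~0.9 = heavy smoothing

    public DeltaSmoother(float factor)
    {
        this.factor = factor;
    }

    public Vector3 Smooth(Vector3 rawDelta)
    {
        // Blend the new delta with the running value.
        smoothed = Vector3.Lerp(rawDelta, smoothed, factor);
        return smoothed;
    }
}
```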
I second this.
Another related problem is that HandDraggable has its own implementation. I assigned a fallback ManipulationHandler to my scene, but if I perform a manipulation gesture on a GameObject that also has HandDraggable attached, both events are triggered.
If HandDraggable implemented the proper ManipulationHandler, I think this issue would be fixed. I'll see what I can do, but I can't promise anything right now.
We've internally implemented the manipulation and hold gestures in EditorHandsInput. The code is relatively simple, but I've been pretty busy over the last few weeks, so I'm not sure when I can bring this back... I can try to find some time this week or next, but no guarantees :(
HandDraggable has its own implementation because it relies on the frame-by-frame updated position provided by the polling APIs (in RawInteractionsSourcesInput). You could use the manipulation gesture for this, but it has a few caveats:
@GKersten Can you please elaborate on how you used IManipulationHandler to rotate a GameObject? Thanks. Can I use the same approach to rotate an object using pinch drag?
Before I file a new issue, is this what is also keeping the IHoldHandler events from firing in the editor?
Yeah, Hold is not currently implemented in the EditorHandsInput that is in HoloToolkit.
I'm really swamped right now, so I sadly don't have the time to validate this fix across all of HoloToolkit, but I've attached the EditorHandsInput.cs file that I am using locally. It adds support for hold and manipulation. I haven't checked whether it compiles as-is in the current state of HoloToolkit, but if not, it should be very close to what you need.
If someone wants to take it upon themselves to validate and submit it, feel free to do so :) Hope this helps!
Hey Max, looked this over. It seems to be upset about a missing reference to InputSourceEventArgs.
@StephenHodgson Not surprising, I think that's because that class has now been removed from HoloToolkit following a change by @aalmada. Now, instead of creating that event args yourself, you just call inputManager.Raise*, where the wildcard is replaced by the type of event you want to raise.
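For illustration, the Raise* pattern described here would look roughly like the following inside an editor input source. The method name `RaiseManipulationUpdated` and its parameter list are my guess at the pattern, not verified against the current HoloToolkit source:

```csharp
// Instead of constructing an InputSourceEventArgs yourself, call the
// matching Raise* method on the InputManager singleton (sketch only;
// verify the exact method name and parameters in InputManager.cs).
InputManager.Instance.RaiseManipulationUpdated(
    this,              // the IInputSource raising the event
    sourceId,          // id of the simulated hand
    cumulativeDelta);  // Vector3 delta since the gesture started
```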
Ahh, then that should be easy to fix. Thanks
@GKersten please test out this PR so we know it works well for your purposes.
OnManipulationUpdated only works while focus is on the GameObject; if focus is lost, it just stops working, without even going to OnManipulationCanceled.
@nipundavid You can add yourself as a modal input handler at OnManipulationStarted to get the follow-up events. I think that's what HandDraggable does, and what we've been doing in our various multi-frame interactions
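The modal-handler suggestion can be sketched like this. `PushModalInputHandler`/`PopModalInputHandler` are the HoloToolkit InputManager methods I believe are meant here, but treat this as a sketch and verify against your toolkit version:

```csharp
using HoloToolkit.Unity.InputModule;
using UnityEngine;

// Sketch: keep receiving manipulation events after focus leaves the
// object by registering as a modal input handler for the gesture's
// duration, then unregistering when it ends or is canceled.
public class ModalManipulation : MonoBehaviour, IManipulationHandler
{
    public void OnManipulationStarted(ManipulationEventData eventData)
    {
        InputManager.Instance.PushModalInputHandler(gameObject);
    }

    public void OnManipulationUpdated(ManipulationEventData eventData)
    {
        // ... apply eventData.CumulativeDelta here ...
    }

    public void OnManipulationCompleted(ManipulationEventData eventData)
    {
        InputManager.Instance.PopModalInputHandler();
    }

    public void OnManipulationCanceled(ManipulationEventData eventData)
    {
        InputManager.Instance.PopModalInputHandler();
    }
}
```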
@StephenHodgson I am a little confused about how to test IManipulationHandler in the Unity Editor now. Has the issue been fixed? If so, what are the instructions to set up the Unity Editor for IManipulationHandler?
I am trying to use the IManipulationHandler to rotate a sphere. I'd like to test this in the Unity Editor, but I cannot get it working by clicking, holding, and moving. Am I missing something, or is this not implemented? The methods are called correctly when running on the HoloLens.
Also, did anyone already implement rotation of objects by using the IManipulationHandler?