sebescudie opened this issue 6 years ago
Not sure, but I guess the renderer node marks Windows messages as handled, which means that once it's done with them they won't get routed anywhere else.
In the commit here (and the build), I added a new pin to the renderer: Unhandle Touch Messages
https://ci.appveyor.com/project/mrvux/dx11-vvvv/build/1.2.0.17-alpha/artifacts
Try setting it to true and see if it solves the problem (it should still output the messages and allow further processing).
Hey!
Thanks for answering this. I tested your build, and no matter how I set the pin, touching the Renderer (DX11) outputs nothing on the Gesture node. The weird thing is that if I have a DX11 and an EX9 renderer open at the same time, Gesture outputs something when touching the EX9 one.
Tested with b36 on a Surface Pro.
had a quick look and it appears that: "WM_TOUCH and WM_GESTURE messages are mutually exclusive. If you don't call RegisterTouchWindow, you will receive only WM_GESTURE messages."
from: https://msdn.microsoft.com/en-us/library/windows/desktop/dd693088(v=vs.85).aspx
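To illustrate what that rule means for the renderer window (just a hedged sketch, not from the linked docs): once a window has been registered via RegisterTouchWindow it receives WM_TOUCH and no longer gets WM_GESTURE, and you can check whether a given HWND is currently touch-registered with the Win32 IsTouchWindow call. The `rendererHandle` parameter here is an assumption — it stands for wherever you can grab the DX11 renderer's window handle.

```csharp
using System;
using System.Runtime.InteropServices;

static class TouchCheck
{
    // Win32: returns true if the window was registered with RegisterTouchWindow,
    // i.e. it receives WM_TOUCH and therefore no WM_GESTURE messages.
    [DllImport("user32.dll")]
    [return: MarshalAs(UnmanagedType.Bool)]
    static extern bool IsTouchWindow(IntPtr hWnd, out uint flags);

    // rendererHandle is assumed to be the HWND of the DX11 renderer window.
    public static bool ReceivesTouchOnly(IntPtr rendererHandle)
    {
        uint flags;
        return IsTouchWindow(rendererHandle, out flags);
    }
}
```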
already talked with @mrvux and he'll reuse the new pin to unregister touch events if gestures are required. this will probably land after my pull request #339 and hopefully be in the next dx11 release.
@sebescudie if you need that immediately, call UnregisterTouchWindow in a C# plugin or similar with the handle of the DX11 renderer; that might work as a quick hack.
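A minimal sketch of that hack as a vvvv C# plugin — the node name, pin names, and the way you feed in the renderer's window handle are assumptions, not part of any existing node. Per the MSDN note above, once the window is unregistered it should start receiving WM_GESTURE again, so the Gesture node should get data.

```csharp
using System;
using System.Runtime.InteropServices;
using VVVV.PluginInterfaces.V2;

[PluginInfo(Name = "UnregisterTouch", Category = "Windows",
            Help = "Calls UnregisterTouchWindow on a window handle")]
public class UnregisterTouchNode : IPluginEvaluate
{
    // Window handle of the DX11 renderer, fed in as an integer
    // (assumption: you patch it in from wherever the handle is available).
    [Input("Window Handle")]
    public ISpread<int> FHandle;

    [Input("Unregister", IsBang = true)]
    public ISpread<bool> FUnregister;

    [Output("Success")]
    public ISpread<bool> FSuccess;

    // Win32: stops the window from being a touch window,
    // so WM_GESTURE messages are delivered to it again.
    [DllImport("user32.dll")]
    [return: MarshalAs(UnmanagedType.Bool)]
    static extern bool UnregisterTouchWindow(IntPtr hWnd);

    public void Evaluate(int spreadMax)
    {
        FSuccess.SliceCount = spreadMax;
        for (int i = 0; i < spreadMax; i++)
            FSuccess[i] = FUnregister[i] && UnregisterTouchWindow(new IntPtr(FHandle[i]));
    }
}
```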
Thanks tebjan, I'll give it a try in the next few days!
Since Renderer (DX11) outputs Touch data, it is impossible to use the Gesture node with it (you can't use both at the same time, apparently due to a Windows limitation).
Would it be possible to have a config pin that disables Touch output so that Gestures can be used?
Steps to reproduce: