AvaloniaUI / Avalonia

Develop Desktop, Embedded, Mobile and WebAssembly apps with C# and XAML. The most popular .NET UI client technology
https://avaloniaui.net
MIT License

Gesture recognizer doesn't work #14261

Closed martinrhan closed 9 months ago

martinrhan commented 9 months ago

Describe the bug

Gesture recognizer doesn't work

To Reproduce

Steps to reproduce the behavior:

  1. Create a new Avalonia desktop application
  2. Add the following code to the constructor of MainView:
    this.GestureRecognizers.Add(new ScrollGestureRecognizer() { CanHorizontallyScroll = true, CanVerticallyScroll = true });
    Debug.WriteLine("1"); // Ensure messages can be shown in the console
    this.AddHandler(
        Gestures.ScrollGestureEvent,
        (object? sender, ScrollGestureEventArgs e) => {
            Debug.WriteLine("2");
        });

Expected behavior

If "1" is shown in the console, "2" should be displayed when I perform a scroll gesture.

Environment

  - Avalonia version: 11.0.2
  - OS: Windows

timunie commented 9 months ago
  1. 11.0.2 is pretty outdated; better to test with 11.0.7, or even the nightly builds.
  2. Check the DevTools (F12) Events panel to see which events are fired.
  3. Gesture is more or less a mobile thing. Seems like you are on Windows, so why not listen to the ScrollViewer.Scroll event or PointerWheelEvent? (A sketch follows below this list.)
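
For illustration, here is a minimal sketch of that suggestion, assuming the MainView class from the report above (PointerWheelChanged is the event Avalonia raises for wheel input, which is how Windows delivers touchpad scrolling):

    using Avalonia.Controls;
    using System.Diagnostics;

    public partial class MainView : UserControl
    {
        public MainView()
        {
            InitializeComponent();

            // Touchpads are reported as mouse devices on Windows, so their
            // two-finger scrolling arrives as pointer-wheel input rather
            // than as a scroll gesture.
            this.PointerWheelChanged += (sender, e) =>
            {
                Debug.WriteLine($"Wheel delta: {e.Delta}");
            };
        }
    }
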
martinrhan commented 9 months ago
> Gesture is more or less a mobile thing. Seems like you are on Windows, so why not listen to the ScrollViewer.Scroll event or PointerWheelEvent?

I want to make it work for touchpad and touchscreen.

maxkatz6 commented 9 months ago

> touchpad

A "touchpad" is not a touch device in Windows; it's a mouse device. That's how Windows reports it.

I am closing this as by-design, since ScrollGestureRecognizer was designed to work with Touch or Pen input as it is reported by the OS.
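
A quick way to see how the OS reports a given device (a hedged sketch using Avalonia's PointerPressed event and the IPointer.Type property) is to log the pointer type on press:

    // In the view's constructor: log how each input device is classified.
    // On Windows a touchpad shows up as Mouse, while a touchscreen or a
    // stylus shows up as Touch or Pen -- only the latter two drive
    // ScrollGestureRecognizer.
    this.PointerPressed += (sender, e) =>
    {
        System.Diagnostics.Debug.WriteLine($"Pointer type: {e.Pointer.Type}");
    };

To cover both the touchpad and the touchscreen, the practical combination is a PointerWheelChanged handler (sketched above) alongside the ScrollGestureRecognizer from the original report.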

damian-666 commented 9 months ago

This is a standards issue, and for now you have to hack in what you need, if you can. It's a long-running problem, and in a unified framework like .NET Core it is apparently intractable.

What's stupid is that you can multitouch the trackpad, but with a stylus you cannot! I don't know how or why they swerved on this, or why they won't sort it out.

https://w3c.github.io/pointerevents/ https://learn.microsoft.com/en-us/windows/win32/wintouch/guide-multi-touch-input?redirectedfrom=MSDN

In MonoGame, which I think uses SDL, you can poll the touch collection on a background thread. I used the github.com/prime31/Nez code as a start and made my own gestures, since I don't deploy the IDE's WPF UI; I deploy only custom 3D views and custom 3D UI, otherwise there's bloat.
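
As a rough illustration of that polling approach (a sketch against MonoGame's TouchPanel API; the background-thread plumbing and the custom recognizer are assumed, not shown):

    using Microsoft.Xna.Framework.Input.Touch;

    // Called from an update loop (or a background thread, as described
    // above): take a snapshot of the current touches and react to presses.
    static void PollTouches()
    {
        TouchCollection touches = TouchPanel.GetState();
        foreach (TouchLocation touch in touches)
        {
            if (touch.State == TouchLocationState.Pressed)
            {
                // A custom gesture recognizer would consume the point here.
                System.Console.WriteLine($"Touch {touch.Id} at {touch.Position}");
            }
        }
    }
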

I would like to put a transparent Avalonia UI over my 3D views, though. My setup is more like Stride, which is halfway through porting its IDE and 3D visualizer to Avalonia, and it might have code or symbiosis worth thinking about.

I do it off a separate UI thread. I use a producer/consumer pattern for any CPU/SIMD-updated visuals, like BepuPhysics2, so the GPU is free to do just rendering and UI, and I avoid any sync; it runs up to 300 fps, even 800 fps (top iPhone touch response). I do everything that doesn't need the GPU from workers spawned in that thread context, copy the visuals to a buffer, and let my UI draw it whenever the monitor or the timers call for a refresh, which is the driver's update rate. To do this I have to set the timer resolution: the timers and sync are 16 ms by default, which is a whole frame, and that causes a whole class of problems. ChatGPT can show you how to set the timer resolution to 4 ms if you need to.
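
For the timer-resolution point, a hedged sketch using the classic winmm.dll multimedia-timer calls (note the setting is system-wide, and newer Windows builds may virtualize it per process):

    using System.Runtime.InteropServices;

    static class TimerResolution
    {
        [DllImport("winmm.dll")]
        private static extern uint timeBeginPeriod(uint uMilliseconds);

        [DllImport("winmm.dll")]
        private static extern uint timeEndPeriod(uint uMilliseconds);

        // Request a ~4 ms tick instead of the default ~15.6 ms. Always pair
        // with Restore(), since the finer tick costs power system-wide.
        public static void Use4Ms() => timeBeginPeriod(4);
        public static void Restore() => timeEndPeriod(4);
    }
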

It gets worse: Microsoft's own Surface Laptop Studio, or the Surface line generally, can be operated without a trackpad or mouse. The problem is that the virtual trackpad, keyboard, and mouse lag and take up screen space. There is a nice stylus, but this pen works more like touch, not like a mouse, and it should work like a mouse: it's more precise and less stressful than a mouse or trackpad (and those are covered up or unplugged).

Since they regressed it, I can't advocate a Surface for multiplatform development; otherwise it would be ideal. You can develop for phones by folding the laptop into a tablet: with multitouch on the screen, you can prototype for mobile on the desktop version.

I and others have been battling them to address it for months; if you google it, there are some 3000 complaints that the STYLUS doesn't act as a MOUSE anymore (a dealbreaker). It looks like a driver issue, and there are no easy workarounds or add-on hacks.

Surface and each app has to fix it, ever since the "Creators Update" in Windows. Now VS Code and Visual Studio no longer support the pen for select and drag-and-drop; both recently regressed, and it does nothing. On Windows with DirectX, if you didn't implement, say, pinch zoom yourself, in WPF it used to come through as mouse-wheel messages.

Tons of UI researchers back this.

This might represent an opportunity for Avalonia UI to "show them how it's done".

It's really thorny, and it seems stupid to me that each app has to do this, as well as voice access and UI indexing. I'm in the Windows Insider program and it's a bloated mess. Through the Semantic Kernel people, I'm trying to influence the UI product designers who push the devs into making this mess (voice control as if through the keyboard and mouse???), and the Phi small domain-specific language model, to standardize some implementation of a unified UI based on Avalonia/WPF declarative markup.

This is getting a bit meta, but I'm on the Windows Insider builds (don't install them; they're so bloated and you can't shut features off).

I could move this part to a discussion along with the 3D buffer and tabbed-dialog issues, but I'm still in a holding pattern: the AI/NLP stuff is moving too fast and the UI stuff too slow, in general, but we need both, never mind whether you are blind, hands-disabled like Stephen Hawking, or just want to have a life.

Offline Voice Access is really promising (with an ASIO driver, a preamp, and a ~$100 studio mic; the pen is $250 and I'm not going to ink text, and talking to my PC should and could be Star Trek-like), but it's a rage quit for me, and hotwords and the like need a priority listening DSP. I'll maybe raise a proposal if there is an opportunity there for Avalonia to "show how it's done". With Semantic Kernel and the LLMs it could be super intuitive to command and query the UI, but touch and pens will have their use cases. I think MSFT is best positioned to generalize this: they have code feature indexing (same as the Start menu was), really, and they've already spent the billions floating OpenAI.

The accessibility APIs should be first class and the implementation generalized, with NATO letters IMO as hot completion. That is what I proposed to them, and they understood generally what I meant, but it's a mess of experiments, so DON'T run Insider builds!! You can't shut things off. Ideally I should be able to sit at a Mac with this UI skin or manager installed, maybe with a small dedicated mic DSP (some exist, but none are good enough), and speak to it without touching it. I'm not blind, but I'm going blind, and my fingers are numb from trying to save the world, I guess, lol. But really, the best features and ideas are simple and take little time; if you are doing it right, it's less work than making a big compounded mess, which is what's happening now in Windows 11. But it's in active discussion in Feedback, finally with humans in the loop, and hopefully they will do the right thing.