dotnet / winforms

Windows Forms is a .NET UI framework for building Windows desktop applications.

Touch Support #2998

Open fahadabdulaziz opened 4 years ago

fahadabdulaziz commented 4 years ago

Is your feature request related to a problem? Please describe.

Lack of touch support.

Describe the solution you'd like

Add support for touch to System.Windows.Form.Control.

As far as I know, there are three events that could be used: Down, Up, and Move.

Describe alternatives you've considered

Will this feature affect UI controls? Yes

Thanks

weltkante commented 4 years ago

Adding some technical details. There are different levels of touch input available in the win32 API, so the first question would be which model to expose as events
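
For reference, the choice between the two models is made per window: by default a window receives the high-level WM_GESTURE messages, and calling RegisterTouchWindow switches it to the low-level WM_TOUCH stream instead. A minimal sketch of that fork (the constants and the RegisterTouchWindow signature follow the documented Win32 API; the TouchAwarePanel control is made up for illustration):

    using System;
    using System.Runtime.InteropServices;
    using System.Windows.Forms;

    public class TouchAwarePanel : Panel
    {
        private const int WM_GESTURE = 0x0119; // high level: delivered by default
        private const int WM_TOUCH = 0x0240;   // low level: delivered only after RegisterTouchWindow

        [DllImport("user32.dll")]
        [return: MarshalAs(UnmanagedType.Bool)]
        private static extern bool RegisterTouchWindow(IntPtr hWnd, uint ulFlags);

        protected override void OnHandleCreated(EventArgs e)
        {
            base.OnHandleCreated(e);

            // Opt into raw touch; without this call the window keeps receiving WM_GESTURE.
            RegisterTouchWindow(Handle, 0);
        }

        protected override void WndProc(ref Message m)
        {
            if (m.Msg == WM_TOUCH)
            {
                // per-contact data via GetTouchInputInfo (see the sample later in this thread)
            }
            else if (m.Msg == WM_GESTURE)
            {
                // pan/zoom/rotate and other gestures via GetGestureInfo
            }

            base.WndProc(ref m);
        }
    }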

fahadabdulaziz commented 4 years ago

It should be possible to support both levels.

Basic touch and built-in gestures have been supported since Windows 7, but I don't know whether the touch APIs have been updated since then.

merriemcgaw commented 4 years ago

Are there any missing basic touch support features? I can think of one request we had: popping up the on-screen keyboard when someone uses touch to enter an edit field. Other than that I think we get most of the day-to-day support we need from Windows.

If we are to take on a bunch of touch work, I think we'll need to see some specific scenario requests, including how an app would benefit from it.

weltkante commented 4 years ago

We often get requests to add swipe gestures or similar to our applications, going beyond the basic Windows behavior. So far we have generally rejected these requests in the WinForms parts of our application because it is simply too much work to program in native Win32 interop.

In general, touch support for custom controls beyond just treating it as mouse input requires going outside WinForms, since no events are exposed; having those events would be great. I would consider this the major use case for the issue.

fahadabdulaziz commented 4 years ago

There is no touch support. WinForms translates touch to click events!

We could benefit from touch gestures, since touch monitors are everywhere now.

RussKie commented 4 years ago

> There is no touch support. WinForms translates touch to click events!

That is how we support touch, so it is incorrect to say we don't have touch support.

It would help if you could give code examples in this issue. Specifically:

- Show real world scenarios that require one/two/five finger swipes, zoom and rotate gestures, and how they would otherwise be handled.
- Code that shows the surface area of the API.

weltkante commented 4 years ago

Not speaking for OP, but our use cases are for adding touch support when implementing custom controls (both UserControls and Control subclasses would benefit) - not for adding additional touch support to existing WinForms controls. The common examples are:

> Show real world scenarios that require one/two/five finger swipes, zoom and rotate gestures, and how they would otherwise be handled.

All our use cases would be satisfied by exposing the high-level WM_GESTURE API; we do not need access to the low-level WM_TOUCH API (see the previous post above for links to the docs and an overview of the APIs).
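
To illustrate how much interop this currently takes, here is a rough sketch of handling WM_GESTURE on a form today; the constants, struct layout, and GetGestureInfo/CloseGestureInfoHandle declarations follow the documented Win32 API, while the GestureForm class itself is made up for the example:

    using System;
    using System.Runtime.InteropServices;
    using System.Windows.Forms;

    public class GestureForm : Form
    {
        private const int WM_GESTURE = 0x0119;

        // Gesture IDs from winuser.h.
        private const uint GID_ZOOM = 3;
        private const uint GID_PAN = 4;
        private const uint GID_ROTATE = 5;

        // Managed mirror of the Win32 GESTUREINFO structure.
        [StructLayout(LayoutKind.Sequential)]
        private struct GESTUREINFO
        {
            public uint cbSize;
            public uint dwFlags;
            public uint dwID;          // which gesture (GID_*)
            public IntPtr hwndTarget;
            public short ptsLocationX; // POINTS ptsLocation expanded inline (screen coordinates)
            public short ptsLocationY;
            public uint dwInstanceID;
            public uint dwSequenceID;
            public ulong ullArguments; // gesture-specific argument (pan distance, zoom factor, rotation angle)
            public uint cbExtraArgs;
        }

        [DllImport("user32.dll")]
        [return: MarshalAs(UnmanagedType.Bool)]
        private static extern bool GetGestureInfo(IntPtr hGestureInfo, ref GESTUREINFO pGestureInfo);

        [DllImport("user32.dll")]
        [return: MarshalAs(UnmanagedType.Bool)]
        private static extern bool CloseGestureInfoHandle(IntPtr hGestureInfo);

        protected override void WndProc(ref Message m)
        {
            if (m.Msg == WM_GESTURE)
            {
                var gi = new GESTUREINFO { cbSize = (uint)Marshal.SizeOf(typeof(GESTUREINFO)) };
                if (GetGestureInfo(m.LParam, ref gi) &&
                    (gi.dwID == GID_PAN || gi.dwID == GID_ZOOM || gi.dwID == GID_ROTATE))
                {
                    // React to the gesture here, then close the handle and report it as handled.
                    // Gestures we don't handle (GID_BEGIN, GID_END, ...) fall through to DefWindowProc.
                    CloseGestureInfoHandle(m.LParam);
                    return;
                }
            }

            base.WndProc(ref m);
        }
    }

Because ullArguments is encoded differently per gesture, every control author has to repeat this decoding; a WinForms wrapper exposing proper gesture events would remove exactly that boilerplate.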

> and how they would otherwise be handled

We either implement them in WPF or fall back to third-party controls that have already implemented the necessary interop. In some cases it also means we have to reject customer requests for touch support; implementing touch-enabled custom controls ourselves in WinForms is currently too much work to be realistic for us.

Altogether, touch support in WinForms is not a high priority for us, but it would certainly be "nice to have".

> Code that shows the surface area of the API.

If OP doesn't have any specific scenario or doesn't desire to be involved in the API design process I could start outlining an API suggestion of how to expose the Win32 touch events. It'll probably be a longer process to design this API (you'll probably want to do some studies like you did with the other API suggestions), so if it were up to me I'd classify it as 'future' and not for 5.0.

RussKie commented 4 years ago

> If OP doesn't have any specific scenario or doesn't desire to be involved in the API design process I could start outlining an API suggestion of how to expose the Win32 touch events.

This would be awesome, thank you!

fahadabdulaziz commented 4 years ago

Thank you @weltkante, and please feel free to take this suggestion forward, and I'll try to help :-)

I have this implementation in some of our projects:

    using System;
    using System.Diagnostics;
    using System.Drawing;
    using System.Runtime.InteropServices;
    using System.Windows.Forms;

    public class TouchableForm : Form
    {
        #region Constructors

        public TouchableForm()
        {
            _touchInputSize = Marshal.SizeOf(default(TouchInput));
            if (!IsTouchEnabled())
                Debug.Print("Device does not support touch!");
        }

        #endregion Constructors

        #region Fields

        private const int TOUCHEVENTF_DOWN = 0x0002;

        private const int TOUCHEVENTF_MOVE = 0x0001;

        private const int TOUCHEVENTF_UP = 0x0004;

        private const int WM_TOUCH = 0x0240;

        private readonly int _touchInputSize;

        private int _touchPointsCount;

        #endregion Fields

        #region Methods

        [DllImport("user32.dll")]
        public static extern int GetSystemMetrics(int nIndex);

        public bool IsTouchEnabled()
        {
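            // 95 = SM_MAXIMUMTOUCHES: the number of simultaneous touch contacts the device supports (0 if there is no digitizer).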
            _touchPointsCount = GetSystemMetrics(95);

            return _touchPointsCount > 0;
        }

        protected override void OnLoad(EventArgs __e)
        {
            base.OnLoad(__e);
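            // Opt this window into raw WM_TOUCH messages; by default it would receive WM_GESTURE instead.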
            RegisterTouchWindow(Handle, 0);
        }

        protected virtual void OnTouchDown(TouchEventArgs __e)
        {
            TouchDown?.Invoke(this, __e);
        }

        protected virtual void OnTouchMove(TouchEventArgs __e)
        {
            TouchMove?.Invoke(this, __e);
        }

        protected virtual void OnTouchUp(TouchEventArgs __e)
        {
            TouchUp?.Invoke(this, __e);
        }

        protected override void WndProc(ref Message __m)
        {
            bool handled = __m.Msg switch
            {
                WM_TOUCH => DecodeTouch(ref __m),
                _ => false,
            };
            base.WndProc(ref __m);

            if (handled)
            {
                __m.Result = new IntPtr(1);
            }
        }

        [DllImport("user32")]
        [return: MarshalAs(UnmanagedType.Bool)]
        private static extern bool CloseTouchInputHandle(IntPtr __lParam);

        [DllImport("user32")]
        [return: MarshalAs(UnmanagedType.Bool)]
        private static extern bool GetTouchInputInfo(IntPtr __hTouchInput, int __cInputs, [In, Out] TouchInput[] __pInputs, int __cbSize);

        private static int LoWord(int __number)
        {
            return __number & 0xffff;
        }

        [DllImport("user32")]
        [return: MarshalAs(UnmanagedType.Bool)]
        private static extern bool RegisterTouchWindow(IntPtr __hWnd, uint __ulFlags);

        private bool DecodeTouch(ref Message __m)
        {
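            // The low word of wParam holds the number of touch points described by this message.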
            int inputCount = LoWord(__m.WParam.ToInt32());

            TouchInput[] inputs;
            inputs = new TouchInput[inputCount];

            if (!GetTouchInputInfo(__m.LParam, inputCount, inputs, _touchInputSize))
            {
                return false;
            }

            bool handled = false;
            for (int i = 0; i < inputCount; i++)
            {
                TouchInput ti = inputs[i];

                Action<TouchEventArgs> _handler = null;
                if ((ti.dwFlags & TOUCHEVENTF_DOWN) != 0)
                {
                    _handler = OnTouchDown;
                }
                else if ((ti.dwFlags & TOUCHEVENTF_UP) != 0)
                {
                    _handler = OnTouchUp;
                }
                else if ((ti.dwFlags & TOUCHEVENTF_MOVE) != 0)
                {
                    _handler = OnTouchMove;
                }

                if (_handler != null)
                {
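                    // TOUCHINPUT coordinates and contact sizes are in hundredths of a pixel, hence the division by 100.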
                    TouchEventArgs _touchEvent = new TouchEventArgs
                    {
                        ContactY = ti.cyContact / 100,
                        ContactX = ti.cxContact / 100,
                        Id = ti.dwID,
                    };

                    Point pt = PointToClient(new Point(ti.x / 100, ti.y / 100));
                    _touchEvent.LocationX = pt.X;
                    _touchEvent.LocationY = pt.Y;

                    _touchEvent.Time = ti.dwTime;
                    _touchEvent.Mask = ti.dwMask;
                    _touchEvent.Flags = ti.dwFlags;

                    _handler(_touchEvent);

                    handled = true;
                }
            }

            CloseTouchInputHandle(__m.LParam);

            return handled;
        }

        #endregion Methods

        #region Events

        protected event EventHandler<TouchEventArgs> TouchDown;

        protected event EventHandler<TouchEventArgs> TouchMove;

        protected event EventHandler<TouchEventArgs> TouchUp;

        #endregion Events
    }

In some days I'll be back with scenarios from our projects.

YanisGANGNANT commented 4 months ago

@fahadabdulaziz what is TouchInput? I can't find any reference to it anywhere.

weltkante commented 4 months ago

You currently have to implement it yourself; Windows only provides the default mapping of touch to mouse input, and WinForms has no special support yet. Adding that is what this issue is about, but other things got higher priority, so nothing is available yet.
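
For anyone hitting the same gap: the snippet above assumes two types that are not part of WinForms and have to be declared by hand. A minimal sketch of them (the TouchInput field layout follows the documented Win32 TOUCHINPUT structure; the TouchEventArgs shape is simply inferred from how the snippet populates it):

    using System;
    using System.Runtime.InteropServices;

    // Managed mirror of the Win32 TOUCHINPUT structure (winuser.h).
    [StructLayout(LayoutKind.Sequential)]
    public struct TouchInput
    {
        public int x;              // horizontal position, in hundredths of a pixel (screen coordinates)
        public int y;              // vertical position, in hundredths of a pixel (screen coordinates)
        public IntPtr hSource;     // handle of the source input device
        public int dwID;           // contact identifier, stable for the lifetime of the contact
        public int dwFlags;        // TOUCHEVENTF_* flags (down / move / up, ...)
        public int dwMask;         // which of the optional fields are valid
        public int dwTime;         // timestamp in milliseconds
        public IntPtr dwExtraInfo; // application-defined value
        public int cxContact;      // contact-area width, in hundredths of a pixel
        public int cyContact;      // contact-area height, in hundredths of a pixel
    }

    // Plain event-argument class mirroring what TouchableForm fills in.
    public class TouchEventArgs : EventArgs
    {
        public int Id { get; set; }
        public int LocationX { get; set; }
        public int LocationY { get; set; }
        public int ContactX { get; set; }
        public int ContactY { get; set; }
        public int Time { get; set; }
        public int Mask { get; set; }
        public int Flags { get; set; }
    }

With these two declarations added, the TouchableForm snippet above should compile as posted.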

YanisGANGNANT commented 4 months ago

Thanks for your answer, I'll try to implement it myself then, and if I'm successful I'll share it, because I didn't find anything anywhere.