ddnet / ddnet

DDraceNetwork, a free cooperative platformer game
https://ddnet.org

Android ingame controls discussion #8359

Open Robyt3 opened 2 weeks ago

Robyt3 commented 2 weeks ago

Please discuss in this issue first how the ingame controls for Android could look and work.

Technical requirements:

heinrich5991 commented 2 weeks ago

Do we know of other games that port traditional keyboard/controller gameplay to Android?

I guess emulators do that. They show on-screen controllers, I think.

Jupeyy commented 1 week ago

Could we send SDL events for controllers to reuse the controller logic? Or is that more complicated than handling it in our logic directly?

Did you already check the multi-touch stuff? I guess it now needs a device ID for pointer/cursor events.

Then the component could be pretty slim, I guess, since it only listens for multi-touch events and renders some circles.

I also searched for a small lib that does exactly that, but didn't find anything. I would have guessed it is rather easy to implement in a way that can be cleanly abstracted for multi-purpose use (other game engines etc.).

Robyt3 commented 1 week ago

Could we send SDL events for controllers to reuse the controller logic? Or is that more complicated than handling it in our logic directly?

Probably easier to do it separately from the controller logic.

Did you already check the multi-touch stuff? I guess it now needs a device ID for pointer/cursor events.

Yeah, the new component would need to handle the touch and maybe also the multi-gesture events specifically, instead of using the normal mouse-related events. I would keep the mouse system separate, so it should be possible to use mouse and touch independently at the same time, since they are different events.

This is the information we can get from the SDL events:

typedef struct SDL_TouchFingerEvent
{
    Uint32 type;        /**< SDL_FINGERMOTION or SDL_FINGERDOWN or SDL_FINGERUP */
    Uint32 timestamp;   /**< In milliseconds, populated using SDL_GetTicks() */
    SDL_TouchID touchId; /**< The touch device id */
    SDL_FingerID fingerId;
    float x;            /**< Normalized in the range 0...1 */
    float y;            /**< Normalized in the range 0...1 */
    float dx;           /**< Normalized in the range -1...1 */
    float dy;           /**< Normalized in the range -1...1 */
    float pressure;     /**< Normalized in the range 0...1 */
    Uint32 windowID;    /**< The window underneath the finger, if any */
} SDL_TouchFingerEvent;

/**
 * Multiple Finger Gesture Event (event.mgesture.*)
 */
typedef struct SDL_MultiGestureEvent
{
    Uint32 type;        /**< SDL_MULTIGESTURE */
    Uint32 timestamp;   /**< In milliseconds, populated using SDL_GetTicks() */
    SDL_TouchID touchId; /**< The touch device id */
    float dTheta;
    float dDist;
    float x;
    float y;
    Uint16 numFingers;
    Uint16 padding;
} SDL_MultiGestureEvent;

https://github.com/libsdl-org/SDL/blob/a26d2e705d3e5c4ad1880b228b908cd3a2b9236b/include/SDL_events.h#L519-L551

So we should be able to differentiate the touch events of two different fingers.

Though we have to consider that the exact information available depends on the device's touch screen driver. For example, the pressure value for touch events can have very different values/ranges depending on the driver. For the UI, I would try to make it possible to move the mouse without clicking when the pressure value is low and only cause a UI click for higher pressure.
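To illustrate, here is a minimal sketch (assuming SDL2; the threshold value and all names are placeholders, not an actual implementation) of how the component could track each finger by its SDL_FingerID and apply a pressure threshold:

#include <SDL2/SDL.h>
#include <unordered_map>

// State tracked per active finger, keyed by SDL_FingerID.
struct SFingerState
{
    float m_X; // normalized 0..1
    float m_Y; // normalized 0..1
    float m_Pressure; // nominally 0..1, but the range depends on the driver
    bool m_WouldClick; // pressure-based click, see the caveat above
};

// Placeholder threshold; as discussed, real devices report very different
// pressure ranges, so this would need calibration or a different heuristic.
static const float PRESSURE_CLICK_THRESHOLD = 0.5f;

static void HandleTouchEvent(const SDL_Event &Event, std::unordered_map<SDL_FingerID, SFingerState> &ActiveFingers)
{
    switch(Event.type)
    {
    case SDL_FINGERDOWN:
    case SDL_FINGERMOTION:
    {
        SFingerState &Finger = ActiveFingers[Event.tfinger.fingerId];
        Finger.m_X = Event.tfinger.x;
        Finger.m_Y = Event.tfinger.y;
        Finger.m_Pressure = Event.tfinger.pressure;
        // The UI could treat low pressure as "move without clicking" and
        // pressure above the threshold as a click.
        Finger.m_WouldClick = Finger.m_Pressure >= PRESSURE_CLICK_THRESHOLD;
        break;
    }
    case SDL_FINGERUP:
        ActiveFingers.erase(Event.tfinger.fingerId);
        break;
    }
}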

Then the component could be pretty slim, I guess, since it only listens for multi-touch events and renders some circles.

Yeah, the component has to render the touch controls, handle touch events and feed the inputs into CControls or the client directly.
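As a very rough skeleton (all names and the interface here are illustrative stand-ins, not the actual DDNet API), the component could look like this:

#include <SDL2/SDL.h>

// Illustrative stand-in for the inputs the component would feed into
// CControls or the client; fields and names are assumptions.
struct STouchInputState
{
    float m_TargetX = 0.0f; // aim direction
    float m_TargetY = 0.0f;
    bool m_Fire = false;
    bool m_Hook = false;
    int m_Direction = 0; // -1 left, 0 none, +1 right
};

// Sketch of a slim touch-controls component: it consumes finger events,
// derives the virtual button/joystick state from them and renders the
// on-screen controls.
class CTouchControls
{
public:
    // Fed with every SDL touch event the client receives.
    void OnTouchEvent(const SDL_TouchFingerEvent &Event)
    {
        // Placeholder: a real implementation would hit-test the touch
        // position against the configured virtual buttons/joysticks.
        m_State.m_TargetX = Event.x - 0.5f;
        m_State.m_TargetY = Event.y - 0.5f;
    }

    // Copies the derived inputs into the control logic once per tick.
    void GetInputs(STouchInputState *pState) const { *pState = m_State; }

    // Would draw the virtual controls (e.g. some circles) on top of the game.
    void OnRender() {}

private:
    STouchInputState m_State;
};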

My basic ideas for the touch controls would be:

heinrich5991 commented 1 week ago

For the UI, I would try to make it possible to move the mouse without clicking when the pressure value is low and only cause a UI click for higher pressure.

Is that pressure thing commonly available? Also, taking a step back, is there a point in displaying a mouse cursor on Android?

  • one virtual joystick on the right side for aiming

Would it be feasible to simply touch where you want to hook/shoot?
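(For reference, the quoted virtual joystick approach boils down to mapping the finger's offset from the joystick centre to an aim direction; a rough sketch with placeholder parameters and names:)

#include <cmath>

// Maps a finger position inside the virtual joystick to an aim direction.
// Joystick centre/radius and the deadzone are placeholder values.
static bool JoystickToAimDirection(float FingerX, float FingerY,
    float JoystickCenterX, float JoystickCenterY, float JoystickRadius,
    float *pDirX, float *pDirY)
{
    const float OffsetX = FingerX - JoystickCenterX;
    const float OffsetY = FingerY - JoystickCenterY;
    const float Length = std::sqrt(OffsetX * OffsetX + OffsetY * OffsetY);

    // Ignore tiny offsets near the centre (deadzone) to avoid jitter.
    const float Deadzone = 0.1f * JoystickRadius;
    if(Length < Deadzone)
        return false;

    // Normalize to a unit direction; the caller scales it to the desired
    // aim distance before feeding it into the control logic.
    *pDirX = OffsetX / Length;
    *pDirY = OffsetY / Length;
    return true;
}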

Robyt3 commented 1 week ago

Is that pressure thing commonly available?

Apparently not (sample size of 1 phone :/ ), so we'll probably scratch that idea. The touch position, pressure and size values can be tested by enabling the Pointer location debugging tool in the Developer options.

Also, taking a step back, is there a point in displaying a mouse cursor on Android?

I guess not. I'm just not sure yet how much work it is to add "true" absolute touch support for the UI instead of trying to emulate a laptop trackpad with relative touch support. Using relative touch has some unfixable issues (cursor will be stuck on draggable UI elements and elements that lock the mouse position) if we can't use pressure values, so we probably have to go for absolute touch input for the UI in any case. I would not show the cursor if touch input is used or only show it briefly after the touch event. Mouse input would be completely separate from touch input, so you could still connect a mouse to the Android device and use that normally (which should show a cursor).
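For reference, "true" absolute touch support for the UI would essentially mean warping the UI cursor to the touch position, roughly like this sketch (the UI dimensions and all names are placeholders):

#include <SDL2/SDL.h>

// Hypothetical UI cursor state for absolute touch input: the position is
// warped to the touch position and the "click" follows finger down/up.
struct SUiTouchCursor
{
    float m_X = 0.0f;
    float m_Y = 0.0f;
    bool m_Pressed = false;
};

// Maps a finger event to absolute UI coordinates. The normalized 0..1 touch
// position is scaled to the UI screen size passed in by the caller.
static void ApplyTouchToUi(const SDL_TouchFingerEvent &Event,
    float UiWidth, float UiHeight, SUiTouchCursor *pCursor)
{
    pCursor->m_X = Event.x * UiWidth;
    pCursor->m_Y = Event.y * UiHeight;
    if(Event.type == SDL_FINGERDOWN)
        pCursor->m_Pressed = true;
    else if(Event.type == SDL_FINGERUP)
        pCursor->m_Pressed = false;
}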

Would it be feasible to simply touch where you want to hook/shoot?

That would also be possible and probably the most intuitive to use on a touch device.

A potential issue with this, in particular on devices with small screens, could be that touching and holding where you want to hook might block the relevant part of the screen from your view.

Also, several more questions: How to hook/shoot? One action can be performed by the normal touch. The other action would need a separate button. Should the default touch action be hook or fire? How would you aim before using the hook with absolute touch input? With a joystick you would first move the joystick in the desired direction and then press the hook button, but with absolute touch input I suppose you would either have to hold the hook button down first, like a modifier, or always position the cursor by shooting first.
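For the aiming itself, absolute touch input would roughly translate the touch position into the target offset relative to the character, something like this sketch (the character's screen position and the scale factor are assumed to be provided by the caller; all names are placeholders):

#include <SDL2/SDL.h>

// Turns an absolute touch position into an aim target relative to the
// character, which is what the control logic ultimately needs.
static void TouchToAimTarget(const SDL_TouchFingerEvent &Event,
    float ScreenWidth, float ScreenHeight,
    float CharacterScreenX, float CharacterScreenY,
    float WorldUnitsPerPixel,
    float *pTargetX, float *pTargetY)
{
    // Normalized 0..1 touch position -> pixel position on screen.
    const float TouchScreenX = Event.x * ScreenWidth;
    const float TouchScreenY = Event.y * ScreenHeight;

    // Aim target is the offset from the character, scaled to world units.
    *pTargetX = (TouchScreenX - CharacterScreenX) * WorldUnitsPerPixel;
    *pTargetY = (TouchScreenY - CharacterScreenY) * WorldUnitsPerPixel;
}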

heinrich5991 commented 1 week ago

Is that pressure thing commonly available?

Apparently not (sample size of 1 phone :/ ), so we'll probably scratch that idea. The touch position, pressure and size values can be tested by enabling the Pointer location debugging tool in the Developer options.

Ah. On my phone it seems to be available.

Also, several more questions: How to hook/shoot? One action can be performed by the normal touch. The other action would need a separate button.

Perhaps a toggle button for what the first touch does. The second touch while hooking could shoot.
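A minimal sketch of that scheme (all names are illustrative):

// A toggle decides whether the primary touch hooks or fires, and a second
// finger while the first is held triggers the other action.
enum class EPrimaryAction
{
    HOOK,
    FIRE,
};

struct STouchActionState
{
    EPrimaryAction m_PrimaryAction = EPrimaryAction::HOOK;
    bool m_Hook = false;
    bool m_Fire = false;
    int m_ActiveFingers = 0;
};

// Called when a finger goes down/up anywhere in the aiming area.
static void OnAimFinger(bool Down, STouchActionState *pState)
{
    pState->m_ActiveFingers += Down ? 1 : -1;

    const bool Primary = pState->m_ActiveFingers >= 1;
    const bool Secondary = pState->m_ActiveFingers >= 2;

    if(pState->m_PrimaryAction == EPrimaryAction::HOOK)
    {
        pState->m_Hook = Primary;
        pState->m_Fire = Secondary; // second touch while hooking shoots
    }
    else
    {
        pState->m_Fire = Primary;
        pState->m_Hook = Secondary;
    }
}

// Called when the on-screen toggle button is tapped.
static void OnToggleButton(STouchActionState *pState)
{
    pState->m_PrimaryAction = pState->m_PrimaryAction == EPrimaryAction::HOOK
        ? EPrimaryAction::FIRE
        : EPrimaryAction::HOOK;
}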