MIT License

MultiTouch


A lightweight touch gesture recognition library created in C as a part of Georgia Tech's Spring-Fall 2022 Junior Design program.

See the Demo!


Installation

Prerequisites

  1. Install build-essential to have access to make and gcc:

    sudo apt update && sudo apt install build-essential
  2. Install CMake:

    sudo apt-get -y install cmake

ℹ️ Windows development is possible with tools like Chocolatey.

Option 1: Include Source in Your Project

  1. Clone the repository into your project.

    git clone https://github.com/Russell-Newton/MultiTouch.git <Destination>
  2. Include the source in your project

    • If you use CMake, add the repository's gesturelibrary folder as a subdirectory in your project's CMakeLists.txt with add_subdirectory, and delete the section of gesturelibrary/CMakeLists.txt inside the SKIP_TESTS if statement.
    • If you do not use CMake, include the files in the gesturelibrary/include folder and add the files in the gesturelibrary/src folder to your executable.
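For the CMake route, the wiring might look like the following sketch. The GestureLibrary target name is an assumption based on the built archive's name (libGestureLibrary.a); check gesturelibrary/CMakeLists.txt for the actual target:

```cmake
# Hypothetical layout: the repository was cloned into third_party/MultiTouch.
add_subdirectory(third_party/MultiTouch/gesturelibrary)

add_executable(my_app main.c)

# Link against the gesture library target (name assumed; verify it in
# gesturelibrary/CMakeLists.txt).
target_link_libraries(my_app PRIVATE GestureLibrary)
```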

Option 2: Build Static Library and Link to Your Project

  1. Clone the repo.

    git clone https://github.com/Russell-Newton/MultiTouch.git
  2. Build the CMake project.

    cd MultiTouch
    cmake -S gesturelibrary -B build -D SKIP_TESTS=true
  3. Compile the library with make.

    cd build
    make
  4. Include the library when compiling your program:

    • Add -I&lt;path to MultiTouch&gt;/gesturelibrary/include to your compile command.
    • Add &lt;path to MultiTouch&gt;/build/libGestureLibrary.a to your compile targets.

Troubleshooting

If build errors occur, make sure you have make and cmake installed and added to your path, and that you have a C compiler such as gcc. On Debian-based systems, make and gcc can be installed by running:

sudo apt update && sudo apt install build-essential

Other common build issues may be related to where the CMake build directory is located. Make sure you run make from within the directory created by running cmake.


Usage

  1. Include <gesturelib.h> and the header files for any gestures you are interested in, for example <tap.h> and <drag.h>.
  2. Adjust the gesture parameters in <gestureparams.h> to your desired values. The variables can be set at runtime, but will require the gesture library to be reinitialized after modification.
  3. Call init_gesturelib().
  4. Create an adapter for your touch input device. Adapters transform device input data into touch_event_ts.
  5. Whenever a touch is received, create a touch_event_t with your adapter and send it to process_touch_event().
    • If you want the library to determine which finger this event corresponds to, set event.group = TOUCH_GROUP_UNDEFINED.
  6. Recognized gestures can be obtained from the library synchronously or asynchronously.
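As a sketch of steps 4 and 5, the adapter below converts a hypothetical device packet into a touch_event_t. The real touch_event_t is defined in gesturelib.h; the struct definitions, field names, and raw_packet_t here are stand-ins for illustration only:

```c
// NOTE: stand-in definitions. The real touch_event_t and
// TOUCH_GROUP_UNDEFINED come from gesturelib.h.
#define TOUCH_GROUP_UNDEFINED (-1)

typedef struct {
    float x, y;   // screen position (assumed field names)
    int   type;   // down / move / up (assumed field name)
    int   group;  // finger index, or TOUCH_GROUP_UNDEFINED
} touch_event_t;

// Hypothetical raw packet from a touch device driver.
typedef struct {
    int   kind;
    float px, py;
} raw_packet_t;

// The adapter's job: translate device data into the library's event format.
static touch_event_t adapt(const raw_packet_t* raw) {
    touch_event_t event;
    event.x     = raw->px;
    event.y     = raw->py;
    event.type  = raw->kind;
    event.group = TOUCH_GROUP_UNDEFINED; // let the library pick the finger
    return event;
}
```

In a real program, the returned event would be passed straight to process_touch_event().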

Listeners

Listeners are single functions that accept gesture-specific data and have a void return type. They are called whenever a recognizer's state machine updates its internal state. A listener should be registered after calling init_gesturelib().

Example:

// main.c
#include <stdio.h>
#include <gesturelib.h>
#include <tap.h>

void tap_listener(const tap_t* event) {
    if (event->type == RECOGNIZER_STATE_COMPLETED) {
        printf("Tap received at (%.3f, %.3f)!\n", event->x, event->y);
    }
}

int main(int argc, char* argv[]) {
    init_gesturelib();

    // register the new listener
    set_on_tap(tap_listener);

    // rest of program
}

Design

Touch Preprocessing

After touch data has been transformed into a touch_event_t and sent to our library, the library will perform some additional preprocessing. If the event has its group set to TOUCH_GROUP_UNDEFINED, the library will determine which touch group it belongs to. If the device provides a touch group, the library will not assign one.

The touch group represents the finger a touch event was made by. That is, touch group 0 corresponds to events created by the first finger pressed, 1 to the second, 2 to the third, and so on.

Touch group assignment is determined by event type:

ℹ️ Group assignment ensures that fingers generate the same group as long as they're in contact with the touch device.

After the preprocessing has finished, a touch event is sent to every enabled recognizer in the order in which they were added to the library.

Recognizers

Gesture recognizers are built like state machines. They receive touch events and update their state. When the state is updated, they call on the registered event listener, if applicable.
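The pattern can be sketched as follows. The state names echo RECOGNIZER_STATE_COMPLETED from the listener example above; the other types and the transition logic are simplified stand-ins, not the library's actual implementation:

```c
typedef enum {
    RECOGNIZER_STATE_NULL,
    RECOGNIZER_STATE_IN_PROGRESS,
    RECOGNIZER_STATE_COMPLETED
} recognizer_state_t;

typedef enum { TOUCH_DOWN, TOUCH_MOVE, TOUCH_UP } touch_type_t;

typedef struct {
    recognizer_state_t state;
    void (*on_update)(recognizer_state_t); // registered listener, may be 0
} recognizer_t;

// Every touch event drives the state machine; the listener is notified on
// each state change.
static void recognizer_process(recognizer_t* r, touch_type_t type) {
    switch (type) {
    case TOUCH_DOWN: r->state = RECOGNIZER_STATE_IN_PROGRESS; break;
    case TOUCH_UP:   r->state = RECOGNIZER_STATE_COMPLETED;   break;
    default:         return; // a MOVE leaves the state alone in this sketch
    }
    if (r->on_update) r->on_update(r->state);
}
```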

Built-in single-finger gesture recognizers save data about every possible touch group that could be performing the gesture they recognize.

Built-in multi-finger recognizers are more complicated: they store data about every possible group for every possible user id. The user id is set by the data adapter and could be determined by factors like which device received the touch or where on the screen the touch was received.

⚠️ All touch events with the same uid will be considered as part of the same multi-finger gesture for recognition purposes.

Gestures

Gesture recognition starts with a base gesture: stroke. All other gestures are recognized by composing strokes and other composite gestures and performing additional processing on them.

Stroke

Stroke is a simple gesture with a simple state machine:

The state updates are less important than the data that stroke collects. Stroke collects data on:

When creating more complicated gestures, having access to this data can be incredibly useful.

Multistroke

Multistroke is a multi-finger counterpart to stroke. All strokes with the same user id get grouped into the same multistroke. The first down event starts a multistroke, and the last up event for the user id ends the gesture. In addition to the information contained in each stroke, a multistroke also tracks:

Tap

To perform a tap, press down and release within a short time and without moving too much.

Tap is a simple gesture that contains information about where and when the tap was started and released. If the time between start and release is too long or the distance too great, the tap will fail.

Double-Tap

To perform a double-tap, tap twice in close succession.

Double-tap stores the same information as a tap.

Hold

To perform a hold, press down for a longer amount of time before releasing.

Hold stores the same information as a tap.

Drag

To perform a drag, press down and move your finger across the screen.

Drag tracks starting position, current position, and current velocity. Current velocity is retrieved in the same fashion as stroke.

Hold and Drag

To perform a hold and drag, press and hold until a hold is recognized, then drag without releasing.

Multidrag

Like multistroke, multidrag is a multi-finger counterpart to drag. The same logic that applies to multistroke applies to multidrag. It stores the same information as multistroke, but has a slightly different state machine and property calculations.

Multidrag is used for processing zooms and rotates.

Zoom

To perform a zoom, press down with at least two fingers and move them closer together or farther apart.

Zoom tracks how many fingers are involved in the gesture and an estimated zoom factor.

Rotate

To perform a rotation, press down with a least two fingers and revolve them around a common center point.

Rotate tracks how many fingers are involved in the gesture and an estimated rotation amount.


Release Notes

Version 1.0.0 (Latest)

Features

Future Work

Bug Fixes

Known Issues

Version 0.4.0

#### New Features

* Zoom and Rotate split into their own gestures
* Removed swipe gesture
* Finished implementing gestures: tap, double tap, hold, hold and drag
* Demo page updates:
  * Links back to home page
  * Communicates with library using new listener structure
  * GestureCanvas component now sets display text within Demo component
* Folder structure overhauled

#### Bug Fixes

* Zoom and rotate gestures work with more than 2 fingers

#### Known Issues

* Zoom and rotate gesture occasionally marked as complete on the demo page when a single drag has been performed
* Multi-finger double tap tests failing for unknown reason

Version 0.3.0

#### New Features

* Functioning swipe and drag gestures
* Minimally functioning zoom and rotate gesture
* Gesture library compiles into .js and .wasm with emscripten
* Functions exposed by Module object to pack and unpack library structs without needing heap DMA

#### Bug Fixes

* Faulty unit tests removed

#### Known Issues

* Zoom and rotate gesture only works with 2 fingers
* 3+ finger zoom and rotate is planned for next sprint

Version 0.2.1

#### New Features

* Included Flutter project to collect example gesture `.csv` data
* Pages deploy workflow modified to deploy web artifacts
  * Demo app: deployed to [root endpoint](https://russell-newton.github.io/MultiTouch/)
  * Documentation: deployed to [/docs endpoint](https://russell-newton.github.io/MultiTouch/docs)
  * Data collection: deployed to [/data-collection endpoint](https://russell-newton.github.io/MultiTouch/data-collection)

#### Known Issues

* Some unit tests SEGFAULT. These have been commented out so the unit test workflow passes.

Version 0.2.0

#### New Features

* Framework for recognizer files (header and c files) created
* File organization updated
* Doxygen document generator linked to library
* Vue project environment set up
* Demo webapp front landing page created
* GitHub Actions workflow created to generate and deploy Doxygen documentation from [doxygen-config](doxygen-config)
* Created prebuild and predev npm scripts that compile C code with Emscripten, allowing for use in the webapp
* Created `build:run` npm script that runs `npm run build` and `live-server`

#### Bug Fixes

N/A

Version 0.1.0

#### New Features

* Sprint 1 limited to research, no features created
* Project is buildable

#### Bug Fixes

N/A

#### Known Issues

N/A

#### Research Done

* Specified input/output format for data
* Specified library and architecture structure

Contributors

* Russell Newton: 💻 📖 🚇 🚧
* Wenjun Wang: 💻 📖 🚇 ⚠️
* jrdike: 💻 📖 ⚠️
* Iftekherul Karim: 💻 📖 ⚠️
* deborahsrcho: 💻 🎨 🖋