Inclusio-Community / json-image-metadata

A specification for expressing technical image metadata, with an emphasis on accessibility of data visualizations

Haptic Behavior #8

Open jvartanian opened 8 months ago

jvartanian commented 8 months ago

Starting an issue to discuss haptic behaviors in the spec.

From a Vital perspective, the current haptic_pattern proposal (e.g. [50, 100, 150] 250) seems reasonable to me. It aligns closely with our existing implementation on Android. Our iOS implementation does not support haptics, as iOS did not have an API for custom vibrations at the time of writing (they have since added the Core Haptics API).

For reference as the spec develops, here are brief overviews of the native Android & iOS vibration capabilities:

Android

The basic Android vibration API takes an array of timings similar to the Web's, with one minor difference: the first item is the number of milliseconds to wait before the vibration begins, followed by the number of milliseconds to keep the vibrator on, then the number of milliseconds to keep it off, and so on. An Android implementation could execute the above example haptic pattern pretty easily:

// The second param defines the index of the pattern at which the vibration
// repeats, with -1 disabling repeating (0 loops from the start of the pattern).
vibrator.vibrate(new long[]{0, 50, 100, 150, 250}, 0);

The Android vibration API also has the VibrationEffect class, which enables more robust vibration definitions, including waveform vibrations with varying amplitudes.

iOS

The iOS Core Haptics API does not have a 'basic' interface for supplying a haptic pattern (using just an array of timings). To play a vibration, an array of CHHapticEvents must be constructed, each with a 'relative time' within the pattern (different haptic events can overlap). Similar to Android's VibrationEffect, a CHHapticEvent can have varying intensities (amplitudes). Due to the verbosity of the API, I will not be posting code, but a great example of how to map from an array of timings to CHHapticEvents can be found within the Flutter Vibration Plugin.

Conclusion

With all that being said, I think an Android or iOS implementation (if these are required) could easily execute the vibration spec as currently defined. These implementations could also support more complex 'waveform' vibrations.

It is worth noting that a JS-based implementation will likely work just fine on Android in terms of haptics, but iOS could be a problem, as Safari does not support the Vibration API.
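For a Web implementation, that boils down to a feature check around navigator.vibrate. A minimal sketch (the function name and fallback behavior are mine, not part of the spec):

```typescript
// Minimal sketch: play a simple timing pattern via the W3C Vibration API.
// The pattern alternates vibrate/pause durations in milliseconds, like the
// Android timings above but without the leading delay entry.
function playHapticPattern(pattern: number[]): boolean {
  if (typeof navigator === "undefined" || !("vibrate" in navigator)) {
    // e.g. Safari: no Vibration API, so the caller should fall back to
    // audio or visual feedback instead.
    return false;
  }
  return navigator.vibrate(pattern);
}

playHapticPattern([50, 100, 150]);
```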

shepazu commented 8 months ago

Further background from Jeff Croxell:

Just a brief followup from the Vital side. As mentioned previously, our "metadata" is mostly scattered between databases, mobile app, and image layers. This is part of our excitement with the SVG work you have been doing - we are happy to abandon this implementation in favor of the improved SVG spec. As such, it is not simple to just send a clear example. Apologies.

With that said, I have pulled the content from one of Justin's previous messages and added him to the thread. I have also attached the described separate SVG layer representations (though of a different image than the color maps below). Please let us know if you have any questions.

It is very simple, just a color → feedback mapping. Here are some examples from the existing Vital application:

Audio & Haptic Feedback Metadata

{
  "1c9e72": "KtOSWOg8btEUthb4blg",
  "39ceb3": "KtOSLsWX-eFUCVNMn_d",
  "96ed34": "Kl55H6HljS9qePseAqo"
}

The keys are hex color codes, while the values are IDs that map to the various audio and haptic patterns we have defined within the application.

Text Feedback Metadata

{
  "77f9ca": "Nucleus",
  "8bd85b": "Peroxisome",
  "9e70e0": "Lysosome"
}

The keys are also hex color codes, but the values are the actual text that should be read aloud to the user.

When using Vital on a tablet, the application renders three hidden SVGs (one for audio, one for haptic, and one for text feedback). Once the user touches the screen, the application determines the pixel that is being touched on each of the SVGs. If the color of the pixel matches a color within the associated metadata, the application produces the appropriate feedback.
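For readers less familiar with that technique, here is a rough sketch of the per-layer lookup, assuming the hidden SVG has been rasterized to a canvas; the function, type, and variable names are illustrative only, not Vital's actual code:

```typescript
// Illustrative sketch of the color-to-feedback lookup described above.
// Assumes the hidden SVG layer has already been drawn onto a canvas.
type FeedbackMap = Record<string, string>; // hex color -> feedback ID or text

function lookupFeedback(
  layer: CanvasRenderingContext2D, // 2D context of the hidden layer
  x: number,
  y: number,
  metadata: FeedbackMap
): string | undefined {
  // Read the single pixel under the touch point and convert it to a hex key.
  const [r, g, b] = layer.getImageData(x, y, 1, 1).data;
  const hex = [r, g, b].map((c) => c.toString(16).padStart(2, "0")).join("");
  // Only produce feedback if the touched color appears in the metadata.
  return metadata[hex];
}

// e.g. lookupFeedback(hapticLayerCtx, touch.x, touch.y, audioHapticMetadata)
// would return "KtOSWOg8btEUthb4blg" when the touch lands on color 1c9e72.
```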

[Attachments: the dna-structure base image plus its separate "a", "h", and "t" SVG layers]

shepazu commented 8 months ago

This is a great overview of the different haptic APIs across device OSes. Frankly, I tend to be fairly Web-centric, so I didn't consider Android or iOS, and I think that was a mistake on my part. If this spec is to promote interop, it should work across platforms, and should allow all the richness available on different platforms, including variable amplitudes. Here are my initial thoughts to fix it:

1) Emulate the Android vibration API: Since their basic vibration API is richer than the W3C Web Vibration API (which is pretty primitive, by design), we can safely define the more complex behavior and "downsample" it to work with the Web Vibration API. Concretely, this means:

2) Add a fallback mechanism, so a complex pattern (with amplitudes, etc.) would be attempted first, falling back to a simple pattern for platforms that don't support the more complex pattern (a rough sketch of both steps follows below).
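To make the two-step idea concrete, here is a rough sketch of what downsampling plus fallback could look like on the Web. The RichHapticPattern shape and its field names are placeholders of mine, not proposed spec syntax; the only real API used is navigator.vibrate:

```typescript
// Hypothetical "rich" pattern shape: timings plus optional per-entry amplitudes.
// The field names here are placeholders, not spec syntax.
interface RichHapticPattern {
  timings: number[];     // vibrate/pause durations in ms, Web-style
  amplitudes?: number[]; // 0-255 per entry (0 = off), loosely mirroring
                         // Android's VibrationEffect.createWaveform()
}

function playWithFallback(pattern: RichHapticPattern): boolean {
  // A native host (Android/iOS) could honor the amplitudes directly via
  // VibrationEffect or Core Haptics. The Web Vibration API cannot, so we
  // "downsample" by dropping the amplitudes and keeping only the timings.
  if (typeof navigator !== "undefined" && "vibrate" in navigator) {
    return navigator.vibrate(pattern.timings);
  }
  // No haptics at all (e.g. Safari): report failure so the caller can
  // substitute audio or visual feedback.
  return false;
}

playWithFallback({ timings: [50, 100, 150], amplitudes: [255, 0, 128] });
```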

Thoughts?