
Adaptable non-verbal HMI for coronavirus quarantine #61

Open abitrolly opened 4 years ago

abitrolly commented 4 years ago

HMI - Human-Machine Interface. It is possible to create interfaces that let humans avoid contact with surfaces such as door knobs, keyboards, remote controls, touch panels, etc. The interfaces should be mobile-independent, so they can serve people without mobile phones (discharged, broken, etc.), work autonomously without provider clouds, and be simple for any person to implement and debug.

Adaptable non-verbal. The best illustration would be an interface for controlling presentation slides with a camera that tracks a person's arm movements (pose). Hand gesture recognition at finger level is not enough, because the camera may be too far away to see the fingers. Non-verbal is important because locating the source of a sound is non-trivial, filtering it is non-trivial, and we cannot assume that a person can speak clearly. Adaptable means that the interface adapts to the person instead of requiring the person to have prior skill with the interface. The interface provides a short interaction loop to calibrate itself to the person.
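A minimal sketch of such a camera loop, assuming MediaPipe Pose and OpenCV are installed (`mediapipe`, `opencv-python`); the "arm raised" check is just a placeholder for a real calibrated gesture:

```python
# Sketch: read webcam frames and report whether the right wrist is raised
# above the shoulder - a pose-level signal that stays visible even when
# the camera is too far away to resolve individual fingers.
import cv2
import mediapipe as mp

mp_pose = mp.solutions.pose

def wrist_raised(landmarks) -> bool:
    """True when the right wrist is above the right shoulder (image y grows downward)."""
    wrist = landmarks[mp_pose.PoseLandmark.RIGHT_WRIST]
    shoulder = landmarks[mp_pose.PoseLandmark.RIGHT_SHOULDER]
    return wrist.y < shoulder.y

cap = cv2.VideoCapture(0)
with mp_pose.Pose(model_complexity=0) as pose:  # lightest model for low latency
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        results = pose.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.pose_landmarks and wrist_raised(results.pose_landmarks.landmark):
            print("arm raised")  # hook point for a real command
        cv2.imshow("debug", frame)
        if cv2.waitKey(1) == 27:  # Esc quits
            break
cap.release()
```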

Requirements

The interface should be fast and give an immediate response with unnoticeable delays. This is important for text entry in more complicated scenarios, or for controlling equipment that can damage itself if a command arrives late.

Simple to implement for hospitals and other areas with existing infrastructure. Color codes for the camera, the feedback screen, and devices that move things can be printed directly on walls and surfaces. Each triple of "camera", "info panel" and "actuator" creates an autonomous interaction loop.
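One way to read such printed color codes is plain HSV thresholding, which needs no ML at all. A sketch with OpenCV, where the hue/saturation bounds are assumed placeholders that a real deployment would calibrate per camera and lighting:

```python
# Sketch: find a printed colored marker (here: a red patch) in a frame
# by thresholding in HSV space and returning bounding boxes of the patches.
import cv2
import numpy as np

def find_marker(frame_bgr, lo=(0, 120, 70), hi=(10, 255, 255), min_area=500):
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, np.array(lo), np.array(hi))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    return [cv2.boundingRect(c) for c in contours if cv2.contourArea(c) > min_area]
```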

Transparent. Without transparency the protocols will not be able to evolve, and people will not be able to track their evolution, see what is going on, and trust them. It may happen that both interface evolution and its protection become an AI/ML-guided process. The system should include friendly debugging capabilities even without XAI. Debugging should not require specific engineering skills or specialized equipment. Ideally, a debugging session should use the same interaction loop that is used to interact with a person.
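For the "same interaction loop for debugging" idea, the info panel itself can render the system's internal state. A sketch, assuming the recognizer exposes its current guess as a dict (the field names here are invented for illustration):

```python
# Sketch: overlay the recognizer's internal state on the feedback frame,
# so a person can debug the system through the same screen they interact with.
import cv2

def draw_debug(frame, state):
    lines = [
        f"gesture: {state.get('gesture', '?')}",
        f"confidence: {state.get('confidence', 0.0):.2f}",
        f"calibrated: {state.get('calibrated', False)}",
    ]
    for i, text in enumerate(lines):
        cv2.putText(frame, text, (10, 30 + 30 * i),
                    cv2.FONT_HERSHEY_SIMPLEX, 0.8, (0, 255, 0), 2)
    return frame
```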

Nice to have

Interchangeable. I may have calibrated interaction patterns before and want to share them with the system, allowing it to read the patterns from a person.
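Interchangeability mostly comes down to a portable serialization of the calibrated patterns. A sketch of one possible JSON layout (the schema and field names are hypothetical):

```python
# Sketch: save and load a person's calibrated gesture patterns as JSON,
# so they can be carried from one system to another.
import json

def save_patterns(path, patterns):
    # patterns: {"t_in_air": {"action": "show_temperature", "samples": [...]}, ...}
    with open(path, "w") as f:
        json.dump({"version": 1, "patterns": patterns}, f, indent=2)

def load_patterns(path):
    with open(path) as f:
        data = json.load(f)
    assert data["version"] == 1, "unknown pattern format"
    return data["patterns"]
```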

abitrolly commented 4 years ago

User Story #1: "As a conference presenter I want to control my slides without touching anything." Slides can be uploaded beforehand over the internet or transmitted from a device like an RPi. The RPi is placed in front of the speaker (1-10 meters) with a camera watching the speaker and optionally the projection screen. The interaction loop in this case is "RPi + camera" (camera) + "screen" (info panel) + "slides" (actuator). The screen also doubles as the debug panel, and gestures can be used for debugging as well.
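On the RPi side, turning the pose stream into slide commands can be a small swipe detector; a sketch, where the window size and threshold are placeholders to be tuned during the calibration loop, and `print` stands in for whatever drives the slide viewer:

```python
# Sketch: turn a stream of wrist x-positions (normalized 0..1, e.g. from the
# pose tracker above) into discrete "next"/"previous" slide commands.
from collections import deque

class SwipeDetector:
    def __init__(self, on_command, window=10, threshold=0.25):
        self.xs = deque(maxlen=window)   # recent wrist x positions
        self.on_command = on_command     # wired to the actuator (slide viewer)
        self.threshold = threshold

    def feed(self, x):
        self.xs.append(x)
        if len(self.xs) == self.xs.maxlen:
            travel = self.xs[-1] - self.xs[0]
            if travel > self.threshold:
                self.on_command("next")
                self.xs.clear()
            elif travel < -self.threshold:
                self.on_command("previous")
                self.xs.clear()

detector = SwipeDetector(on_command=print)  # print stands in for the actuator
```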

User Story #2: "As a patient I want to read my body temperature without touching anything." Conditional information access. Cameras can measure a person's body temperature. The problem is getting the information out only when the person is interested in it and not in something else. The person can draw the letter "t" in the air, or make a gesture that the system does not recognize, in which case the system asks the person what it means. The person then connects the gesture to a selection of actions and repeats it. The next time, the system recognizes the gesture and, after confirmation, just executes it. The confirmation step is removed once the person feels comfortable with the gesture and the interaction loop.
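This learn-confirm-trust flow is essentially a small state machine per gesture. A sketch, with all names and the trust threshold invented for illustration:

```python
# Sketch: per-gesture state machine for the learn -> confirm -> trust loop.
# An unknown gesture asks the person for its meaning; a known one asks for
# confirmation until enough confirmed uses build trust, then fires directly.
TRUST_AFTER = 3  # confirmations before skipping the confirmation step (placeholder)

class GestureMemory:
    def __init__(self):
        self.actions = {}        # gesture id -> action name
        self.confirmations = {}  # gesture id -> confirmed-use count

    def observe(self, gesture, ask_meaning, confirm, run):
        if gesture not in self.actions:
            self.actions[gesture] = ask_meaning(gesture)  # "what does it mean?"
            self.confirmations[gesture] = 0
            return
        action = self.actions[gesture]
        if self.confirmations[gesture] < TRUST_AFTER:
            if confirm(action):                    # person approves this time
                self.confirmations[gesture] += 1
                run(action)
        else:
            run(action)                            # trusted: execute immediately
```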

abitrolly commented 4 years ago