What are the current limitations of hand input? Discuss or refute these shortcomings
It would be nice if the part about Apple's 3D Touch linked out to a news article or other source with more information on why Apple decided to introduce 3D Touch, how much user testing preceded it, and why the company hasn't done a better job of introducing users to it
You mentioned that Apple may abandon 3D Touch in a couple of years. Why does one kind of touch win over another? Do you have a reference or example of a device that uses Apple's 3D Touch? +2
Why is hand tracking not considered a gesture? +2
Why are pens in this chapter but not VR controllers, joysticks, gamepads, remotes, etc.? What makes something a hand-based interface?
How widely used is hand tracking?
Gesture classification is unreliable; what do products currently do to mitigate this?
Talk about the different ways touch inputs can be sensed, with diagrams for each
Give more mainstream examples of pen input beyond drawing and sketching
Include an image of a VR controller since not everyone knows what one looks like
Why is there so little desire for touch-input innovation in industry? Have we reached a point where touch is sophisticated enough to satisfy most users?
Include updates on pens; advances have been made since the chapter was first written +1
In this chapter and others, talk about how research is being done to teach users how to use new interfaces and support them through error recovery (obstacles mentioned in many chapters)
What's the most appropriate response when a user inputs with a pen and a finger at the same time (or when another combination of valid inputs occurs simultaneously)?
Would be nice to cover the other categories in the same technical depth as touch (voltage flow between the x and y axes, etc.)
How did touch screens make their way into cars? It doesn't seem logical given the environment inside a car
Chapter mentions pens aren't really used for text, but it misses the fact that some people, especially the elderly, rely on pens to input text such as Chinese characters
The illustration for how touch screens work doesn’t help with the explanation
Explain why the technology behind the touchscreen lets it detect certain materials and not others (metal, fabric, etc.)
Almost all of these techniques have glaring issues in terms of their gulfs of execution. Why does so much of this research focus not on usability, but only on how successfully techniques perform controlled tasks? Is this why tutorials are so common? Is it possible to design techniques intuitive enough not to warrant any instructions?
Talk about ultrasonic haptic technology!
Are there brain scans of people who use their hands with a tool versus with something like a stylus or keyboard? Does the medium (the hand) have any effect on how we think, problem-solve, or feel?
Is there any value added by hand input now, while we are still developing this area of research, rather than just speculative value or future utility?
Ask readers in the chapter to imagine what it would be like if they could no longer use their hands: not just extreme edge cases, but also scenarios like temporarily losing the use of their hands
Outdated things:
Info about the Apple Pencil being usable only in some interfaces might be outdated; it should work with any app on an Apple Pencil-compatible model
Student says Apple's 3D Touch has already been abandoned!
Terminology:
What does “six degree-of-freedom sub-millimeter accuracy” mean?
Explain and introduce GSM
Define multi-touch +1
Grammar and typos:
"movement and input, and rel more fully on the", 'rel' should be 'rely'
Talk about how gestures and hand tracking have improved
What are the key terms to know for the largest problem areas in this space, especially as MR becomes more mainstream? For example, a deeper dive into occlusion could give readers who want to learn more a starting point
Include examples of alternative pens/hand tracking devices for people with disabilities, especially motor impairments, such as the limitless stylus
Talk more about limitations for people with visual and motor impairments, perhaps in a dedicated section
Talk more about what designing for ubiquitous computing entails
Student asks, "Why is there little appetite for innovative touch technologies? Is it just because of the gulf of execution? If that's the case, how could we assume that people will ever get accustomed to and know how to interact with ubiquitous computing? Or is touch/gesture technology not going to be the preferred method of interaction, but rather voice or tangible interfaces built on objects people already interact with? Because essentially, that interaction isn't necessarily 'natural' to humans, either. After all, people don't touch walls unless they have been told to, while perhaps kids touch walls too much."
It seems confusing that controllers are considered hand tracking when it's really the controllers being tracked
Include visual aids for the new research being discussed, since it's hard to visualize