Student wishes the chapter talked about some of the criticisms, dangers, or downsides of connecting the physical and digital worlds +1
Different student wants to know the benefits of these technologies
Why can’t people 3D print arbitrary shapes?
Are the bulky sizes and many wires and separate components of haptics due to the complexity of physical output, or just where we are in technology?
Talk about the trend of the maker movement and hobbyists experimenting with physical computing
Talk about how haptic outputs could work for things like the pandemic, when we want more touchless interactions
The use of the word “display” for the aeroMorph video is confusing, since the student normally thinks of displays as screens
Seemed strange to group haptics and 3D printing in the same chapter, since haptics is about feedback
How does accessibility play into physical interaction?
What about sounds or smells or taste? Can we morph materials for multi-sensory interaction? +1
Would like to see your (Amy’s) predictions for the future of physical interaction
Have any of these research projects been adopted at scale in the mainstream market?
“Printing” and “Morphing” sections don’t seem to be related to interfaces and seem very different from “Haptics” +1
First two videos under “Haptics” were difficult to follow; they could use more introduction in the text
This part could use more explanation: “For example, one project used electrical muscle stimulation to steer the user’s wrist while plotting charts, filling in forms, and other tasks, to prevent errors”
Introduce some concepts from Hiroshi Ishii at the beginning of the chapter, since they provided some fundamental concepts around physical interfaces
How does AR relate to physical interfaces? Is AR considered inverse, since it takes the real world and makes it digital?
Wouldn’t focusing on making things physical be a waste of manufacturing? Why move to physical interfaces rather than digital ones?
Student says, “I want this chapter to expand on the notion of what we call our ‘human senses’ and how, through making the invisible visible via Morphing and Haptics, we can grow beyond our human senses, gaining a sense of ownership in the digital world. Perhaps by giving people a new sense, we can understand the mechanism of the adaptation and the neural processes by which we see the world differently.”
Why didn’t this chapter come before 2D and 3D output? Seems like that would make more sense
How does a printer actually turn bits into atoms?
iPhones/AirPods have haptic feedback that imitates button clicks. What specific research or category does this application fit into?
Do full-body haptics or “body suits” also fit into this chapter?
Talk more about foldable screens: how they lack a clear sense of value, and how they’ve found traction in the marketplace
What’s blocking ubiquitous computing from becoming a reality?
Terminology:
Add annotated definitions of printing, morphing, and haptics
What’s the meaning of “bits and atoms”/”turn bits into atoms”? +1
Why does making wool objects count as printing? How does the printer make the wool stick together?
Talk about 3D printing meat and other textured foods
Why haven't morphable, bendable screens made it to market?
How far should we go making realistic and natural feeling feedback in VR?
What was/is the vision behind 3D printing? Why make personal 3D printing possible?
What alternatives can be pursued to get around physical constraints?
Have examples of how physical and 2D compare and when one is better than the other for a prototype or product
Which of these examples of haptics have made it to the market?
Is there research on waste from 3D printing? What are the best use cases for 3D printing technologies? How can we mitigate the risk of error when the error isn’t detectable until it’s done damage to the end product?
What type of information/data is suited for what shape/form, and how do we find out that correlation?