GoogleCodeExporter opened this issue 9 years ago
Issue 459 has been merged into this issue.
Original comment by mimm...@google.com
on 8 Apr 2014 at 5:27
Issue 446 has been merged into this issue.
Original comment by mimm...@google.com
on 8 Apr 2014 at 5:27
My use case:
Data in my app can be entered either by voice input (via the recognizer
intent) or by browsing and selecting items. If the user launched the app via
touch, that likely means they prefer to browse (not talk), so they can be
sent directly to the browsing experience.
Original comment by tej...@gmail.com
on 8 Apr 2014 at 5:33
Use case: To keep from surprising the user, provide an indication about how a
Glassware application was launched (e.g. via voice, touch, or other). The
launched application could then be sensitive to "preferred" user interaction
style.
Original comment by joseph.a...@gmail.com
on 8 Apr 2014 at 5:36
Use case: In my app I would like two ways of selecting the data to be
displayed: voice input or browsing via the touchpad. If the user uses voice
input, the app launches the final activity directly; otherwise the user goes
through a couple of screens to select the data to be displayed. Similar to
what happens with "Play a game with...".
Original comment by WLor...@gmail.com
on 6 May 2014 at 7:52
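The routing the commenters above are asking for could look something like the sketch below. Note this is purely hypothetical: the GDK does not currently expose the launch source (that is exactly what this issue requests), so the `SOURCE_VOICE`/`SOURCE_TOUCH` values and the `firstScreen` helper are invented names for illustration only.

```java
// Hypothetical sketch of routing on launch source. The launch-source
// values and this helper are NOT part of the GDK; they illustrate the
// behavior requested in this issue.
public class LaunchRouter {
    public static final String SOURCE_VOICE = "voice";
    public static final String SOURCE_TOUCH = "touch";

    // Decide which screen to show first, based on how the Glassware
    // was launched.
    public static String firstScreen(String launchSource) {
        if (SOURCE_VOICE.equals(launchSource)) {
            // Voice users skip browsing and go straight to the result.
            return "result";
        }
        // Touch users start in the browse-and-select flow.
        return "browse";
    }
}
```

With an API like this, WLor's app would call `firstScreen` once at startup and either launch the final activity or enter the selection screens.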
Original issue reported on code.google.com by
mcanzi...@gmail.com
on 28 Feb 2014 at 3:48