jessegreenberg opened 3 years ago
Here's another issue with Slider "step size". If we're not clear about the meaning of "step" options, and don't carefully consider all forms of alternative input, we run the risk that some users will not be able to set some values.
As @emily-phet mentioned on Slack, PhET sometimes uses a keyboardStep size that is larger than the mouse/touch step size, because it would be inefficient to use an alternative input device to move through so many values. GFL is an example, where mass sliders range from 10 to 1000 kg.
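For a rough sense of what that looks like in code, here is a hypothetical sketch (the specific numbers and options are assumed for illustration, not the actual GFL configuration):

```js
// Hypothetical options for a 10-1000 kg mass slider. A keyboardStep much larger
// than the mouse/touch snap interval keeps keyboard traversal practical:
// crossing the full 990 kg range takes ~20 arrow-key presses instead of ~990.
const massSliderOptions = {
  keyboardStep: 50,

  // Mouse/touch drags snap to whole kilograms. (PhET code would typically use
  // dot's Utils.roundToInterval here; plain Math.round keeps this self-contained.)
  constrainValue: value => Math.round( value )
};
```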
Fourier is another example, which I'll elaborate on here. In Fourier, mouse/touch users have 2 options for entering amplitude values: the amplitude sliders, or a keypad.
The current plan is to NOT support alternative input for the keypad, and not make the keypad reachable via traversal. (My understanding is that instrumenting the keypad is a big task, and PhET has decided not to devote resources to that for this sim.) Alternative input will therefore require using the amplitude sliders to set amplitude values.
In https://github.com/phetsims/fourier-making-waves/issues/53#issuecomment-846284868, @arouinfar specified the step sizes for the Amplitude slider:
We also need to specify step sizes for the sliders and NumberSpinners. I reviewed the mass controls in GFL (NumberControl) and GFLB (NumberPicker) for comparison. In GFL, the standard step size for alt input is not the same as constrainValue used on the slider. I've chosen to do the same here too; otherwise, it would take 60 key presses to traverse the entire slider range.
Amplitude sliders:
- Standard step (arrow keys): 0.1
- Smaller step (Shift + arrows): 0.01
- Bigger step (Page Up/Down or fn + up/down): 0.25
Here's how that specification translates to Slider options in AmplitudeSlider.js:
```js
keyboardStep: 0.1,
shiftKeyboardStep: 0.01, // finer grain
pageKeyboardStep: 0.25, // coarser grain
```
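For comparison, the mouse/touch side of the same slider constrains values independently of those keyboard options. A minimal sketch, assuming the 0.05 interval mentioned below (not verbatim AmplitudeSlider.js code):

```js
// Mouse/touch drags are snapped via constrainValue; the keyboard step options
// above have no effect on this. The 0.05 interval is assumed for illustration.
constrainValue: value => Math.round( value / 0.05 ) * 0.05
```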
Note that the standard step (keyboardStep) is larger (0.1) than the mouse/touch step (0.05 or 0.01). So a keyboard user needs to use the Shift key to reach the same values as a mouse/touch user.
According to @jessegreenberg, keyboardStep applies to all forms of alternative input, while shiftKeyboardStep and pageKeyboardStep are specific to keyboard. So while keyboard users can reach all of the same amplitude values as mouse/touch users, that is not generally true for all alternative input devices. Non-keyboard alternative input devices will only support 0.1 increments, and those users will not be able to set all of the same values as mouse/touch/keyboard users.
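To make the mismatch concrete, here is a small sketch (the reachableValues helper and the [0, 1.5] range are invented for illustration):

```js
// Values reachable from 0 by repeatedly applying a given step, per input type.
const reachableValues = ( step, max ) =>
  Array.from( { length: Math.round( max / step ) + 1 }, ( _, i ) => i * step );

reachableValues( 0.01, 1.5 ); // keyboard with Shift (shiftKeyboardStep): finest grain
reachableValues( 0.05, 1.5 ); // mouse/touch (constrainValue interval)
reachableValues( 0.1, 1.5 );  // non-keyboard alt input (keyboardStep only): misses 0.05, 0.15, ...
```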
@pixelzoom, are you thinking of switch access like sip 'n puff, perhaps mouth sticks (though mouth stick users typically use their sticks on a keyboard), and perhaps swipe and tap on iOS with VoiceOver?
It might be nice to list out what inputs we are trying to cover - what do we think of as alternative input?
The arrow buttons in GFL's combined slider and number picker were indeed not needed for full non-visual access with keyboard.
For non-visual access with iOS VoiceOver there is a gesture, double-tap and hold, that allows users to slide their finger like a mouse, but we have yet to support that gesture on any slider. When learners use that gesture, there is no corresponding description.
Is that what you mean by some other forms of input?
We don't yet test with switch devices. Making the sims keyboard accessible takes us a long ways there, perhaps all the way to full switch access for many interactions.
If we interview switch users we may indeed find design issues.
I didn't have any specific input devices in mind. But the goal is presumably to design an API that accommodates more than just keyboard input. The current API for Slider has a problem in that regard - described in https://github.com/phetsims/sun/issues/703#issuecomment-852325705 - in that the API does not provide consistent "step" support for all forms of input (alternative and otherwise).
@terracoda I think @pixelzoom is trying to identify, and then to avoid or address, situations where the implementation is being input specific - or overly input specific - when it doesn't need to be. Not quite the same thing as tackling expanding a specific interaction design for a specific input method, but helpful in moving in that direction.
@pixelzoom Just trying to help bridge the discussion here - if I'm misrepresenting your efforts, apologies in advance!
Yes, precisely @emily-phet. Thanks for clarifying. I'm not suggesting that PhET needs to be actively testing with or supporting other devices. But the programming APIs should be generalized (not device specific) where possible, so that supporting future devices is less of a burden. And the Slider API has some problems in that regard.
... and not to harp on the same issue. But https://github.com/phetsims/sun/issues/698 is an example where the Slider API is causing problems because it is unable to handle "step size" for mouse/touch and keyboard. The options for mouse/touch are currently at odds with the keyboard options. The root of this problem is that the options are device specific, instead of generalized to meet the needs of multiple types of input device.
I would love to generalize :-) We added keyboard steps because the original slider design was mouse and touch specific. Totally on board with one API whose options handle the needs of all inputs.
The user experience needs are different for different inputs which makes it challenging to step back and see what we can do to support them all.
Love the discussion and the effort going into this.
Raising the priority of this, because I currently cannot meet the requirements specified for Fourier.
A workaround was added in #698, so this is no longer blocking Fourier. I'll remove the "priority high" label, and let the a11y team decide on priority and scheduling.
6/10/21 Fourier design meeting: @arouinfar @kathy-phet @jessegreenberg @KatieWoe @pixelzoom
Over in https://github.com/phetsims/sun/issues/837 @jessegreenberg and @samreid and I are really excited to actually bring mouse/touch and keyboard input further away from each other. It has often caused confusion to have overlapping implementations (constrainValue option) with different needs for each design/implementation. I would recommend closing this issue and instead treating both as different entities. While a bit redundant, it leaves room for the flexibility that we have all over the project.
@zepumph and @samreid will #837 support iOS VoiceOver users who double tap and hold to activate a slider and then drag the slider thumb (without lifting their finger) to change the value of the activated slider? A similar gesture would be desirable in our own custom gesture input that was designed/prototyped for haptic research in JT and GFLB.
People, especially those with low vision, do not use a single form of input. They mix it up switching between discrete focus-based input (e.g., the keyboard) and non-discrete pointer-based input (e.g., mouse and touch).
Some forms of input are focus based (e.g., keyboard, iOS VoiceOver taps and swipes, switch) and some forms of input are pointer based (e.g., mouse, touch, joystick, head pointer, camera input (hands) for RaP and Quad).
Some forms of focus-based alternative input can have a mode that allows for non-discrete/pointer-like exploration (e.g., iOS VoiceOver's double tap-hold-drag, and our own custom gesture for JT and GFLB).
If we can generalize input without compromising customizable options for output, I think that would be really nice. For example, people using a mouse do not need the same Voicing output as people using a keyboard.
From an implementation standpoint, we think of the iOS + VO type of slider input as functionally identical to mouse/touch input (since it is a touch event that we receive). So for the context of this issue, I think this case is handled with 2 thoughts on the matter:
I am not sure what kind of consensus it will take to close this issue. It isn't clear to me from our conversation Friday whether @jessegreenberg and @samreid agree with me. It would be a shame if we all actually did agree on closing this issue, but momentum was lost because it is such a large-scope issue.
From here, let's have a couple of people sign off or state that more conversation is needed. If more conversation is needed, I'm happy to keep this issue open. Please unassign after commenting, even if only to say that you have no strong opinion.
Also, if in the future we want to do this work, we can reopen it.
After https://github.com/phetsims/sun/issues/837#issuecomment-1476538213, I know that more discussion will be needed. Unassigning, to be worked on when we want to further align the mouse/touch slider input with AccessibleValueHandler's keyboard input API.
This sentence from above is concerning to me: "Over in https://github.com/phetsims/sun/issues/837 @jessegreenberg and @samreid and I are really excited to actually bring mouse/touch and keyboard input further away from each other."
At some point in the future, it will be the case that all slider implementations will need to have alternative input capabilities. The current approach of picking and choosing which sims and which sim elements to include alternative input for is an artifact of current limitations in funding and personnel bandwidth to make this transition more rapidly. I do not think we should invest time into evolving the code to be consistent with a piecemeal approach, but rather evolve the code to be consistent with an integrated approach. Eventually all sims will be published with alternative input, by default. In fact, it's my understanding that this was intended to occur this year, but faced some setbacks. Perhaps it will be as early as later this year that all new sims in development will be implemented with alternative input features by default.
Not sure if further discussion is needed here, for clarification or next steps, etc., but I do think my sentiment here applies to all features. Where evolution towards integration with all of PhET's existing feature set is possible, we should heavily weight that approach. It will save us in the long term, and keep our current piecemeal, feature-specific processes (which are confusing for many on the team, I think) to a minimum.
Sounds good; after the above comments, let's keep this open. I think that to confirm we can do this, we should review all of PhET's sliders to see how constrainValue is being used. If a single "step"-based API can replace all of them, this issue will be easier. If not, we should have a subteam of devs and designers review the needs of Slider and identify an API that would apply to all forms of input and support all cases.
Even if there are cases where constrainValue doesn't map well to a universal "step", we should still add a universal "step" and remove as many constrainValues as possible. I'm not planning to work on this anytime soon.
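For concreteness, here is one hypothetical shape a universal "step" API could take (all option names below are invented for illustration; none of them exist in Slider today):

```js
// A single step shared by every input type, with optional modifier scales.
const options = {
  step: 0.05,          // constrains mouse, touch, keyboard, and other alt input alike
  shiftStepScale: 0.2, // e.g. Shift + arrows -> 0.05 * 0.2 = 0.01
  pageStepScale: 5     // e.g. Page Up/Down -> 0.05 * 5 = 0.25
};
```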
From a Slack discussion and out of https://github.com/phetsims/sun/issues/702, there was a suggestion to reconsider the API of Slider for all types of input. For mouse/touch, the value during drag is constrained only with constrainValue. AccessibleValueHandler, which is used by Slider, provides keyboardStep, shiftKeyboardStep, and pageKeyboardStep. But the latter three options are independent of constrainValue, so mouse and touch will constrain the value differently from other forms of input. This is generally not how sliders work: if you specify a step on an HTML input of type range, it applies the value constraint to all forms of input, and this seems really nice. This issue is to investigate whether we can do the same, and to reconsider the API of Slider to support it.

Related to https://github.com/phetsims/sun/issues/698 as well; this (might?) solve that.
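For reference, the native behavior described above can be demonstrated with a plain DOM range input, where a single step governs every input modality:

```js
// Native HTML range input: one step constrains mouse, touch, arrow keys,
// and assistive technologies alike.
const input = document.createElement( 'input' );
input.type = 'range';
input.min = '0';
input.max = '1.5';
input.step = '0.05';
document.body.appendChild( input );
```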