alzwded opened 3 months ago
I see CI kicked off and testing failed, but it looks like a glitch:
```
Checking the license for package NDK (Side by side) 22.1.7171670 in /usr/local/lib/android/sdk/licenses
License for package NDK (Side by side) 22.1.7171670 accepted.
Preparing "Install NDK (Side by side) 22.1.7171670 (revision: 22.1.7171670)".
Warning: An error occurred while preparing SDK package NDK (Side by side) 22.1.7171670: archive is not a ZIP archive.:
java.util.zip.ZipException: archive is not a ZIP archive
<snip>
"Install NDK (Side by side) 22.1.7171670 (revision: 22.1.7171670)" failed.

FAILURE: Build failed with an exception.

* What went wrong:
A problem occurred configuring project ':app'.
> java.lang.RuntimeException: com.android.builder.sdk.InstallFailedException: Failed to install the following SDK components:
  ndk;22.1.7171670 NDK (Side by side) 22.1.7171670
  Install the missing components using the SDK manager in Android Studio.
```
Added a second patch which adds two Accessibility Actions to announce (= speak aloud) the cursor position, and the full line the cursor is on, respectively.
As of right now, there is nothing on the Android UI side to tell you this.
In the meantime, I'm continuing to explore more "intended" ways to represent a text area with a terminal style cursor, but I keep hitting roadblocks and dead ends. Hence, these two accessibility actions.
(This commit also reordered the actions and changed some resource strings and IDs added in the previous commit; I can shift the diff chunks around or squash the whole branch, just let me know what works best for reviewers).
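To illustrate the first of those actions, a position announcement could be formatted along these lines. This is a sketch only: the class name, method name, and exact wording are my own assumptions, not the PR's actual strings or API.

```java
// Illustrative sketch: formats a "speak cursor position" announcement from
// the emulator's 0-based coordinates. Names and wording are assumptions.
public class CursorPositionAnnouncer {
    /** Formats a speakable cursor position; spoken 1-based, as users count. */
    public static String announce(int row0, int col0) {
        return "Row " + (row0 + 1) + ", column " + (col0 + 1);
    }

    public static void main(String[] args) {
        System.out.println(announce(0, 0)); // Row 1, column 1
    }
}
```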
Added a couple of new commits for the extra keys (esc / end / ctrl / alt etc.): they are now marked with `setTextEntryKey`, so that they behave like text entry keys (= soft keyboard keys) on recent versions of Android + TalkBack, thus respecting the setting (double tap to activate vs. activate on touch up).

Rebased to latest master.
Added an extension to the second commit in the series (the one which adds "Speak Cursor Position" and "Speak Cursor Line") which also speaks out a Unicode description of the character under the cursor, in order to properly explain "where your input will go". This can otherwise be rather unclear, since the emulated terminal's cursor position and the various Android cursors and accessibility cursors aren't in sync, and there's nothing to sync them right now.

I had to add a `getChar` method on `TerminalEmulator` and `TerminalBuffer`, because `getSelectedText` was a bit overkill.
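The Unicode description itself can be derived from the code point such a `getChar`-style method returns; `java.lang.Character.getName` does the heavy lifting. A minimal sketch, where the class and method names are illustrative rather than the PR's actual code:

```java
// Sketch: turn a code point into a speakable Unicode description.
// Only java.lang.Character.getName is a real API here; the rest is
// an illustrative wrapper.
public class CursorCharDescriber {
    public static String describe(int codePoint) {
        String name = Character.getName(codePoint); // e.g. "LATIN SMALL LETTER A"
        if (name == null) {
            return "unknown character"; // unassigned code point
        }
        return name.toLowerCase();
    }

    public static void main(String[] args) {
        System.out.println(describe('a')); // latin small letter a
        System.out.println(describe('$')); // dollar sign
    }
}
```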
Overall changes up to this point, all applicable when an accessibility service is running:

- the extra keys are marked with `AccessibilityNodeInfo.setTextEntryKey` (available in recent Android versions), which makes such buttons sensitive to a TalkBack setting having to do with typing preference on soft keyboards;
- when a modifier key like `CTRL` is toggled, it will announce "control on" or "control on, locked" after it's done speaking everything else. Without this, there would be no indication of this global state change.
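The modifier announcement described above could be assembled along these lines. This is a sketch; the class name, method name, and exact wording are assumptions, not the PR's actual code.

```java
// Sketch of the extra modifier announcement ("control on",
// "control on, locked"). Names and wording are illustrative.
public class ModifierAnnouncer {
    public static String suffix(String key, boolean on, boolean locked) {
        if (!on) {
            return key + " off";
        }
        return locked ? key + " on, locked" : key + " on";
    }

    public static void main(String[] args) {
        System.out.println(suffix("control", true, false)); // control on
        System.out.println(suffix("control", true, true));  // control on, locked
    }
}
```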
tl;dr: it's a lot easier to read the output of the last command and to get to the copy/paste/more... actions.
## Problem
It's a bit iffy to navigate the wall of text in the terminal view. As things stand right now with content description, accessibility services see the `TerminalView` as an opaque blob with a wall-of-text description. Since it's a description, there is nothing locking you into the control when using reading controls (like you would be in a `TextView`).

Additionally, if I read the documentation of `setContentDescription` right, it is really meant to offer a description of non-textual elements, or of irritatingly opaque content, not to be the content itself. This is probably irrelevant, as the goal was to make it easier to get to the output of the last command.

## Solution

It is also nigh impossible to notice the popup menu (copy/paste/more...) if you don't... see it. It ends up at the end of the control stack and it is not announced (if I understand correctly, you have to issue an accessibility focus event, but the accessibility of the selection handle thing is a whole different problem to the one I'm trying to solve here).

The goal was to make the `TerminalView` behave more like a text view.

## Code Changes
Actual changes, on `TerminalView`:

- marked the view as important for accessibility (`YES`), since a view should be important if it does any of the things below;
- call `sendAccessibilityEvent(TYPE_VIEW_TEXT_CHANGED)` when text changes, and implement `onPopulateAccessibilityEvent` to let accessibility services know the text has changed (and what it changed to);
- implement `onInitializeAccessibilityNodeInfo`, which is called by accessibility services to detect what is on screen; this gives us a chance to explain that this `TerminalView` is a multiline text container and not some random eye candy that may be skipped over;
- removed the old handling of `AccessibilityNodeInfo` and `AccessibilityEvent`; replaced with all of the above;
- changed strings in `strings.xml` and added 3 new IDs to `ids.xml` to support the custom accessibility actions;
- in `onTouchEvent`, branch on `BUTTON_TERTIARY` to use it for the accessibility paste action;
- in `getText()`, check that `mEmulator` exists, otherwise return blank; TalkBack might get there first.

## Limitations
The Copy Screen action is a compromise, since `TerminalView` doesn't report on the cursor. I'm still going through TalkBack's code trying to understand what it actually expects for things to work right. There are actions which are hard-coded to the class being `android.widget.TextEdit` (source1, source2; but I'm still trying to understand what's going on there and in the braille code; also, that's for TalkBack 14, and there are many versions of TalkBack in use out there). In any case, the correct way to track the cursor between the terminal emulator's scrollback and the accessibility view of the text is a whole separate opinionated discussion, which could be left for later...

Something ought to be done about the side panel, since you rather have to know it's there. But since the More... menu is now easily discoverable, someone with TalkBack might stumble onto the Help panel and read that there's a secret side panel on the left. That wouldn't be much different to the way I myself discovered that sessions side panel on the left :-)
It would be worth adding landmarks for the `TerminalView`, button bar and side panel, but that's a different PR, since it probably needs to be done in the Activity, not here.
There's also something iffy about the fact that `AccessibilityManager.isEnabled()` is only checked at startup. Maybe it should be documented somewhere that accessibility is checked only at startup for performance reasons. For people who turn TalkBack on after launching the app, it might appear broken, even though it really isn't. I didn't want to touch that behaviour since it was something requested in the review of !344; I am merely stating my dissent.

To add to `mAccessibilityEnabled`: the a11y-related method overrides are simply not called if there's no active accessibility service, hence no checks are needed.

Someone also mused about having the terminal speak out text as it comes in, but that's also a different PR, since it rather requires interplay between the `TerminalEmulator` code and `TerminalView` (providing accessibility), plus setting up a virtual view that is live; and I assume people might want to turn that off / control verbosity, which is a lot of work, so... not in this PR.
It is theoretically possible to do partial text updates in `onPopulateAccessibilityEvent` (instead of blasting away the whole screen each time you type a character), but I'm not convinced it actually works the way I hope, and it requires a lot of work to get there (see previous paragraph). So... not in this PR.

But this PR makes interacting with the `TerminalView` itself much nicer if you can't see what you're doing, and there's no reason to require people to see what they're doing in this case.
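For reference, a partial update would mean computing the changed span that a `TYPE_VIEW_TEXT_CHANGED` `AccessibilityEvent` describes via its `fromIndex`, `removedCount` and `addedCount` fields. A minimal sketch of that diff, with illustrative class and method names (this is not the PR's code):

```java
// Sketch of the bookkeeping a partial text update would need: derive the
// fromIndex / removedCount / addedCount fields of a TYPE_VIEW_TEXT_CHANGED
// event by diffing the old and new screen text. Names are illustrative.
public class TextDiff {
    /** Returns {fromIndex, removedCount, addedCount} for old -> new. */
    public static int[] diff(String oldText, String newText) {
        int start = 0;
        int max = Math.min(oldText.length(), newText.length());
        // Longest common prefix.
        while (start < max && oldText.charAt(start) == newText.charAt(start)) {
            start++;
        }
        int oldEnd = oldText.length();
        int newEnd = newText.length();
        // Longest common suffix, not overlapping the prefix.
        while (oldEnd > start && newEnd > start
                && oldText.charAt(oldEnd - 1) == newText.charAt(newEnd - 1)) {
            oldEnd--;
            newEnd--;
        }
        return new int[] { start, oldEnd - start, newEnd - start };
    }

    public static void main(String[] args) {
        int[] d = diff("ls -l", "ls -la"); // user typed one character
        System.out.println(d[0] + " " + d[1] + " " + d[2]); // 5 0 1
    }
}
```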