skyjake / lagrange

A Beautiful Gemini Client
https://gmi.skyjake.fi/lagrange/
BSD 2-Clause "Simplified" License

Integration with Accessibility APIs (e.g., screen reader support) #186

Open · Packbat opened this issue 3 years ago

Packbat commented 3 years ago

I recently installed the screen reader NVDA in order to try it out and learn, as a sighted person, the basics of how to use screen readers and what it's like to do so. However, I discovered that Lagrange doesn't support it - NVDA can read the name of the window but it cannot read any of the text on the page, much less alt-text for preformatted blocks.

I actually have no idea what's involved in making a program work with screen readers. From this A List Apart blog post, it sounds like they use OS-specific accessibility APIs, and other search results I saw suggest that there are cross-platform libraries for these. But I wanted to pass the word that screen reader accessibility is a thing, so it's on your radar.

skyjake commented 3 years ago

Yeah this is one of the drawbacks of doing custom text rendering and UI controls...

other search results I saw suggest that there are cross-platform libraries

Such a cross-platform library, if one exists, would likely be the most feasible solution here. Anything OS-specific quickly becomes unwieldy.

Packbat commented 3 years ago

That makes sense. For what it's worth, I did a little more looking around and found a comment from December in a GitHub thread for a different project with a bunch of links - it might be a starting point?

MosqueteirosTraders commented 3 years ago

I found these on the subject:

https://developer.android.com/studio/intro/accessibility
https://developer.apple.com/documentation/objectivec/nsobject/uiaccessibility#//apple_ref/c/tdef/UIAccessibilityNotifications
https://www.reddit.com/r/kivy/comments/3t10n1/accessibility_examples_for_kivy_screen_readers/
https://www.trivedigaurav.com/blog/towards-making-kivy-more-accessible/
https://www.trivedigaurav.com/blog/towards-making-kivy-apps-accessible-2/
https://github.com/kivy/kivy/pull/1909

@skyjake Now it's up to you to talk to tshirtman. What do you think?

nytpu commented 3 years ago

So, according to those links, you'd need to write support for every single operating system and every single Linux windowing toolkit? And since Lagrange uses SDL, that means it'd be linking in at least three entire toolkits (for just Windows/Mac/Linux desktop support) just to use the accessibility APIs? And even though it uses SDL, you'd force people to have Qt or GTK installed to compile it? That seems like the most surefire way whoever designs this stuff could've chosen to stop people from making their software accessible.

skyjake commented 3 years ago

I am committed to keeping Lagrange small and nimble, so there are practical limits to what can be achieved here.

I'll incorporate accessibility features when they are reasonable given the choice of underlying frameworks (i.e., SDL). The great thing about Gemini is that thanks to the diversity of clients, each one doesn't have to target 100% of the user base — there can be clients specifically optimized for accessibility, for instance.

Packbat commented 3 years ago

That makes sense to me as well. Barring the discovery of something better, it sounds like accessibility API integration is out of scope.

In the meantime, would something like self-voicing - i.e. Lagrange itself generating a voice to read out the text - be an option? Obviously there's a lot of configuration work that people will have already done on their screen readers that would be unavailable in that case, but it might cover some users' needs well enough. The screen reader I downloaded, NVDA, defaults to using the eSpeak NG speech synthesizer, which is open-source and apparently has versions for a lot of different platforms, but I can't say I know how it works.

skyjake commented 3 years ago

would something like self-voicing - i.e. Lagrange itself generating a voice to read out the text - be an option?

Interesting idea. If we can assume that the user already has an app/utility for synthesizing speech, this might be reasonable from an implementation point of view. And on macOS, there is the built-in say shell command that can be used for speech.

At least in theory, I could use these to read out text in the UI and even page content, if I can figure out how the interaction model should work (i.e., keyboard focused? what to say and when?).
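
As a very rough illustration of that "shell out to an existing synthesizer" idea (assuming a POSIX platform; the helper name and command choices below are placeholders, not existing Lagrange code):

```cpp
// Rough sketch only: pipe text to an external speech synthesizer instead of
// talking to OS accessibility APIs. speakText() and the command choices are
// assumptions for illustration.
#include <cstdio>
#include <string>

static void speakText(const std::string &text) {
#if defined(__APPLE__)
    const char *cmd = "say";        // built into macOS; reads stdin when given no arguments
#else
    const char *cmd = "espeak-ng";  // assumes eSpeak NG is installed and on PATH
#endif
    FILE *pipe = popen(cmd, "w");   // feed text via stdin to avoid shell-quoting issues
    if (!pipe) {
        return;                     // no synthesizer available; do nothing in this sketch
    }
    fwrite(text.data(), 1, text.size(), pipe);
    pclose(pipe);                   // blocks until speech ends; a real client would run this asynchronously
}

int main() {
    speakText("Welcome to Lagrange. Press Tab to move keyboard focus.\n");
}
```

Piping strings to say or espeak-ng is the easy bit; the interaction-model questions above are the harder part.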

I recently installed the screen reader NVDA in order to try it out and learn, as a sighted person, the basics of how to use screen readers and what it's like to do so.

I'll also have to do some research with screen readers as I haven't really used them either.

alyssarosenzweig commented 3 years ago

The great thing about Gemini is that thanks to the diversity of clients, each one doesn't have to target 100% of the user base — there can be clients specifically optimized for accessibility

True, Firefox and Chrome have to be screen-reader accessible because there are no alternatives, whereas Gemini can have a visual client and a screen-reader client, and the duplicated code is a rounding error. While this works for users who never or always use a screen reader, it might provide a suboptimal experience for users who only sometimes use one, depending on circumstance. Also, Lagrange-specific niceties (like feed management) might not get ported to a screen-reader-focused client. I'm interested to know who is affected in practice.

CyberTailor commented 3 years ago

Supporting screen readers isn't the only way to improve accessibility. GNOME has great guidelines on it: https://developer.gnome.org/accessibility-devel-guide/3.38/gad-ui-guidelines.html.en

alyssarosenzweig commented 3 years ago

I had not seen those guidelines before -- thank you! Those look especially well thought out, and indeed cover a spectrum of disabilities (no pun intended).


"Don't assume that a user will hear audio information. This applies as much to users with broken soundcards as it does to those with hearing impairments!"

Written with Linux users in mind :-)

(My soundcard is still broken, but eventually I got an external USB soundcard to work around it. It works 60, maybe 70% of the time.)

ghost commented 3 years ago

I would read through the UI Automation docs here. They aren't tied to a specific UI framework, and they also describe how a provider can serve both UI Automation and MSAA clients. I've only read a very small portion of the docs, but at first glance it doesn't seem like it would be as hard as I expected. I could be wrong, though. I will be doing more research on this and on the Linux APIs as well.

UI Automation providers can provide information to Microsoft Active Accessibility clients, and Microsoft Active Accessibility servers can provide information to UI Automation client applications. However, because Microsoft Active Accessibility does not expose as much information as UI Automation, the two models are not fully compatible.

Microsoft Active Accessibility is based on the Component Object Model (COM) with support for dual interfaces, and therefore, is programmable in C/C++ and scripting languages.

UI Automation client applications can be written with the assurance that they will work on multiple Microsoft Windows control frameworks. The UI Automation core masks any differences in the frameworks that underlie various pieces of the UI. For example, the Content property of a Windows Presentation Foundation (WPF) button, the Caption property of a Microsoft Win32 button, and the ALT property of an HTML image are all mapped to a single property, Name, in the UI Automation view.

UI Automation provides full functionality in Windows XP, Windows Server 2003, and later operating systems.
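
To give a feel for what the "provider" side involves on Windows, here is a minimal, untested C++ sketch (not Lagrange code; the window class, strings, and hard-coded Name value are placeholders) that answers WM_GETOBJECT so a UIA client such as Narrator or NVDA can read a single Name property from a custom-drawn window:

```cpp
// Minimal sketch of a Windows UI Automation provider for a custom-drawn window.
// Link against uiautomationcore.lib, user32.lib and oleaut32.lib.
#include <windows.h>
#include <uiautomation.h>

class RootProvider : public IRawElementProviderSimple {
    LONG refs = 1;
    HWND hwnd;
public:
    explicit RootProvider(HWND h) : hwnd(h) {}

    // IUnknown
    ULONG STDMETHODCALLTYPE AddRef() override { return InterlockedIncrement(&refs); }
    ULONG STDMETHODCALLTYPE Release() override {
        const LONG n = InterlockedDecrement(&refs);
        if (n == 0) delete this;
        return n;
    }
    HRESULT STDMETHODCALLTYPE QueryInterface(REFIID riid, void **ppv) override {
        if (riid == __uuidof(IUnknown) || riid == __uuidof(IRawElementProviderSimple)) {
            *ppv = static_cast<IRawElementProviderSimple *>(this);
            AddRef();
            return S_OK;
        }
        *ppv = nullptr;
        return E_NOINTERFACE;
    }

    // IRawElementProviderSimple
    HRESULT STDMETHODCALLTYPE get_ProviderOptions(ProviderOptions *ret) override {
        *ret = ProviderOptions_ServerSideProvider;  // provider lives in this process
        return S_OK;
    }
    HRESULT STDMETHODCALLTYPE GetPatternProvider(PATTERNID, IUnknown **ret) override {
        *ret = nullptr;  // no control patterns (Text, Invoke, ...) in this sketch
        return S_OK;
    }
    HRESULT STDMETHODCALLTYPE GetPropertyValue(PROPERTYID propId, VARIANT *ret) override {
        VariantInit(ret);  // an empty VARIANT means "use the default value"
        if (propId == UIA_NamePropertyId) {  // the string a screen reader announces
            ret->vt = VT_BSTR;
            ret->bstrVal = SysAllocString(L"Example page text");
        }
        return S_OK;
    }
    HRESULT STDMETHODCALLTYPE get_HostRawElementProvider(IRawElementProviderSimple **ret) override {
        return UiaHostProviderFromHwnd(hwnd, ret);  // inherit default HWND-based properties
    }
};

static RootProvider *gProvider;

static LRESULT CALLBACK WndProc(HWND hwnd, UINT msg, WPARAM wp, LPARAM lp) {
    switch (msg) {
    case WM_GETOBJECT:
        // A UIA client asks for the root provider using this special object id.
        if (static_cast<LONG>(lp) == static_cast<LONG>(UiaRootObjectId)) {
            if (!gProvider) gProvider = new RootProvider(hwnd);
            return UiaReturnRawElementProvider(hwnd, wp, lp, gProvider);
        }
        break;
    case WM_DESTROY:
        PostQuitMessage(0);
        return 0;
    }
    return DefWindowProcW(hwnd, msg, wp, lp);
}

int WINAPI wWinMain(HINSTANCE inst, HINSTANCE, PWSTR, int show) {
    WNDCLASSW wc = {};
    wc.lpfnWndProc = WndProc;
    wc.hInstance = inst;
    wc.lpszClassName = L"UiaProviderSketch";
    RegisterClassW(&wc);
    HWND hwnd = CreateWindowW(L"UiaProviderSketch", L"UIA provider sketch", WS_OVERLAPPEDWINDOW,
                              CW_USEDEFAULT, CW_USEDEFAULT, 400, 300,
                              nullptr, nullptr, inst, nullptr);
    ShowWindow(hwnd, show);
    MSG msg;
    while (GetMessageW(&msg, nullptr, 0, 0)) {
        TranslateMessage(&msg);
        DispatchMessageW(&msg);
    }
    return 0;
}
```

Exposing real page content would mean building a tree of providers (IRawElementProviderFragment and friends) and implementing the Text pattern, which is where the bulk of the work would be.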

skyjake commented 3 years ago

Comment I wrote in #322:

My current thinking is that the path of least resistance here is to make it possible to have alternative UI chrome implementations in the app, so you could build one with GTK+ for instance, solving both the AT-SPI and Flatpak Portal issues and generally having better integration on a "standard" Linux desktop. Similarly, on iOS for instance, you could build the UI with native controls to have support for iOS accessibility features. However, this is not a trivial undertaking, so it needs to be worked towards gradually.

paxcoder commented 3 years ago

Please make it accessible, because there's no Gemini client for Linux that just works. There is one for Windows, GemiNaut, but not for Linux. I tried all of them, and they're all either not accessible or tedious.

CyberTailor commented 1 year ago

This looks promising: https://accesskit.dev/how-it-works/

It's written in Rust but has C APIs.

skyjake commented 1 year ago

@CyberTailor Agreed, very promising. I'll keep an eye on it and see how it progresses. Seems to be in an early development phase currently.

CyberTailor commented 4 months ago

The AccessKit developer is now working on Newton, a new Wayland-native accessibility API intended to replace AT-SPI: https://blogs.gnome.org/a11y/2024/06/18/update-on-newton-the-wayland-native-accessibility-project/