Open · DBDeveloper87 opened this issue 6 years ago
@DBDeveloper87 thanks for your feedback. You've offered us a lot to consider! If you feel particularly strongly about one or more of these issues, I suggest creating a new issue focused exclusively on that issue. That way it will be easier to focus the conversation. Meanwhile, three of the issues you raised are particularly interesting to me, so I've offered responses to them below:

Regarding the inability to start the video using chord-1,5,6: I have only limited experience with Braille input devices, but I believe that chord command is the one used to toggle play and pause within the iOS native player, correct? If so, Apple has specifically built support for that and many other Braille commands into iOS. These commands only work for native iOS apps, though, not for websites. Able Player is an HTML5 media player that's built into web pages, so it can't be operated in the same way that accessible iOS apps can. In order to build in control for Braille commands, we would need Braille event listeners, and to my knowledge those don't currently exist. Why not? I don't know the answer to that, but it seems like an issue worthy of discussing.

Regarding captions not being accessible through the iOS 11 Braille captions feature: Again, that's a feature that's probably built for the iOS media player, but I suspect it doesn't support video that's embedded into a web page, although I could be wrong about that. The Able Player demo pages include one page that features the browser's native HTML5 video player, for comparison. Does the Braille captions feature work with the Native Browser Player Demo?

Regarding the short phrases that comprise the interactive transcript as you experience it on a Braille device: That's because the transcript is assembled from timed text files (it parses the chapters, descriptions, and captions files and combines them into a transcript). Each individual caption is wrapped in a <span> tag, which is what makes the transcript interactive (if the user clicks on a phrase, it jumps to that point in the video). I suspect you're getting just one span at a time, which is essentially one caption at a time, rather than a flowing stream of text. Is there a setting within your Braille device that would cause it to read everything more seamlessly (sort of like word wrap)? If not, I could add a preference or feature to Able Player that would strip out all the <span> tags. However, that would also strip out the interactivity and just leave you with a static transcript. Not necessarily a bad idea, though.

Hi: Good evening, and good to hear back from you. I thought development might have stopped, since I didn't hear back after I posted two months ago. I'd be happy to make these individual issue items. For now, though:

1: That is interesting. However, this example from Coursera is also an HTML5 video: https://www.coursera.org/lecture/gcp-exploring-preparing-data-bigquery/introduction-from-data-to-insights-with-google-cloud-platform-specialization-AmSpi
When I hit play, it opens a full-screen mode, but the media controls themselves are not the native iOS ones, which means it's a custom HTML5 video player. I am able to use chord 1,5,6 to pause the video.
In the same example, even though I could turn the captions on, I cannot access them via the Braille captions feature. However, analyzing the UI, I think it defaults to the iOS media player, since all the controls are the iOS ones except the Media button that brings up the options for captions and subtitles.
I am going to re-test this, because my memory tells me that even words were being chopped up, not just caption strings. It's almost the effect I'd get in a Word document if someone was editing it and sent me the "final" draft: it would be an accessibility mess unless you copy and paste the paragraph into Notepad and then back into Word to clear hidden formatting that changing the styles won't fix. One idea is to switch from two-word strings to an entire sentence per string, or to everything between certain timestamps. But yes, stripping the spans out could remove interactivity... though my gut suggests something involving JavaScript could make the transcript appear as a flow of text, the same as it does visually, where pressing a routing key on the Braille display sets the video to the corresponding timestamp.
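The routing-key idea above could be sketched roughly like this: flatten the captions into one plain-text string while keeping a character-offset-to-timestamp map, so a routing-key press (reported as a text offset) can still seek. This is a hypothetical illustration only; the caption object shape (`text`, `start`) and the function names are assumptions, not Able Player's actual internals.

```javascript
// Build one continuous transcript string plus a map of character
// ranges back to caption start times (shape assumed for illustration).
function buildFlatTranscript(captions) {
  // captions: [{ text: '...', start: seconds }, ...]
  let transcript = '';
  const spans = []; // { from, to, start } character ranges
  for (const cap of captions) {
    if (transcript) transcript += ' ';
    const from = transcript.length;
    transcript += cap.text;
    spans.push({ from, to: transcript.length, start: cap.start });
  }
  return { transcript, spans };
}

// Map a routing-key press (a character offset into the flat text)
// back to the start time of the caption it falls inside.
function timeAtOffset(spans, pos) {
  const hit = spans.find(s => pos >= s.from && pos < s.to);
  return hit ? hit.start : null;
}
```

The flat string is what a Braille display would read as continuous text; the player would then call its existing seek logic with the returned time.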
Getting that to work could make the transcript read like a "screenplay," as a 3Play Media webinar once described it: organized by chapter, with chapters marked by one heading level, then organized by speaker, marked by a different heading level, and interspersed with the audio descriptions.
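A minimal sketch of that "screenplay" layout, assuming a hypothetical chapter/cue input shape (the heading levels, class name, and function name here are illustrative, not Able Player's actual markup):

```javascript
// Render chapters as <h3>, speakers as <h4>, with description cues
// interspersed among the speech, as the "screenplay" idea describes.
function renderScreenplay(chapters) {
  return chapters.map(ch =>
    '<h3>' + ch.title + '</h3>\n' +
    ch.cues.map(cue =>
      cue.kind === 'description'
        ? '<p class="able-desc">[Description: ' + cue.text + ']</p>'
        : '<h4>' + cue.speaker + '</h4>\n<p>' + cue.text + '</p>'
    ).join('\n')
  ).join('\n');
}
```

Because chapters and speakers become real headings, a screen reader or Braille user could then navigate the transcript by heading level rather than caption by caption.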
I have a few other ideas for improving Able Player even further.
Patrick
On Nov 4, 2018, at 7:48 PM, Terrill Thompson notifications@github.com wrote:
@DBDeveloper87 https://github.com/DBDeveloper87 thanks for your feedback. You've offered us a lot to consider! If you feel particularly strongly about one or more of these issues, I suggest creating a new issue focused exclusively on that issue. That way it will be easier to focus the conversation. Meanwhile, three of the issues you raised are particularly interesting to me, so I've offered responses to them below:
Regarding the inability to start the video using chord-1,5,6: I have only limited experience with Braille input devices, but I believe that chord command is the one used to toggle play and pause within the iOS native player, correct? If so, Apple has specifically built support for that and many other Braille commands into iOS. These commands only work for native iOS apps, though, not for websites. Able Player is an HTML5 media player that's built into web pages, so it can't be operated in the same way that accessible iOS apps can. In order to build in control for Braille commands, we would need Braille event listeners, and to my knowledge those don't currently exist. Why not? I don't know the answer to that, but it seems like an issue worthy of discussing.

Regarding captions not being accessible through the iOS 11 Braille captions feature: Again, that's a feature that's probably built for the iOS media player, but I suspect it doesn't support video that's embedded into a web page, although I could be wrong about that. The Able Player demo pages include one page that features the browser's native HTML5 video player, for comparison. Does the Braille captions feature work with the Native Browser Player Demo http://localhost/ableplayer/demos/other1.html?

Regarding the short phrases that comprise the interactive transcript as you experience it on a Braille device: That's because the transcript is assembled from timed text files (it parses the chapters, descriptions, and captions files and combines them into a transcript). Each individual caption is wrapped in a <span> tag, which is what makes the transcript interactive (if the user clicks on a phrase, it jumps to that point in the video). I suspect you're getting just one span at a time, which is essentially one caption at a time, rather than a flowing stream of text. Is there a setting within your Braille device that would cause it to read everything more seamlessly?
If not, I could add a preference or feature to Able Player that would strip out all the <span> tags. However, that would also strip out the interactivity and just leave you with a static transcript. Not necessarily a bad idea, though.
In testing Able Player with iOS VoiceOver using the examples page, especially the text-based audio descriptions example, I have found some issues with the player. Your documentation states that iOS plays media in the OS's default player instead of the actual Able Player; however, this does not appear to be the case on iOS 11.4. Instead, it plays inside the actual Able Player. Issues I found:
-- I cannot start the video using chord-1,5,6; rather, I have to physically navigate to the play button first. After that I can play/pause using this chord on a Braille display.
-- Captions do not appear to be accessible through the iOS 11 Braille captions feature.
-- If the settings are set to automatically pause the video whenever audio descriptions are active, I can navigate to the description itself and read it.
-- The interactive transcript did not want to appear at first; I had to invoke the button twice or thrice via the routing keys. Combining the captions/subtitles with the audio descriptions in this transcript makes it quite easy to read via the Braille display.
-- Invoking full screen means I can no longer access the player UI via the Braille display, and the player only expands to fit the Safari window, with Safari's top and bottom nav bars still visible. I was forced to reload to go back to the non-full-screen interface.
-- Regarding interactive transcripts: while they appear as "blocks" on the visual interface (almost like paragraphs), this is not the case on a Braille display. The whole block is not displayed across the full 40 cells; instead, only two or three words usually appear if the text comes from the captions/subtitles portion. Audio descriptions render correctly. In my experience improving documents for VoiceOver, this happens because the paragraph appears to VoiceOver as being broken up into different elements. I found that copying and pasting the text into TextEdit, then back into the word processor, and applying the appropriate styling fixed that issue.
So I think VoiceOver is reading the array of strings that makes up the block of text as individual elements instead of as a single block. Perhaps a function that converts that array of strings into one full string would fix this issue.
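The "convert the array into a single string" fix could be sketched as below: collapse the per-caption <span> elements into one continuous text string so the Braille display reads the block as a flow. This is a minimal sketch under the assumption that each caption is a span, as described above; the function name and the `data-start` attribute are illustrative, not Able Player's actual code, and it trades away the click-to-seek interactivity.

```javascript
// Strip the interactive span wrappers but keep their text, then
// normalize whitespace so the result reads as one continuous block.
function flattenTranscriptHtml(html) {
  return html
    .replace(/<\/?span[^>]*>/g, ' ')  // drop opening and closing span tags
    .replace(/\s+/g, ' ')             // collapse the leftover whitespace
    .trim();
}
```

Something along these lines could sit behind the proposed "static transcript" preference, applied only when the user opts out of interactivity.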