quailjs / quail

Accessibility testing in the browser and on the server.
http://quailjs.org

Test for WCAG technique G87: Providing closed captions #93

Open hannolans opened 10 years ago

hannolans commented 10 years ago

The objective of this technique is to provide a way for people who have hearing impairments or otherwise have trouble hearing the dialogue in synchronized media material to be able to view the material and see the dialogue and sounds - without requiring people who are not deaf to watch the captions. With this technique all of the dialogue and important sounds are embedded as text in a fashion that causes the text not to be visible unless the user requests it. As a result they are visible only when needed. This requires special support for captioning in the user agent.

Procedure

hannolans commented 10 years ago

A first simple check would be to verify whether an SRT or WebVTT file is provided, but sadly the absence of subtitles does not by itself mean a video should fail: it could be a video without dialogue or important sounds. To check a video we could analyse it. For example, a certain failure would be a talking-heads video with no captions provided. To analyse video, we could render the video in a canvas and take captures. There seem to be libraries for face detection and further image analysis: http://wesbos.com/html5-video-face-detection-canvas-javascript/ http://libccv.org/ An even better way would be to analyse the audio track with the Web Audio API. And if the browser QUAIL is running in is WebKit-based, we could use the Web Speech API to do real-time speech-to-text in JavaScript. The speech gets analysed (in Chrome by Google) and you get the transcription back. Google has a session limit of 60 seconds, but that should be enough for us to detect whether valid captions are provided. http://stiltsoft.com/blog/2013/05/google-chrome-how-to-use-the-web-speech-api/

So the test would be:

  1. Start the video where a human voice is assumed.
  2. Play 30 seconds of video and send the audio via the streaming API to the Web Speech API.
  3. Check if there are captions for that given time span.
  4. Check if the words that the Web Speech API returns and the captions match (to a certain degree).

Tests 3 and 4 should both be true.
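Step 4 above could be sketched as a simple word-overlap comparison between the recognized transcript and the caption text for the same time span. This is a minimal sketch: the function names and the 60% threshold are assumptions, not part of Quail's API.

```javascript
// Lowercase, strip punctuation, and split a string into words.
function normalizeWords(text) {
  return text.toLowerCase().replace(/[^\w\s]/g, '').split(/\s+/).filter(Boolean);
}

// Fraction of transcript words that also appear in the caption text.
function captionMatchRatio(transcript, captionText) {
  var transcriptWords = normalizeWords(transcript);
  var captionWords = {};
  normalizeWords(captionText).forEach(function (w) { captionWords[w] = true; });
  if (transcriptWords.length === 0) { return 0; }
  var hits = transcriptWords.filter(function (w) { return captionWords[w]; }).length;
  return hits / transcriptWords.length;
}

// Captions "match to a certain degree" if enough recognized words appear
// in the caption track; the 0.6 threshold is a placeholder assumption.
function captionsMatch(transcript, captionText) {
  return captionMatchRatio(transcript, captionText) >= 0.6;
}
```

A fuzzier comparison (stemming, ignoring stop words) would likely be needed in practice, since speech recognition output rarely matches captions verbatim.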
hannolans commented 10 years ago

Talked with Arjan, and this test could better handle only the detection of (valid) caption files, i.e. whether the technique is used at all. Video analysis could be handled in "F8: Failure of Success Criterion 1.2.2 due to captions omitting some dialogue or important sound effects". Added a new issue for this failure: #152

hannolans commented 10 years ago

Ok, this leaves the test with discovering caption files:

  1. Test for an object element (in the main page or in an iframe) with a parameter pointing to a file with a video extension; this will cover most video players, like JW Player. OR
  2. Test for an HTML5 video element, OR other techniques like an embed element (we might later add more exotic technologies).
  3. Test if it includes caption/track files.
  4. Test if this file is in the (inherited) language of the page (so it is not a translation file in another language).
  5. Test if the file is not empty.
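Two of the discovery checks above (spotting a video file in an object/embed parameter, and matching the track language to the page language) can be sketched as pure helpers. The extension list and helper names are assumptions for illustration, not the eventual Quail implementation.

```javascript
// Step 1: common video file extensions an <object>/<embed> param might
// reference. The list is an assumption and would need extending.
var VIDEO_EXTENSIONS = ['flv', 'mp4', 'webm', 'ogv', 'avi', 'mov', 'wmv'];

// Does a URL (e.g. a param value) point to a file with a video extension?
function isVideoFile(url) {
  var match = url.toLowerCase().split('?')[0].match(/\.([a-z0-9]+)$/);
  return !!match && VIDEO_EXTENSIONS.indexOf(match[1]) !== -1;
}

// Step 4: does the caption/track language match the page language?
// Compares primary subtags, so a "en-US" track matches a page in "en".
function languageMatches(pageLang, trackLang) {
  if (!pageLang || !trackLang) { return false; }
  return pageLang.toLowerCase().split('-')[0] === trackLang.toLowerCase().split('-')[0];
}
```

Checks 3 and 5 (presence and non-emptiness of the caption file) would need an HTTP fetch of the track URL, so they are harder to do purely client-side across origins.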
hannolans commented 10 years ago

Great that HTML5 video is covered. I think an object embed is not covered yet.

Is for example this video covered? http://www.rijksoverheid.nl/documenten-en-publicaties/videos/2014/01/24/persconferentie-na-ministerraad-24-januari-2014.html

The html includes: `

`

A test would be:

hannolans commented 10 years ago

We could also check for .mp4 in the param element.

kevee commented 10 years ago

I'm starting a video-captions branch to move some code into more components rather than putting it all in videoEmbeddedOrLinkedNeedCaptions.

hannolans commented 10 years ago

Great idea. We could then also add a condition to test whether it is live or recorded video.
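For HTML5 video, one way such a live-vs-recorded condition might look: live streams report a duration of Infinity on the media element. This is a sketch under that assumption, not existing Quail code, and it would not help for plugin-based players.

```javascript
// For HTML5 media, live streams expose duration === Infinity once
// metadata has loaded; recorded video has a finite duration.
// This helper takes anything duck-typed like an HTMLMediaElement.
function isLiveVideo(mediaElement) {
  return mediaElement.duration === Infinity;
}
```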

kevee commented 10 years ago

Merged in video-captions branch, any additional use cases we need to capture?