In the term "assistive technology" in the Glossary section (https://www.w3.org/TR/WCAG21/#glossary), add this item to the list of assistive technologies considered in the WCAG document:
"automatic sign language interpretation, which is used by people which are deaf and have literacy problems"_, at this way:
Assistive technologies that are important in the context of this document include the following:
screen magnifiers, and other visual reading assistants, which are used by people with visual, perceptual and physical print disabilities to change text font, size, spacing, color, synchronization with speech, etc. in order to improve the visual readability of rendered text and images;
screen readers, which are used by people who are blind to read textual information through synthesized speech or braille;
automatic sign language interpretation, which is used by people who are deaf and have literacy problems;
text-to-speech software, which is used by some people with cognitive, language, and learning disabilities to convert text into synthetic speech;
speech recognition software, which may be used by people who have some physical disabilities;
alternative keyboards, which are used by people with certain physical disabilities to simulate the keyboard (including alternate keyboards that use head pointers, single switches, sip/puff and other special input devices.);
alternative pointing devices, which are used by people with certain physical disabilities to simulate mouse pointing and button activations.
The advent of assistive technologies for translating text into sign languages has solved an old accessibility problem: the difficulty of having human sign language interpreters translate all content, as @GreggVan explained in issue #281:
"Level AA was for everything else important, testable, generally applicable to all content
Level AAA was for things important, testable, but not generally applicable (could not be applied in all cases.) In one case, it was because there were not enough sign language interpreters in existence to provide sign language interpretation for all web pages. This was for people wanting to go the extra mile."
However, there are currently several applications that use avatars as sign language interpreters to automatically translate text effectively, such as VLibras (open source, http://vlibras.com.br), HandTalk (commercial software with free version, https://handtalk.me/) and ProDeaf (commercial software with free version, http://www.prodeaf.net/en-us/Home).
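For illustration, this is roughly how the VLibras widget can be embedded so that an avatar translates page text into Libras (Brazilian Sign Language) on demand. The markup below follows the embed snippet publicly documented at vlibras.gov.br; the exact container attributes, script URL and `VLibras.Widget` constructor are assumptions here and should be checked against the current VLibras documentation.

```html
<!-- Container elements expected by the VLibras widget
     (per the publicly documented embed snippet; verify against current docs). -->
<div vw class="enabled">
  <div vw-access-button class="active"></div>
  <div vw-plugin-wrapper>
    <div class="vw-plugin-top-wrapper"></div>
  </div>
</div>

<!-- Load the plugin and start the widget; when activated by the user,
     the avatar renders a Libras translation of the selected page text. -->
<script src="https://vlibras.gov.br/app/vlibras-plugin.js"></script>
<script>
  new window.VLibras.Widget('https://vlibras.gov.br/app');
</script>
```

HandTalk and ProDeaf offer comparable web plugins, so the general integration pattern (a script plus a small activation widget) is not specific to VLibras.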
One can also argue that the use of sign language goes far beyond the "extra mile". Sign languages on the web are necessary for many reasons:
Demographics data: Over 5% of the world’s population – 360 million people – has disabling hearing loss (328 million adults and 32 million children).
Literacy and access to information: around 80% of deaf people worldwide have insufficient education and/or literacy problems, lower verbal skills and mostly chaotic living conditions.
Reading ability: many deaf and hard-of-hearing people, particularly those for whom sign language is the first language, have reading difficulties.
(full explanation below as well as reference links)
Therefore, the following additional improvements are proposed:
proposal A - issue #507:
Guideline 1.1 Text Alternatives
https://www.w3.org/TR/WCAG21/#text-alternatives
Provide text alternatives for any non-text content so that it can be changed into other forms people need, such as large print, braille, speech, symbols, human or automatic sign language interpretation or simpler language.
proposal B - issue #508:
Success Criterion 1.2.6 Sign Language (Prerecorded)
https://www.w3.org/TR/WCAG21/#sign-language-prerecorded
Level AA (Double A) in place of AAA (Triple A)
Human or automatic sign language interpretation is provided for all prerecorded audio content in synchronized media. (A markup sketch illustrating this is shown after the proposals below.)
proposal D - issue #511:
Glossary:
https://www.w3.org/TR/WCAG21/#glossary
human or automatic sign language interpretation
translation of one language, generally a spoken language, into a sign language, made by human interpreters or assistive technologies.
True sign languages are independent languages that are unrelated to the spoken language(s) of the same country or region.
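To make proposal B more concrete, here is a minimal markup sketch of one way a page could offer prerecorded synchronized media together with a version that includes sign language interpretation (produced either by a human interpreter or by an avatar-based tool). The file names and the alternate page are hypothetical, used for illustration only; they are not part of the proposal.

```html
<!-- Hypothetical example: the primary prerecorded media with captions,
     plus a link to an equivalent version that includes a sign language
     interpreter (file names are placeholders). -->
<video controls>
  <source src="lecture.mp4" type="video/mp4">
  <track kind="captions" src="lecture.en.vtt" srclang="en" label="English captions">
</video>
<p>
  <a href="lecture-with-sign-language.html">
    Watch this lecture with sign language interpretation
  </a>
</p>
```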
Arguments and motivations for providing sign language video on the Web:
DEBEVC, Matjaž; KOSEC, Primož; HOLZINGER, Andreas. Improving multimodal web accessibility for deaf people: sign language interpreter module.
https://online.tugraz.at/tug_online/voe_main2.getVollText?pDocumentNr=137845&pCurrPk=47669
– Demographics data.
– Literacy and access to information.
– Reading ability.
– Navigation ability.
– Multilanguage requirements.
Demographics data (http://www.who.int/mediacentre/factsheets/fs300/en/)
According to the World Health Organization (WHO, 2017):
“Over 5% of the world’s population – 360 million people – has disabling hearing loss (328 million adults and 32 million children). Disabling hearing loss refers to hearing loss greater than 40 decibels (dB) in the better hearing ear in adults and a hearing loss greater than 30 dB in the better hearing ear in children. The majority of people with disabling hearing loss live in low- and middle-income countries.
Approximately one third of people over 65 years of age are affected by disabling hearing loss. The prevalence in this age group is greatest in South Asia, Asia Pacific and sub-Saharan Africa.
'Hard of hearing' refers to people with hearing loss ranging from mild to severe. People who are hard of hearing usually communicate through spoken language and can benefit from hearing aids, cochlear implants, and other assistive devices as well as captioning. People with more significant hearing losses may benefit from cochlear implants.
'Deaf' people mostly have profound hearing loss, which implies very little or no hearing. They often use sign language for communication.”
Literacy and access to information
According to Debevc, Kosec and Holzinger (2010):
“Based on data collected by the World Federation of the Deaf (WFD), around 80% of deaf people worldwide have an insufficient education and/or literacy problems, lower verbal skills and mostly chaotic living condition. Other research shows that deaf people are often faced with the problem of acquiring new words and notions [25, 27]. Because of the many grammar differences between their mother tongue and sign language, the deaf person might be fluent in sign language while experiencing problems reading their mother tongue. According to Holt, the majority of deaf 18-year-olds in the United States have poorer reading capabilities of English in comparison to 10-year-old students who are not deaf [18]. Some studies that have examined the reading ability of deaf 16-year-olds have shown that about 50% of the children are illiterate. Of these, 22% displayed a level of knowledge equivalent to that of a 10-year-old child who is not deaf and only 2.5% of participants actually possessed the reading skills expected for their age [14]. Also, other studies in the United States by Farwell have shown that deaf people face difficulties when reading written text [11]. The average literacy rate of a high school graduate deaf student was similar to that of a non-deaf student in the third or fourth grade.
Access to information is also important in cases of emergency. Past disasters around the world have shown that, at the time of an accident, people with disabilities did not receive the same assistance and appropriate information as other people did. The United Nation Convention [38] calls upon States to develop measures for emergency services (article 9 (1) (b)). Messages using video for deaf people have rapidly become one of the more popular methods for sending them information but, unfortunately, most countries’ emergency services do not allow for video communications with deaf people. The reason lies in the communication protocols, which are not compatible with each other.”
Reading ability
According to Debevc, Kosec and Holzinger (2010):
“It is surprising and disappointing that many deaf and hard-of-hearing people, particularly those for whom sign language is the first language, have reading difficulties [13]. The problem arises because writing was developed to record spoken language and therefore favours those with the ability to speak. Spoken language contains phonemes which can be related to the written word by mind modelling. Due to the lack of audible components and consequently difficulties in understanding written words, this can not be done by deaf people.”
Navigation ability
According to Debevc, Kosec and Holzinger (2010):
“Another motivation for the integration of sign language videos into web sites is that sign language improves the taxonomic organization of the mental lexicons of deaf people. The use of sign language on the Web would, in this case, reduce the requirements for the rich knowledge of the words and notions of another language. A knowledge of words and notions is of the utmost importance for navigation in and between web sites and for the use of hyperlinks, such as in online shopping web sites where a lot of different terms and subterms in vertical and horizontal categories appear. Unfortunately, deaf people have problems understanding the meaning of words and notions, especially when it is necessary to understand certain notions in order to correctly classify and thus understand either a word or another notion [25].”
Multilanguage requirements
According to Debevc, Kosec and Holzinger (2010):
“One of the important requirements is multilanguage support, especially in Europe. For example, tourist information, governmental and emergency service web designers need to construct web sites in English, German, Italian, French and even in Hungarian. This is particularly true for small countries such as Slovenia, which are surrounded by several countries with rich language backgrounds. In some countries sign language is also recognized as an official national language, and therefore there is a strong need to include sign language translations into web sites.”
Thank you for your comment. Since we are restricted from changing WCAG 2.0 SC and definitions in WCAG 2.1, I am marking this issue to defer so we can discuss it at the right time, and closing it.
In the term "assistive technology" in the Glossary section (https://www.w3.org/TR/WCAG21/#glossary), add this item to the list of assistive technologies considered in the WCAG document: "automatic sign language interpretation, which is used by people which are deaf and have literacy problems"_, at this way: Assistive technologies that are important in the context of this document include the following:
The advent of assistive technologies for translating text into sign languages solved an old problem for accessibility: the difficulty of translating all contents by human sign language interpreters, as well explained @GreggVan in issue #281:
However, there are currently several applications that use avatars as sign language interpreters to automatically translate text effectively, such as VLibras (open source, http://vlibras.com.br), HandTalk (commercial software with free version, https://handtalk.me/) and ProDeaf (commercial software with free version, http://www.prodeaf.net/en-us/Home).
One can also argue that the use of sign language goes far beyond the "extra mile". Signal languages on the web are necessary for many reasons:
(full explanation below as well as reference links)
Therefore, the additional improvements are proposed:
proposal A - issue #507 : Guideline 1.1 Text Alternatives https://www.w3.org/TR/WCAG21/#text-alternatives Provide text alternatives for any non-text content so that it can be changed into other forms people need, such as large print, braille, speech, symbols, human or automatic sign language interpretation or simpler language.
proposal B - issue #508 : Success Criterion 1.2.6 Sign Language (Prerecorded) https://www.w3.org/TR/WCAG21/#sign-language-prerecorded Level AA (Double A) in place of AAA (Triple A) Human or automatic sign language interpretation is provided for all prerecorded audio content in synchronized media.
proposal D - issue #511 : Glossary: https://www.w3.org/TR/WCAG21/#glossary human or automatic sign language interpretation translation of one language, generally a spoken language, into a sign language, made by human interpreters or assistive technologies. True sign languages are independent languages that are unrelated to the spoken language(s) of the same country or region.
Arguments and motivations for providing sign language video on the Web: DEBEVC, Matjaž; Kosec, Primož; Andreas Holzinger. Improving multimodal web accessibility for deaf people: sign language interpreter module. https://online.tugraz.at/tug_online/voe_main2.getVollText?pDocumentNr=137845&pCurrPk=47669 – Demographics data. – Literacy and access to information. – Reading ability. – Navigation ability. – Multilanguage requirements.
Demographics data (http://www.who.int/mediacentre/factsheets/fs300/en/) According to World Health Organization (WHO, 2017): “Over 5% of the world’s population – 360 million people – has disabling hearing loss (328 million adults and 32 million children). Disabling hearing loss refers to hearing loss greater than 40 decibels (dB) in the better hearing ear in adults and a hearing loss greater than 30 dB in the better hearing ear in children. The majority of people with disabling hearing loss live in low- and middle-income countries. Approximately one third of people over 65 years of age are affected by disabling hearing loss. The prevalence in this age group is greatest in South Asia, Asia Pacific and sub-Saharan Africa. 'Hard of hearing' refers to people with hearing loss ranging from mild to severe. People who are hard of hearing usually communicate through spoken language and can benefit from hearing aids, cochlear implants, and other assistive devices as well as captioning. People with more significant hearing losses may benefit from cochlear implants. 'Deaf' people mostly have profound hearing loss, which implies very little or no hearing. They often use sign language for communication.”
Literacy and access to information According to Debevc, Kosec and Primož (2010): “Based on data collected by the World Federation of the Deaf (WFD), around 80% of deaf people worldwide have an insufficient education and/or literacy problems, lower verbal skills and mostly chaotic living condition. Other research shows that deaf people are often faced with the problem of acquiring new words and notions [25, 27]. Because of the many grammar differences between their mother tongue and sign language, the deaf person might be fluent in sign language while experiencing problems reading their mother tongue. According to Holt, the majority of deaf 18-year-olds in the United States have poorer reading capabilities of English in comparison to 10-year-old students who are not deaf [18]. Some studies that have examined the reading ability of deaf 16-year-olds have shown that about 50% of the children are illiterate. Of these, 22% displayed a level of knowledge equivalent to that of a 10-year-old child who is not deaf and only 2.5% of participants actually possessed the reading skills expected for their age [14]. Also, other studies in the United States by Farwell have shown that deaf people face difficulties when reading written text [11]. The average literacy rate of a high school graduate deaf student was similar to that of a non-deaf student in the third or fourth grade. Access to information is also important in cases of emergency. Past disasters around the world have shown that, at the time of an accident, people with disabilities did not receive the same assistance and appropriate information as other people did. The United Nation Convention [38] calls upon States to develop measures for emergency services (article 9 (1) (b)). Messages using video for deaf people have rapidly become one of the more popular methods for sending them information but, unfortunately, most countries’ emergency services do not allow for video communications with deaf people. The reason lies in the communication protocols, which are not compatible with each other.”
Reading ability According to Debevc, Kosec and Primož (2010): “It is surprising and disappointing that many deaf and hard-of-hearing people, particularly those for whom sign language is the first language, have reading difficulties [13]. The problem arises because writing was developed to record spoken language and therefore favours those with the ability to speak. Spoken language contains phonemes which can be related to the written word by mind modelling. Due to the lack of audible components and consequently difficulties in understanding written words, this can not be done by deaf people.”
Navigation ability According to Debevc, Kosec and Primož (2010): “Another motivation for the integration of sign language videos into web sites is that sign language improves the taxonomic organization of the mental lexicons of deaf people. The use of sign language on the Web would, in this case, reduce the requirements for the rich knowledge of the words and notions of another language. A knowledge of words and notions is of the utmost importance for navigation in and between web sites and for the use of hyperlinks, such as in online shopping web sites where a lot of different terms and subterms in vertical and horizontal categories appear. Unfortunately, deaf people have problems understanding the meaning of words and notions, especially when it is necessary to understand certain notions in order to correctly classify and thus understand either a word or another notion [25].
Multilanguage requirements According to Debevc, Kosec and Primož (2010): “One of the important requirements is multilanguage support, especially in Europe. For example, tourist information, governmental and emergency service web designers need to construct web sites in English, German, Italian, French and even in Hungarian. This is particularly true for small countries such as Slovenia, which are surrounded by several countries with rich language backgrounds. In some countries sign language is also recognized as an official national language, and therefore there is a strong need to include sign language translations into web sites.”
References:
DEBEVC, Matjaž; KOSEC, Primož; HOLZINGER, Andreas. Improving multimodal web accessibility for deaf people: sign language interpreter module (2010). https://online.tugraz.at/tug_online/voe_main2.getVollText?pDocumentNr=137845&pCurrPk=47669. Accessed 6 Oct. 2017.
World Health Organization. Deafness and hearing loss (2017). http://www.who.int/mediacentre/factsheets/fs300/en/. Accessed 6 Oct. 2017.