w3c / xaur

XR Accessibility User Requirements
https://w3c.github.io/xaur/

4.10 Integration of sign language in immersive environments [XAUR] #20

Open brewerj opened 3 years ago

brewerj commented 3 years ago

@RealJoshue108 Realizing we may be missing some aspects of the integration of sign language in VR and AR. There are several possibilities: rendering of multi-party sign-language communication in VR space; integration of sign language interpreting; and eventual integration of machine-generated interpreting. This may need more development than is possible for this round of publication. Should we add an editor's note to develop the needs and requirements further and incorporate them in an updated version?

Addendum: on closer inspection, I found that 4.10, Text description transformation, matches discussions in recent RQTF meetings. However, I still think this only gets at part of the issue: if sign language can be generated from text, why not also from speech? Also, did we overlook the possibility of integrating streaming interpreting into VR and AR space? Lastly, should we consider re-titling 4.10 to indicate the output sought, e.g., "sign language," rather than what it is transformed from (text)?