App is currently partially accessible due to (generally) proper use of UIKit and initial UI automation (by accessibility identifier) for screenshot generation – but I can imagine that a prompt like "So we know it's you", while great visually, wouldn't make much sense as the only context clue when heard in isolation via VoiceOver.
See: https://developer.apple.com/library/content/documentation/UserExperience/Conceptual/iPhoneAccessibility/Making_Application_Accessible/Making_Application_Accessible.html
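One low-effort fix, sketched below with hypothetical view names: keep the visual copy as-is, but give the element an `accessibilityLabel` that carries enough context on its own when spoken by VoiceOver.

```swift
import UIKit

// Hypothetical prompt label from the identity-confirmation screen.
let promptLabel = UILabel()
promptLabel.text = "So we know it's you"

// The spoken label may differ from the visible text; VoiceOver reads
// accessibilityLabel instead of `text` when it is set, so the phrase
// can stand alone without the surrounding visual context.
promptLabel.accessibilityLabel =
    "So we know it's you. Confirm your identity to continue."
```

Since these labels were presumably already exercised by the screenshot automation via `accessibilityIdentifier`, the identifier (for tests) and the label (for VoiceOver) can coexist on the same element without conflict.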