Closed: PouriaAmini closed this pull request 4 months ago
1/ Especially on your first few commits to a new language / project, please try to keep PRs small - this is too much code to effectively review in one go.

2/ I am concerned with the overall capture approach - while it is nice that it works for both UIKit and SwiftUI, most apps do not put a lot of thought into a11y, and we are ignoring much of the context we could get from a closer integration with UIKit. Things like respecting gesture recognizer failure hierarchies are important to understanding which events actually fired.
Thanks @crleona for the review!

1/ I'll convert this into a draft and try to break this code into multiple PRs.

2/ The primary reason I leaned towards extracting analytics from accessibility metadata was that many of the built-in components in SwiftUI already have descriptive accessibility data - for example, any button that has a label will have an accessibility label by default. The primary reason for opting for global gesture recognition rather than swizzling concrete subclasses of UIGestureRecognizer was that many SwiftUI components don't use UIGestureRecognizer.
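For reference, here is a rough sketch of what I mean by window-level ("global") gesture recognition - this is simplified and illustrative only, not the exact code in this PR. It assumes a single tap recognizer attached to the window, with `cancelsTouchesInView` disabled so the app's own handling is unaffected:

```swift
import UIKit

// Illustrative only: one recognizer on the window observes taps for every view,
// including SwiftUI-backed views that never install a UIGestureRecognizer themselves.
final class InteractionObserver: NSObject, UIGestureRecognizerDelegate {
    func attach(to window: UIWindow) {
        let tap = UITapGestureRecognizer(target: self, action: #selector(handleTap(_:)))
        tap.cancelsTouchesInView = false   // never swallow touches meant for the app
        tap.delegate = self
        window.addGestureRecognizer(tap)
    }

    @objc private func handleTap(_ recognizer: UITapGestureRecognizer) {
        guard let window = recognizer.view as? UIWindow else { return }
        let location = recognizer.location(in: window)
        // Resolve the view under the touch and read its accessibility metadata.
        guard let target = window.hitTest(location, with: nil) else { return }
        print("tap on \(type(of: target)), label: \(target.accessibilityLabel ?? "none")")
    }

    // Recognize alongside the app's own recognizers so none of them are blocked.
    func gestureRecognizer(_ gestureRecognizer: UIGestureRecognizer,
                           shouldRecognizeSimultaneouslyWith otherGestureRecognizer: UIGestureRecognizer) -> Bool {
        true
    }
}
```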
Can we merge this PR into a feature branch like we discussed?
@Mercy811 - I'm thinking that since we are going to break this PR into chunks, that alone should minimize its impact on the SDK. So I'm not sure that a feature branch would make introducing the new feature any safer - unless we want to test the implementation with a set of customers before merging the feature in. I'm open to further discussion.
> 2/ The primary reason I leaned towards extracting analytics from accessibility metadata was that many of the built-in components in SwiftUI already have descriptive accessibility data - for example, any button that has a label will have an accessibility label by default. The primary reason for opting for global gesture recognition rather than swizzling concrete subclasses of UIGestureRecognizer was that many SwiftUI components don't use UIGestureRecognizer.
It does sound like this may be as good as we can get in SwiftUI, so we should keep this approach as a fallback - perhaps enabled by a different config if we ever run into trouble with loading a11y libs. I'd like to see where we can get with a UIKit-only approach as well - there should be a lot more context we can gather, and we can get a closer mapping of tracked events to events that the app is actually taking action on.
Summary
This PR adds support for the auto-capture of user interactions with UI elements. The design is heavily based on the accessibility metadata available within the app for each UI element, which is mostly provided by the UI framework. Auto-capture of user interactions works for apps built with both UIKit and SwiftUI. To enable user interaction auto-capture, set the `userInteractions` parameter of the `DefaultTrackingOptions` initializer to `true`.
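A minimal setup sketch is below. The `userInteractions` flag is the parameter introduced by this PR; the surrounding `Configuration` and `DefaultTrackingOptions` calls follow the SDK's existing initializers, and the API key is a placeholder:

```swift
import AmplitudeSwift

// Sketch: enable user interaction auto-capture via the new flag from this PR.
let amplitude = Amplitude(
    configuration: Configuration(
        apiKey: "YOUR-API-KEY",
        defaultTracking: DefaultTrackingOptions(userInteractions: true)
    )
)
```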
New Automatically Captured Event
- `[Amplitude] User Interaction`: Emitted when a user interacts with a UI element.

Note: To capture the event, the UI element must include some accessibility metadata (`accessibilityLabel`, `accessibilityValue`, and/or `accessibilityTraits`). Otherwise, the interaction event will be generic and will not contain information about the UI element with which the user interacted.
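For illustration, here is one way an element can be given that metadata in both frameworks. The view names and the label/value strings are made up, and the comments reflect the property descriptions in the next section:

```swift
import SwiftUI
import UIKit

// SwiftUI: built-in controls such as Button already derive an accessibility label
// from their title, but the metadata can also be set explicitly.
struct CheckoutButton: View {
    var body: some View {
        Button("Checkout") { /* ... */ }
            .accessibilityLabel("Checkout")     // captured as the element's label
            .accessibilityValue("3 items")      // captured as the element's value
            .accessibilityAddTraits(.isButton)  // captured as the element's type
    }
}

// UIKit: the same metadata through the UIAccessibility properties.
func makeCheckoutButton() -> UIButton {
    let button = UIButton(type: .system)
    button.setTitle("Checkout", for: .normal)
    button.accessibilityLabel = "Checkout"
    button.accessibilityValue = "3 items"
    button.accessibilityTraits = .button
    return button
}
```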
New Automatically Captured Properties

- `[Amplitude] Element Label`: The accessibility label of the UI element the user interacted with. If empty, the property will be excluded.
- `[Amplitude] Element Value`: The accessibility value of the UI element the user interacted with. If empty, the property will be excluded.
- `[Amplitude] Element Type`: The accessibility traits of the UI element the user interacted with. If empty, the property will be excluded.
- `[Amplitude] Interaction`: The type of the user interaction on the UI element. These interactions include:

PR Note: Tests targeting these implementations and Objective-C support will be included in a separate PR to reduce the size of this PR.
Checklist