promotedai / ios-metrics-sdk

iOS client library for Promoted.ai metrics tracking.
MIT License

Handle logging anomalies #175

Closed yunapotamus closed 2 years ago

yunapotamus commented 2 years ago

Background

Partner mobile teams sometimes break their Promoted metrics integration without realizing it. These teams also have widely varying engineering infrastructure in place. This means that:

  1. Automated testing may not be available.
  2. Not all features, especially “invisible” features such as metrics collection, are tested prior to release.

Philosophy

  1. Catch these problems as soon as possible. Ideally, that means during development, before any code is merged.
  2. Normally we’d catch them with automated tests, but our partners seldom have the testing infrastructure for this. We could write tests for them, but partners may not run those tests, or block merges on failing tests, during development.
  3. Make the error apparent, and make it hard not to notice that you’ve broken something.

Proposal

Build failsafes into the Mobile Metrics SDK that perform runtime checks when the app is running in dev mode. A sketch of these checks follows the list below.

Breaking scenarios to check for:

  1. Log user ID is missing. (This implies initialization failure.)
  2. Impression: No insertion or content ID in Delivery.
  3. Action: No insertion, impression, or content ID in Delivery.
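
A minimal sketch of what these checks might look like in Swift. The type and property names here (`LoggingAnomaly`, `ImpressionEvent`, `logUserID`, `insertionID`, `contentID`) are illustrative assumptions, not the SDK's actual API:

```swift
/// Anomalies the SDK could detect at logging time. Names are illustrative.
enum LoggingAnomaly {
  case missingLogUserID                       // implies initialization failure
  case missingJoinableIDs(eventType: String)  // no insertion/impression/content ID
}

/// Hypothetical impression payload, used only for this sketch.
struct ImpressionEvent {
  var logUserID: String?
  var insertionID: String?
  var contentID: String?
}

/// Returns the anomalies found in a single impression event.
func anomalies(in impression: ImpressionEvent) -> [LoggingAnomaly] {
  var found: [LoggingAnomaly] = []
  if impression.logUserID?.isEmpty ?? true {
    found.append(.missingLogUserID)
  }
  if (impression.insertionID?.isEmpty ?? true) &&
      (impression.contentID?.isEmpty ?? true) {
    found.append(.missingJoinableIDs(eventType: "impression"))
  }
  return found
}
```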

Make the logger class return errors/warnings as the result of the logging calls.
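
For example, a logging call could return the anomalies it detected instead of failing silently. This sketch reuses `LoggingAnomaly` from above; the class name, method shape, and `@discardableResult` annotation are assumptions, not the SDK's current signature:

```swift
final class MetricsLogger {
  /// Logs an impression and returns any anomalies found while doing so.
  /// Production callers can ignore the result; dev-mode callers inspect it.
  @discardableResult
  func logImpression(contentID: String?, insertionID: String?) -> [LoggingAnomaly] {
    var found: [LoggingAnomaly] = []
    if (contentID?.isEmpty ?? true) && (insertionID?.isEmpty ?? true) {
      found.append(.missingJoinableIDs(eventType: "impression"))
    }
    // ...build and enqueue the impression message as usual...
    return found
  }
}
```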

Validate the results of these calls further down the line when in dev mode. Ignore these results in production.

If any errors or warnings occur, interrupt execution of the app with a detailed explanation of what has happened.
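
A hedged sketch of that flow, assuming "dev mode" maps to a debug build and using `assertionFailure` to interrupt execution with an explanation:

```swift
/// Handles anomalies returned by logging calls. In a debug build this stops
/// execution with an explanation; in production it does nothing.
func handleLoggingAnomalies(_ found: [LoggingAnomaly]) {
  guard !found.isEmpty else { return }
  #if DEBUG
  let details = found.map { String(describing: $0) }.joined(separator: "\n")
  assertionFailure("""
    Promoted metrics logging anomaly detected:
    \(details)
    This usually means the integration is broken (missing log user ID, or \
    missing insertion/content IDs). Fix this before release.
    """)
  #else
  // Production builds ignore these results and never disrupt the app.
  #endif
}
```

In practice the "dev mode" check might be driven by a ClientConfig flag rather than a compile-time condition; the compile-time guard above is just one way to express it.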

Logging anomaly handling

Behavior on logging error/warning (exposed in ClientConfig as an enum property):

  1. Nothing
  2. Show UI
  3. Break in debugger
  4. Log entire content object (optional)
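
A sketch of how that configuration might look. The case names and the separate flag for logging the content object are assumptions based on the list above, not ClientConfig's current shape:

```swift
/// Possible responses to a logging anomaly.
enum LoggingAnomalyHandling {
  case none             // do nothing
  case showUI           // present an explanatory alert in the app
  case breakInDebugger  // trap so an attached debugger stops at the anomaly
}

/// Illustrative additions to ClientConfig.
struct ClientConfig {
  var loggingAnomalyHandling: LoggingAnomalyHandling = .none
  /// Optionally include the entire content object in the anomaly output.
  var logsEntireContentObjectOnAnomaly = false
}
```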

Have a clear and explicit escalation path (e.g. email alias help+blah@promoted.ai, a Slack channel, etc.).