w3c / webextensions

Charter and administrivia for the WebExtensions Community Group (WECG)

Request for Comments: Extension Performance #707

Open jspivack opened 1 day ago

jspivack commented 1 day ago

Overview

Following up on the recent messages in the WECG Matrix room with @oliverdunk -- it feels like there is a need for greater observability and performance tracking within browser extensions, both holistically and through specific lenses into different parts of extension architecture (background scripts, content scripts, etc.). For a myriad of reasons, extensions share similarities with both traditional web applications and native/mobile distributions, but fall into a gap where neither platform's flavor of instrumentation is particularly well suited to profiling them.

Goals

Primary: Developer experience

  1. Adapt and build upon existing Developer Tools to provide insight into loading, process execution, and dependency flow in browser extensions (see below)
  2. Opportunities to expand the extension Performance API directly
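
On the second point: extensions can already use the standard User Timing API inside their own contexts today; the gap is that those entries stay confined to the context that created them and aren't surfaced anywhere extension-aware. A minimal sketch (the mark names and the work being timed are invented for the example):

```js
// background.js (MV3 service worker): time a unit of work with the User Timing API
async function loadRules() {
  performance.mark('rules-load-start');
  const { rules } = await chrome.storage.local.get('rules'); // hypothetical stored data
  performance.mark('rules-load-end');
  performance.measure('rules-load', 'rules-load-start', 'rules-load-end');
  return rules;
}

// The resulting entries live only in this worker's timeline; nothing ties them
// to the tab, content script, or navigation they relate to.
```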

Secondary: Standards cross-team/group collaboration

  1. Integration/utilization of OpenTelemetry for extension-based tracing and metrics (a rough sketch follows after this list)
  2. Re-defining and expanding web performance metrics for the extension world, e.g. what is and isn't relevant from:

https://developer.mozilla.org/en-US/docs/Web/API/PerformanceNavigationTiming
https://developer.mozilla.org/en-US/docs/Web/API/PerformancePaintTiming
https://developer.mozilla.org/en-US/docs/Web/API/PerformanceResourceTiming
https://developer.mozilla.org/en-US/docs/Web/API/PerformanceScriptTiming
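
On the OpenTelemetry item, one shape this could take today is bundling the OTel API with the extension and manually creating spans around extension work. The sketch below is illustrative only: the tracer name, span names, and handleMessage are invented, and the SDK/exporter wiring (e.g. @opentelemetry/sdk-trace-web plus an OTLP exporter) is assumed to be configured elsewhere in the bundle:

```js
// background.js -- assumes the background script is bundled/declared as a module
import { trace } from '@opentelemetry/api';

const tracer = trace.getTracer('my-extension'); // illustrative instrumentation scope

chrome.runtime.onMessage.addListener((msg, sender, sendResponse) => {
  tracer.startActiveSpan(`handle:${msg.type}`, async (span) => {
    span.setAttribute('extension.sender_frame', sender.frameId ?? -1);
    try {
      sendResponse(await handleMessage(msg)); // handleMessage is hypothetical
    } finally {
      span.end();
    }
  });
  return true; // keep the message channel open for the async response
});
```

A standards-level integration would presumably also define semantic conventions for things like the extension ID, context type, and messaging edges, rather than leaving attribute names to each developer.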

Potential developer tools improvements

  1. Unified background + content script(s) timeline in Performance devtools tab
  2. Possibly include cross-context messaging (chrome.runtime.sendMessage) as part of the profiling stack trace, or break it out into a separate section of the flame graph (a manual approximation is sketched after this list)
  3. Mark extension-specific milestones (see below) on the performance timeline alongside / in lieu of Largest Contentful Paint, et al.
  4. Greater visibility into, and control over, browser extension storage
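
As a manual approximation of the unified messaging timeline (point 2), a developer can bracket both ends of a message today with User Timing entries; the message type and doWork below are invented for the sketch, and the two contexts' timelines still have to be correlated by hand:

```js
// background.js: measure how long the handler side of a message takes
chrome.runtime.onMessage.addListener((msg, sender, sendResponse) => {
  performance.mark(`bg-start:${msg.type}`);
  doWork(msg).then((result) => {           // doWork is hypothetical
    performance.mark(`bg-end:${msg.type}`);
    performance.measure(`bg:${msg.type}`, `bg-start:${msg.type}`, `bg-end:${msg.type}`);
    sendResponse(result);
  });
  return true; // keep the channel open for the async response
});

// content-script.js: measure the full round trip as seen from the page side
async function timedLookup(payload) {
  const start = performance.now();
  const reply = await chrome.runtime.sendMessage({ type: 'lookup', payload });
  performance.measure('cs-roundtrip:lookup', { start, end: performance.now() });
  return reply;
}
```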

Examples of relevant extension platform events to identify

These categories of milestones are critical for assessing performance and would help paint a clearer picture of how extensions are bootstrapped and executed.

  1. Navigation started / tab updated
  2. DOM Ready (initiated from background or content script)
  3. Content script dynamically registered
  4. Script injected/executed in isolated or main world
  5. Network request initiated. Could include the ability to distinguish between blocking and non-blocking (non-awaited) requests
  6. DOM insertion/mutation
  7. Other "target". The concept of a developer-definable target in extension performance is interesting to me because of the ambiguity involved in extension development and the number of very different possible outcomes. When loading a web app/site, the goal (target) is virtually always to load enough resources to start displaying UI to users as quickly as possible, and then continue loading other data and portions of the screen. With extensions, it may be determined on a given page or session that activity should only occur in the background, or that no action needs to be taken at all -- in other words, nothing should be displayed to the user and the underlying contexts should remain "invisible". In those cases, the target might be the determination itself that no action should be taken
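
For illustration, here is roughly what marking these milestones by hand looks like today with the User Timing API; the mark names and the decision logic are invented for the example, and a platform-level version would record equivalents automatically and surface them in the Performance panel:

```js
// content-script.js (run_at: document_start): hand-rolled extension milestones
performance.mark('ext:content-script-injected');

document.addEventListener('DOMContentLoaded', () => {
  performance.mark('ext:dom-ready');

  // "Target" in the sense of point 7: the meaningful outcome may simply be the
  // decision that nothing needs to happen on this page.
  if (!pageIsRelevant()) {                 // hypothetical predicate
    performance.mark('ext:target:no-op-decision');
    performance.measure('ext:time-to-decision',
      'ext:content-script-injected', 'ext:target:no-op-decision');
    return;
  }
  injectUi();                              // hypothetical UI work
});
```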
birtles commented 1 day ago

For extensions whose content script is injected into a wide range of sites, privacy makes this really hard.

For example, Firefox has PerformanceWarnings and although I've been recording them for my extension and can see they do fire (very) occasionally, I can't record the URL where the warning occurred in order to investigate the issue since it could expose personal information (e.g. capability URLs). As a result, the reports are not particularly actionable.
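
For reference, listening for those warnings looks roughly like the sketch below (this assumes Firefox's runtime.onPerformanceWarning event; the exact fields on details and the recordWarning helper are assumptions here). The limitation described above is that nothing in the payload can be safely recorded to identify which page triggered the warning:

```js
// background.js (Firefox): record performance warnings without the page URL
browser.runtime.onPerformanceWarning.addListener((details) => {
  recordWarning({                     // recordWarning is hypothetical telemetry
    category: details.category,       // e.g. a slow content script
    severity: details.severity,
    description: details.description,
    // Deliberately not recording the page URL, since it could expose personal
    // information (e.g. capability URLs) -- which is what makes these reports
    // hard to act on.
  });
});
```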