In Designing ancillary APIs that provide new information, there's a principle that user agents should enable users to turn off such APIs. While that may make users feel safer, it would actually hurt their privacy by adding a very meaningful bit to their fingerprinting surface.
It would be better to let privacy-conscious users strengthen the identification mitigations of these APIs (e.g. extra fuzzing of timestamps, reduced accuracy of exposed dimensions, etc.) instead of revealing this very specific user preference.
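To illustrate the kind of mitigation meant here, a minimal sketch of timestamp fuzzing: coarsen a high-resolution timestamp and add bounded jitter, so the exposed value carries less identifying entropy. The function name and the `resolutionMs`/`jitterMs` parameters are purely illustrative, not from any spec; real user agents use more careful clamping and per-origin noise.

```javascript
// Hypothetical sketch of timestamp fuzzing (not from any spec).
// A privacy-conscious user could raise resolutionMs/jitterMs rather
// than disabling the API outright, which would itself be a signal.
function fuzzTimestamp(rawMs, resolutionMs = 100, jitterMs = 20) {
  // Round down to the chosen resolution, discarding fine-grained bits.
  const coarse = Math.floor(rawMs / resolutionMs) * resolutionMs;
  // Add uniform jitter in [0, jitterMs) so repeated measurements
  // don't reveal the exact rounding boundary.
  return coarse + Math.random() * jitterMs;
}
```

Dialing the parameters up far enough approaches the "so much noise the data becomes useless" end of the spectrum, without exposing a distinct on/off preference.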
385 ignores this problem. As you say here, it'd be plausible to instead say that ancillary APIs should be designed to add noise to any new information they expose, and then, rather than disabling the APIs, users should be able to add so much noise that the data becomes useless. On the other hand, variable amounts of noise might make the data too hard to use at all. I think the idea of noising performance APIs is new enough that I'd like WebPerf to gather some more experience with it, and then propose something for this document.