WICG / interventions

A place for browsers and web developers to collaborate on user agent interventions.

Touch listeners defaulting to passive #18

Closed RByers closed 2 years ago

RByers commented 8 years ago

Now that we have an API for passive event listeners, Chromium wants to experiment with sometimes defaulting listeners into the passive state (in order to get results like this). If we can succeed in making some breaking changes here, then we'll work with the Touch Events Community Group (TECG) to explore changes to the TouchEvents spec.

Note that most browsers already have a form of intervention here in terms of a timeout. But this timeout does not follow most of the intervention guidelines and is in some ways the worst of both worlds. The hope here is that we can eliminate the need for such timeouts by replacing it with something that is both more rational/predictable for developers and provides a better user experience.

We only have a skeleton of a concrete proposal so far, but we are collecting metrics from the wild to evaluate the tradeoffs of some possible proposals. Chrome 52 includes a flag that allows users/developers to opt in to a couple of different modes of passive-by-default touch listeners.

EDIT 2/12/17: See #35 and this post for details of a specific intervention now shipping in Chrome 56. Updated to reflect shift in thinking from "forced" to just "default" (but overridable).

RByers commented 8 years ago

/cc @tdresser @dtapuska who are working on the engineering in chromium for this.

RByers commented 8 years ago

Note that we're thinking that this will start by applying only to cases where passive isn't specified like: addEventListener("touchstart", handler). That would then behave differently from addEventListener("touchstart", handler, {passive:false}). But this would be more of a migration strategy / hint than behavior we'd expect to keep long term (i.e. if CNN made their touch listener passive:false for some reason without fixing the jank, we'd still want to intervene on the user's behalf). So I don't think that distinction would ever really belong in the DOM spec. /cc @annevk @smaug---- @jacobrossi, thoughts?
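
For concreteness, a minimal sketch of the distinction being described (handler is a placeholder):

document.addEventListener("touchstart", handler);
// No options given: under the proposal the browser may default this
// listener to passive, so preventDefault() inside it would be ignored.

document.addEventListener("touchstart", handler, { passive: false });
// Explicit opt-out: preventDefault() keeps working, at the cost of
// blocking scroll until the handler runs.

document.addEventListener("touchstart", handler, { passive: true });
// Explicit opt-in: scrolling never waits on this handler.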

smaug---- commented 8 years ago

If we start doing something like this, then it isn't clear to me at all anymore why we'd need 'passive'. Especially given that there is touch-action and such.

RByers commented 8 years ago

Interventions are primarily about improving the user experience at the expense of some developer rationality. A core part of the guidelines is that if developers follow best practices, they will never be impacted by an intervention. This lets us generate warnings / metrics around intervention triggering and drive them down as a thing to avoid. We can't do any of that without an explicit API where developers can opt in to passive behavior in a rational manner.

RByers commented 8 years ago

Quick update:

smaug---- commented 8 years ago

Have you considered other options here, since this kind of change would make the platform less coherent internally and would add yet more special cases to the already complicated platform? Like, a browser could have a "performance hint console" to tell web developers that whatever they are doing is probably slowing down the UX, that what they could do instead is [...], and that if they want to keep the existing behavior but not see the warning again, they can do [...].

I'm rather worried that if we end up adding more and more "interventions", we end up polluting the platform with tons of small hacks and inconsistencies, and such things tend to beat us later, when designing new APIs or spec'ing how the platform works.

RByers commented 8 years ago

Yeah, I'm worried about this too. We've long had devtools features highlighting scroll performance problems, and in general we've found they're helpful for developers motivated around perf. But when our goal is to improve the 95th-percentile latency for users, they're nearly useless (developers looking at perf in devtools are generally well below the 95th percentile; it's the sites where nobody is even measuring that matter most to the 95th percentile).

Long term, ideally I think we'd aim to make touch events universally passive by default (basically the same way pointer events are, but with an opt-out if you really want it). That, I think, would be clean / rational from a web developer's perspective. WDYT? Of course we'd need some transition path over many years to avoid breaking the web too badly in order to get there.

rjgotten commented 8 years ago

it's the sites where nobody is even measuring that matter most to the 95th percentile.

And your solution is adding hacks to the browser that affect every website everywhere and which can and do (note the referenced PhotoSwipe issue) suddenly break previously perfectly valid code which is intercepting touch events at the document level for anything as basic as page-wide drag&drop.

Here's a suggestion: put these hijinks behind a switch that devs can turn off with a <meta content="back-the-hell-off"/> tag, so there is an escape hatch until third-party libraries can catch up.

smaug---- commented 8 years ago

The idea here is that web sites could explicitly use non-passive listeners by passing the right kind of dictionary to addEventListener. But I agree, this kind of change is a bit web-dev hostile, which is why I was wondering if other mechanisms have been investigated to improve the UX of pages. Sounds like no.

dtapuska commented 8 years ago

The PhotoSwipe code is set up to use Pointer Events, but it uses a proprietary detection mechanism, so when Chrome ships pointer events (M55) and Firefox does later this year, it won't take advantage of them.

Meta tags are difficult for libraries to override or set. So the escape hatch here is to provide a fully defined value. But really, in this case it should be switched to use pointer events, as that would be more efficient.

RByers commented 8 years ago

And your solution is adding hacks to the browser that affect every website everywhere and which can and do (note the referenced PhotoSwipe issue) suddenly break previously perfectly valid code which is intercepting touch events at the document level for anything as basic as page-wide drag&drop.

Yep, that's the nature of interventions: make the experience substantially better for a LARGE number of users at the cost of some small compat / developer pain cost. There's definitely a legitimate concern here about striking a good tradeoff, but in general the user experience has gotten so bad on the mobile web (putting the entire platform at such risk of economic collapse) that I don't think anyone involved really believes the right tradeoff is to land entirely on the side of developers over users. Our (Google web platform team's) thinking on that is mostly summarized here and yes it definitely includes that developers should usually have a way to opt-out (as they do in this case).

But note that if you're already following best practices (eg. making your site work correctly with touch on Microsoft Edge) then you'll already have a touch-action: none rule in such cases and your site will continue to work fine. Even if you don't, it's likely your site will still work OK (eg. if it's really a full-screen drag and drop then the page won't be scrollable). Specific counter-examples appreciated; making the ideal tradeoff is challenging.

RByers commented 8 years ago

And just to make sure we're all clear on the benefit - we're talking about giving all users the experience on MANY major websites seen on the right of this video: https://www.youtube.com/watch?v=NPM6172J22g. Given the huge positive response that video has gotten from users, we're willing to accept a little bit of hacks / compat pain here.

rjgotten commented 8 years ago

Meta tags are difficult for libraries to override or set.

Hence the libraries will have to fix their problems. A meta tag switch would be an escape hatch for developers depending on third-party libraries that have not been updated yet.

developers should usually have a way to opt-out (as they do in this case).

They're effectively stuck until all the libraries their project relies on update to handle what is essentially a breaking change in their years of touch-event handling.

Complex touch event scenarios are a [[censored]] nightmare hellscape. And now you're saddling devs with either switching to a different library (with its own potential weaknesses) or forcing them to dive into the guts of those libraries themselves to sort it out. If you call that a viable opt-out, you're mad.


Besides: do you know what the easiest way is to patch those issues? You override whatever abstraction for addEventListener the library uses to detect passive event listener support and include a passive:false that is hardcoded for all events and which gives Chrome the finger. Road of least resistance. Road of least cost. And a road which ends in a status quo that is at least known to work correctly, even if it performs worse.

Guess what solution companies that are already not interested in performance are going to use?

tdresser commented 8 years ago

In this case, developers don't need to wait for libraries to update; they can apply touch-action to the parts of their page that should prevent scrolling.
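
For illustration, a minimal sketch of that approach (the selector and drag logic are hypothetical): declaring touch-action up front means the compositor never has to wait on JavaScript for gestures that start in this region.

// CSS equivalent: .drag-handle { touch-action: none; }
var handle = document.querySelector(".drag-handle"); // hypothetical element
handle.style.touchAction = "none"; // browser won't pan/zoom gestures starting here
handle.addEventListener("touchmove", function (e) {
  // No preventDefault() needed: native scrolling is already suppressed
  // for touches that begin on this element.
  moveDraggedItem(e.touches[0]); // hypothetical drag logic
});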

rjgotten commented 7 years ago

In this case, developers don't need to wait for libraries to update; they can apply touch-action to the parts of their page that should prevent scrolling.

In many complex interaction cases touch-action needs to be applied and unapplied dynamically, determined by the user's interaction state with various parts of a UI. That interaction state may well be under the control of a third-party library that is used to render part of the UI; or that implements an abstraction on top of touch events for complex user gestures that a UI requires.

So please explain again why you believe developers wouldn't need to wait for libraries to update, because there certainly are plenty of cases where they will...

maxkfranz commented 7 years ago

And just to make sure we're all clear on the benefit - we're talking about giving all users the experience on MANY major websites seen on the right of this video: https://www.youtube.com/watch?v=NPM6172J22g. Given the huge positive response that video has gotten from users, we're willing to accept a little bit of hacks / compat pain here.

In my opinion, it should be the responsibility of those major websites to update their code to opt in to new browser features. If these websites are major, then they can certainly afford the dev resources to do so.

It's my understanding that a large part of the spirit of the web is that backwards compatibility is baked-in. Have a really old page from 1996? It should still work. Have a new webapp in 2017? That should work too.

This change can break pages and webapps that rely on touch events for the benefit of making sites with video, like CNN, faster. I don't think it's a fair trade-off to put speed over correctness --- regardless of how nice the speed is.

Please reconsider making this an opt-in feature rather than an opt-out one. The feature itself is a good idea, but I strongly disagree with changing well-established, default behaviour.

RByers commented 7 years ago

@maxkfranz that's an argument against all interventions, not just this one. There's legitimate debate to be had here, but let's keep it to #43 rather than spread across each individual intervention issue.

ExtAnimal commented 7 years ago

Why?

Why change the API like this?

It would be OK if availability of the object form of the third argument were sniffable, but it's not. We can't know when to pass {passive: false, capture: useCapture} or just useCapture.

So we can't prevent touchstart events from blurring by calling preventDefault.

(╯°□°)╯︵ ┻━┻

tdresser commented 7 years ago

You can feature detect, to determine whether you can pass an object as the third argument (though it's a bit clumsy).

See the EventListenerOptions explainer.
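
The pattern looks roughly like this (a sketch based on the explainer; handler and the chosen options are placeholders):

// Feature-detect the options dictionary: browsers that understand it
// will read the "passive" property, firing this getter; older browsers
// coerce the object to a boolean and never touch the property.
var supportsPassive = false;
try {
  window.addEventListener("test", null, Object.defineProperty({}, "passive", {
    get: function () { supportsPassive = true; }
  }));
} catch (e) {}

// Pass an options object where supported, the boolean useCapture otherwise.
document.addEventListener("touchstart", handler,
  supportsPassive ? { passive: false, capture: false } : false);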

RByers commented 7 years ago

Though hopefully it's exceedingly rare to need to opt-out of passive touch listeners. touch-action is the simplest work-around in 99% of use cases.

Also the exact API design came from months of contentious standards debates - see here and the other links here for most of the history if you really want to know "why" the EventListenerOptions API has the design it does.

maxkfranz commented 7 years ago

@tdresser Yes, you can feature detect. But feature detection is useful only if you can change all affected code in your app.

If you're the only one using listeners, then you're OK. If you're using a lib that uses the addEventListener() function and it doesn't have a workaround specifically for this deviation from previous, standard behaviour, then you're really stuck.

@RByers To quote a previous comment about this: The main concern regarding Cytoscape is that the lib is used in many places, in both academia and commercial organisations, not all of which can easily update their code. Having support for passive events in Chrome is great but unfortunately, changing the default behaviour breaks our lib and many apps that depend on it.

I notice the following issue by @RByers in particular: https://github.com/WICG/EventListenerOptions/issues/38. That seems much more sensible to me. Disabling preventDefault() (i.e. passive: true) for touch events should be opt-in, and a separate lib can facilitate making passive: true default just as you describe --- without breaking existing apps.

rjgotten commented 7 years ago

If you're using a lib that uses the addEventListener() function and it doesn't have a workaround specifically for this deviation from previous, standard behaviour, then you're really stuck.

I mentioned the same thing before. But for some reason some Google engineers seem to be deaf to this genuine software compatibility problem and live in a la-la land where developers have the luxury of being able and allowed to fork and patch any such library they're stuck using.

I don't terribly mind that passive would be the default behavior, but at the very least we need an easy one-shot method of turning this intervention off when compat problems arise.

maxkfranz commented 7 years ago

I don't terribly mind that passive would be the default behavior, but at the very least we need an easy one-shot method of turning this intervention off when compat problems arise.

I think you have good intentions here: New apps you are making or apps you have direct control over could use a method like you propose.

My opinion is that's not enough, because apps or sites that won't be updated would still be broken. I don't think that's a fair trade-off.

rjgotten commented 7 years ago

My opinion is that's not enough, because apps or sites that won't be updated would still be broken. I don't think that's a fair trade-off.

Then perhaps Google should themselves maintain a white-list of websites on which this intervention could be enabled and disable it for all others.

tdresser commented 7 years ago

Thanks for the feedback. Most of these comments apply to interventions in general. As rbyers@ commented above, there's legitimate debate to be had here, but let's keep it to #43 rather than spread general feedback about the interventions across each individual intervention issue.

RByers commented 7 years ago

Since it's specific to the history of this issue, I'll re-iterate what I said on the Chrome bug here:

I'm deeply sorry for the frustration this has caused you. We've long tried the "opt-in" approach but have learned that on average developers don't make the effort to opt-in to better performance. In particular, in this case we started pushing passive touch listeners heavily back in June (https://developers.google.com/web/updates/2016/06/passive-event-listeners), including during the Google I/O Chrome keynote, and did outreach to a large number of frameworks and other major sites that we knew could benefit. They almost all told us "can't you just detect this and do it for me automatically so I don't have to change my code?". As you can see in the graph here, we've had very little impact on real-world scroll performance via the "opt-in + outreach" approach.

So we believe that when only a tiny number of sites are negatively impacted, and a huge number are positively impacted, we should prefer a "fast by default" policy instead.

We've done our best to do this in a responsible way - in discussion with all the browser vendors and standards groups, with an easy fix (touch-action), a full opt-out (though I admit it's not exactly easy), console warnings, developer outreach, and a careful roll-out via trials and the dev and beta channels, where we heard very few complaints. We need to work harder at this - eg. see #44. But in Chrome we're fundamentally unwilling to allow the mobile web to continue to die from performance bankruptcy. Other browsers are less aggressive, and people who prefer to be more conservative (preferring maximal compatibility over being part of moving the web forward aggressively) should prefer to use a more conservative browser.

Rycochet commented 7 years ago

Unfortunately very few people outside the browser development community have any interest in watching keynotes etc - and something like this is very abstract until it comes into effect.

Among other things, this completely breaks the ability to use Chrome to test mobile sites, as it is non-standard behaviour that destroys any possibility of testing web apps without using something like BrowserStack etc.

The way this appears to a web developer:

  1. Bad websites take too long within touchstart events (without calling .preventDefault()) because the options parameter isn't supported in IE (assuming they even know about it), and they don't want to use timer/rAF callbacks. The end result is that initial touch scroll is delayed by a noticeable amount of time.
  2. Good websites / webapps want to prevent mobile scrolling so call event.preventDefault().

Possible solutions:

  1. Log warnings on websites that take too long (and possibly don't call preventDefault).
  2. Pop up an alert() style box for websites that do this (actually giving some incentive to the popular sites shown in that video etc).
  3. Change things in an incompatible way (please explain how to use useCapture in a compatible manner without access to options), breaking webapps and websites that want to work on IE / Edge.

With this solution there's no incentive for sites to fix things. There's no pressure on MS to support options (even though I'd love to be able to use it; having to support older browsers means it's ~3 years or more away even if they added support today). Older websites and apps are going to break and users will have no clue why - the log spam will help developers - but that assumes there's a developer around to fix things.

rjgotten commented 7 years ago

But in Chrome we're fundamentally unwilling to allow the mobile web to continue to die from performance bankruptcy.

Yet you are not opposed to letting the weaker underbelly of sites on the web die out by willfully breaking those sites without anyone being available (or able) to fix them? Sites that might not even have performance issues...

Rycochet commented 7 years ago

@rjgotten I think the sites they're going to break are the ones that call preventDefault() in there - and almost by definition they won't have any performance issues to begin with...

rjgotten commented 7 years ago

@Rycochet Good point. Makes this decision all the more disturbing.

RByers commented 7 years ago

We know from metrics we've collected from the wild that the vast majority of touch listeners don't call preventDefault, and of those that do, the majority do so intermittently in ways we can't predict. Eg. it's common for a touch listener on the document to use jQuery event delegation such that preventDefault gets called only when the touchstart occurred over some particular element. In most of the cases where preventDefault would be called, the page isn't scrollable anyway, so there's little visible breakage from scrolling not being disabled.
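
A sketch of the delegation pattern being described (assuming jQuery; the .zoomable selector and gesture handler are hypothetical). At bind time the browser cannot tell which touches, if any, will end up cancelled:

// One document-level listener; preventDefault() fires only for touches
// that happen to start over a matching descendant.
$(document).on("touchstart", ".zoomable", function (e) {
  e.preventDefault();     // conditional: depends on where the touch landed
  startCustomGesture(e);  // hypothetical gesture handler
});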

Can you give us the URLs of some sites that are seriously broken by this? I assume it's the specific change in #35 that we're talking about here. Over 15% of Chrome Android users now have that change and we've received almost no complaints from users about sites being broken.

It's definitely possible that we've made the wrong choice (the tradeoff between advancing the web and maintaining full compatibility is a constant challenge for all browsers - we all expect to make some mistakes in both directions). But after years of debate on this sort of thing, we're far enough along that abstract arguments aren't particularly helpful. At this stage, the main things that could convince me that we've made the wrong decision for the web at large (rather than just for a tiny number of developers/sites who are personally impacted) are some or all of:

  1. If a large number of users filed/starred bugs providing URLs of sites that are broken. Eg. here's an example where we learned from users that (despite very promising metrics) our decision was wrong.
  2. We found some badly broken library/pattern that we can see (eg. via HTTP Archive) is highly prevalent on the web. Eg. see this issue where I've been holding back from fixing an annoying interop bug for years because 0.3% of top websites use a bad library that causes scrolling to break when the bug is fixed.
  3. An outpouring from many web developers arguing we're doing the wrong thing. I'm actually shocked that I can't find a single complaint on Twitter about this, in contrast to some other interventions which have still been judged to be worth the cost overall.
  4. Evidence from users that the benefit we're achieving isn't actually all that important to them. So far we've gotten a ton of feedback that users care about this.
  5. Examples of use cases where it's really impossible or unreasonably burdensome to fix sites to account for the new behavior. The "some sites can just never be updated at all" argument doesn't carry much weight with me because the only way to fully accommodate such sites on the web is to stop changing browsers at all (including stopping fixing browser bugs). Here's a recent example where we relaxed an intervention after feedback only from a single developer that a legitimate use case had become impossible.

I'm sure we're going to see more of the above over the next couple of weeks (I'm shocked to have not seen more already). As you can see from the other examples, we're definitely paying attention to this sort of feedback and will course-correct if necessary. But we're determined to be thoughtful and disciplined about the global tradeoffs here, not make rash strategically impactful decisions based on the vehemence of a few individuals.

Rycochet commented 7 years ago

The change has been made globally on desktop and mobile, only affecting touch events, so the metrics are only useful for people who leave them on (most developers I know turn them off specifically). As desktop machines become more touch-capable, the machines with the processing power to not have any performance issues still get this change (unfortunately, I fully understand that any change has to be universal). This also means that DevTools device emulation has this change - which is how I found out about it, while testing a site using FabricJS (which looks like it would need a cross-browser fix; the CSS touch-action fix is not possible as it's a library, so it would need to use browser detection for consistent behaviour).

Unfortunately I can't personally provide URLs to any sites that I've developed hitting on this as they're education sites for Collins, Hodder and a couple of other smaller publishers (fortunately Pearson didn't need anything like this) - and all the content is under NDA and/or behind school portals that need access through their own sites directly. Currently that stands at ~10k individual sites (ie, not using shared JS/CSS files that can be easily fixed in one place).

I can say that the specific uses for this are for paged displays (not unlike book page turning with swipe etc), and for delegated dragging events - especially global ones via React, where you either attach to every element, or delegate and only attach to the top (might not be perfect, but 1x delegated handler can be far easier to manage than 100x direct handlers).

mawalker commented 7 years ago

As someone who only stumbled across this via a tweet and only occasionally does web development... I'd flip tables if this goes through and breaks an old page of mine (because, as someone has said, backwards compat is one of the key features of the web). However, my biggest issue with it is inconsistency between browsers. It seems like RByers is very gung-ho about this change, but I'd argue that showing the user a popup stating "We're sorry, this site's scrolling is slow due to them not following best practices, please report the issue to the site." would do enough to shame most major sites into fixing their code... while never breaking old sites.

Rycochet commented 7 years ago

@mawalker Something like this pseudo-code - if (timeTaken > 100ms) { (defaultPrevented ? console.log : alert)(" ... ") }

mawalker commented 7 years ago

Basically, yes... However... honestly, I don't know if that would be 'good' to do or not. I know it is a bit of a heavy-handed approach that some might think goes too far in (annoyingly) alerting the user. But it would at least not break any sites, and it would push sites (devs) to follow best practices (because they wouldn't want their site to get that annoying alert and have visitors complain to them about it).

I started to write this thinking that it might not be the best approach, but after further thinking... since there is no need for the intervention (a best-practices workaround already exists), I think it would be the better part of valor to not move ahead with the intervention and instead use discretion and fall back to an alert for slow-scrolling sites (and if you don't want to alert users, then put the warning message explaining the site's UX lag into console.log. This wouldn't notify most users, but it would let devs know the root cause; I'd put a link to a page explaining the issue in the console log to further help newbie devs such as myself).

mikesherov commented 7 years ago

@rbyers, ignoring whether or not it may be performant or difficult to do so, would it be possible to do some type of static analysis (or limited dynamic analysis) when a non-passive touch listener is bound, to check whether the event object is preventDefault'ed or passed to another function, and if so, assume preventDefault will happen?

mixonic commented 7 years ago

Improving the performance of the web is a worthy goal. However, breaking APIs and going against spec to get there seems misguided, IMO. Chrome "interventions" (breaking changes to the most widely used implementation of the web platform) are an immensely strong tool and should be used sparingly.

In this case and with the knowledge I've got, forcing passive on scroll-related events by default seems like a misstep. I'm going to keep this comment focused 100% on the intervention at hand.

Passive event listeners were added to Chrome in June. It has been less than a year since they appeared, and the API still isn't supported by Edge or iOS Safari (caniuse). I can sympathize with the comment:

average developers don't make the effort to opt-in to better performance

source

But that alone isn't enough of a reason to break existing sites. Indeed, placing the blame for the lack of passive event adoption at the feet of lazy developers is not a productive way to approach the problem. A developer might not have adopted passive event listeners for any number of valid reasons.

There are a number of ways these issues would be addressed:

  1. More evangelizing.
  2. Warnings in the console (without breaking changes).
  3. Warnings in devtools (as Chrome shows w/ forced layout).
  4. Contributing to libraries, adapting them to the new APIs.

None of these things is as exciting as an intervention. All of them maintain the stability of the web platform, which is one of its most important assets.

Measuring the impact of an intervention definitely seems hard. In the Intent to Intervene email there are a few stats, like the fact that 2% of dispatched events are potentially affected. There is lots of discussion about tracking breakage, but no real numbers shared.

After a small amount of investigation I can share a few things I know are broken today.

It seems likely that a lot of sites involving drag/drop and the mobile web are negatively impacted. Not all, but many.

Why isn't touch-action: none sufficient to set on the sortable-items css to fix the ember sortable issue?

source

touch-action: none isn't sufficient because this not only impacts me as a developer, it impacts me as a user. I cannot fix Gmail, that is certain.

So we believe that when only a tiny number of sites are negatively impacted, and a huge number are positively impacted, we should prefer a "fast by default" policy instead.

source

"Fast by default" is a great policy, and one that should guide future API design and implemetation of web features. However, I don't think improving scroll jank should cost us breakage on drag and drop across the mobile web.

Finally, I want to question some of the "support" for this change that has been touted.

Thanks for all your work Chrome team. I do hope you reverse your decision here, and that you can find another way to balance vehemence for your performance goals with the greater goals of the platform and your project.

hexalys commented 7 years ago

I lack the time to grasp the full scope of this intervention in detail. In principle, because it breaks existing sites, I am against a global flat-out reversal of spec behavior like this. However, here are some quick thoughts on possible compromises or alternative approaches:

One relevant question is: In what primary context does this benefit users at large most?

With the CNN site example, my immediate assumption is that the vast improvement from forced passive touch listeners is mainly relevant to textual <main> content or <article>(s). Things in headers, footers, or any UI-related elements outside such main content seem unlikely to benefit from passive-by-default, and may rather be affected negatively.

If this observation has any merit, maybe there is a way in which the fast-by-default behavior could be enabled only inside such identifiable parent nodes, where it makes the most critical difference for users?

Rycochet commented 7 years ago

Has anyone done some profiling of the (bad site's) event code to try to figure out what's actually causing the slowdown problems and looked for consistencies there?

There are two no-change "fixes" I can see for the slowdown that site developers can use:

  1. Add the passive:true option manually (which isn't well supported, and prevents you from using useCapture on all the unsupported browsers).

dmvaldman commented 7 years ago

Let's default var to let next.

rjgotten commented 7 years ago

@Rycochet Add the passive:true option manually (which isn't well supported, and prevents you from using useCapture on all the unsupported browsers).

Not true. There is a quite clean way to detect support for event listener options:

// Detect support for the options-object form of addEventListener:
// conforming browsers read the "capture" property, which fires this
// getter; legacy browsers coerce the object to a boolean and never
// touch the property.
var hasListenerOptions = (function() {
  var result = false;
  window.addEventListener("options-test", null, Object.defineProperty({}, "capture", {
    get: function() { result = true; }
  }));
  return result;
}());

And from there it's a matter of engineering your site's code with an abstraction over addEventListener that can switch between using a listener-options object and the plain boolean for capture.

Actually, for browsers that support addEventListener it should AFAIK be possible to write code that supplants the built-in version of the method. So in theory, it should be possible to write a globally applied patch that wraps the built-in method on browsers which do not support event-listener options objects with an additional layer that, when given an event-listener options object, pulls out the value of the capture property and passes only that along as a boolean parameter.

Theoretically, that even gives you a drop-in solution, though of course this comes with its own performance trade-offs again.
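
A sketch of that globally applied patch (hypothetical, assuming the hasListenerOptions test above and a browser that exposes EventTarget.prototype; older IE would need per-interface patching):

(function () {
  if (hasListenerOptions) return; // native options support: nothing to patch
  var native = EventTarget.prototype.addEventListener;
  EventTarget.prototype.addEventListener = function (type, listener, options) {
    // Reduce an options object to the boolean capture flag old browsers expect.
    var capture = (options && typeof options === "object")
      ? !!options.capture
      : !!options;
    return native.call(this, type, listener, capture);
  };
}());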


Come to think of it: I could totally see someone develop a globally applied patch like this to undo the damage this intervention would do. All it would need to do is add an explicit passive:false for all browsers that support passive listeners...

If I were a betting man, I'd put my money on that happening and being adopted as a drop-in by both large commercial sites on-the-cheap and plugin-heavy CMSes for convenience, well before any of those would ever even consider integrating and rolling out the proper fixes. And once that genie is out of the bottle, it is never, ever going back in.

I myself won't stoop to the level of actually publishing the required code to sabotage this intervention a priori, but I can imagine others in the nay-camp may have fewer scruples about doing so, if only to underline the futility of this intervention.

Rycochet commented 7 years ago

https://gist.github.com/Rycochet/6ac0380841debbb65f78d36711a0dafa (public domain, so don't want to paste the long code including unlicense header in here).

Unfortunately this change got added, so it will be around for several months at least. I'd rather have safe code out there able to fix it than everyone having to develop their own workarounds. If I get the time this week I'll wrap a cut-down version of this into a Chrome extension for developers and users of unmaintained websites.

dtapuska commented 7 years ago

Measuring the impact of an intervention definitely seems hard. In the Intent to Intervene email there are a few stats, like the fact that 2% of dispatched events are potentially affected. There is lots of discussion about tracking breakage, but no real numbers shared.

Did you look at this document that was linked in the intent to intervene? What more details can I provide? I checked the stats, and they are stable, with 2% of page visits on Android still calling preventDefault with no touch-action.

Kendo UI, at least the version on their demo pages. demo

This isn't actually broken, since the event listener is on the target (a div). What you are seeing in your video is that devtools sometimes causes weird scenarios when touch is enabled without refreshing the page. If you refresh the page after enabling emulation mode you should see that it works fine. If you are reproducing specific touch issues I recommend using a physical touch device, as there are some weird side effects due to emulation.

There are a number of ways these issues would be addressed: More evangelizing. Warnings in the console (without breaking changes). Warnings in devtools (as Chrome shows w/ forced layout) Contributing to libraries, adapting them to the new APIs

All of these things we have tried since June last year. What are the recommended channels we should use to reach you? What evangelizing would you have liked to see, that we didn't do, to get your attention before this breakage? Have you tried Lighthouse? I think one problem is that if there is no developer, how does a site know something is breaking? For document.write there is a proposal to send an HTTP header so the server can track issues, but again this requires someone actively monitoring logs.

touch-action: none isn't sufficient

I should re-iterate that using touch-action is preferred, as it declaratively indicates which sections of the page don't want scrolling to happen. Since this is known entirely on the compositor thread, we know not to scroll when someone interacts with one of those regions. Adding "passive: false" causes us to treat the whole document as slow, which can make it subject to the touch-ack timeout and/or main-thread responsiveness interventions.

We are trying to make touch events much easier to reason about - specifically the touch-ack timeout, since basing behaviour on time varies a lot across devices. The top end of Android phones can be an order of magnitude faster than the shipping phones at the low end.

maxkfranz commented 7 years ago

I've created an npm package, normify-listeners, to fix the breaking API change in Chrome 56.

There's also a higher-level package, normify, that can be used to pull in multiple packages. The idea is that if other issues come up in future in whatever browsers, then individual packages can be built around those specific issues. A dev can just call normify() to get all the fixes.

The package fixes the browser forcing passive: true by default, and it is flexible enough to work in future if more events are made passive by default in Chrome (e.g. wheel). It also makes it so you can use the options object with capture and passive in old browsers that don't support options. Old browsers will effectively ignore passive. This means you don't have to test for options support in all your code and the libs you use.

Of course I would prefer that Chrome followed the W3C spec for passive: false as the default, but workaround packages seem to be the only pragmatic alternative left to devs like myself.

I hope that packages like this one don't help to create the precedent that browser vendors can unilaterally break with standards with the expectation that devs will create workarounds. Browsers seemed to be really moving forwards by following standards better (even Microsoft's browser), but this change really feels like a step backwards.

dtapuska commented 7 years ago

@Rycochet

Unfortunately I can't personally provide URLs to any sites that I've developed hitting on this as they're education sites for Collins, Hodder and a couple of other smaller publishers (fortunately Pearson didn't need anything like this) - and all the content is under NDA and/or behind school portals that need access through their own sites directly. Currently that stands at ~10k individual sites (ie, not using shared JS/CSS files that can be easily fixed in one place).

How do you have this interoperable on IE and Edge, since they don't send touch events on desktop? If you have support for Pointer Events, then perhaps your user-agent check for pointer events is a little incomplete? We have long pondered whether we should really follow the same model Microsoft does, supporting touch events only on the mobile platform and not on desktop.

dtapuska commented 7 years ago

If this observation has any merit, maybe there is a way in which the fast-by-default behavior could be enabled only inside such identifiable parent nodes, where it makes the most critical difference for users?

This is precisely what we've tried in this intervention. It currently changes the default passive behaviour only for listeners bound to the document, window, and body.
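
A minimal sketch of the resulting behaviour (handler is a placeholder; myDiv is any ordinary element):

document.addEventListener("touchstart", handler);
// Bound to the document: defaults to passive under this intervention,
// so preventDefault() is ignored (Chrome warns in the console).

myDiv.addEventListener("touchstart", handler);
// Bound to an ordinary element: keeps the old non-passive default.

document.addEventListener("touchstart", handler, { passive: false });
// The explicit opt-out still works everywhere.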

mikesherov commented 7 years ago

This is precisely what we've tried in this intervention. It currently changes the default passive behaviour only for listeners bound to the document, window, and body.

@dtapuska, is there any other way to limit this further? That is, check the body of the listener function to see whether it returns false, calls preventDefault on the event argument, or passes the event argument to another function?

Rycochet commented 7 years ago

@dtapuska Effectively, a single delegated method is called for both mouse and touch events. The first line checks which type it is and gets the correct coordinates; then it calls preventDefault to stop the event from being passed on (to "click", or "mouse*" if it's touch), as it does all the handling internally. There is absolutely no need to check for pointer events if I'm treating everything the same (and the UI needs to behave identically on touch and mouse when you're targeting children aged 4+). This bug just means that the handlers get called twice, breaking everything that expects to be called once and cancelled, and using 2x the CPU (I'm just glad I don't write laggy code to begin with).

Interestingly, this also means that any site that delegates anchor clicks (ie, lightbox/modal style) will no longer be able to stop the links from opening - meaning navigation is slower until they've patched around it.

dtapuska commented 7 years ago

@Rycochet have you tried adding a touchend handler? Cancelling the touchend event prevents the click event from being generated from the touch.
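
A sketch of that suggestion (the selector and lightbox function are hypothetical); touchend was not made passive by this intervention, so cancelling it still suppresses the synthesized click:

document.addEventListener("touchend", function (e) {
  var link = e.target.closest("a.lightbox"); // hypothetical delegated target
  if (link) {
    e.preventDefault();  // stops the follow-up click, so the link doesn't navigate
    openLightbox(link);  // hypothetical handler
  }
});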

Rycochet commented 7 years ago

@dtapuska ...So totally ignoring my first paragraph then, I guess. Just checking what you're asking: "Why didn't you add something that's not needed per the standards, to prevent something that was already prevented by following the standards?"