WICG / webcomponents

Web Components specifications

Support `>>>` combinator in static profile #78

Closed hayatoito closed 6 years ago

hayatoito commented 9 years ago

Title: [Shadow]: Figure out a good replacement for /deep/ in testing scenarios (bugzilla: 28591)

Migrated from: https://www.w3.org/Bugs/Public/show_bug.cgi?id=28591


comment: 0 comment_url: https://www.w3.org/Bugs/Public/show_bug.cgi?id=28591#c0 Dimitri Glazkov wrote on 2015-05-01 17:22:53 +0000.

One thing that immediately popped up once we started talking about removing shadow-piercing combinators is that WebDriver-based tests are frequently interested in reaching into shadow trees to grab a specific element to test:

https://code.google.com/p/chromium/codesearch#chromium/src/chrome/test/chromedriver/test/run_py_tests.py&sq=package:chromium&q=testShadowDom*&l=877

Web Driver actually has something currently that attempts to solve the problem: http://www.w3.org/TR/webdriver/#switching-to-hosted-shadow-doms

However, the feedback I got from ChromeDriver folks is that it's way too verbose and awkward to use for the most common case (see above).

Maybe the solution is just specifying deepQuerySelector for WebDriver spec. However, I want to make sure this is not just a one-off -- seems like this could be needed in other contexts.


comment: 1 comment_url: https://www.w3.org/Bugs/Public/show_bug.cgi?id=28591#c1 Anne wrote on 2015-05-02 07:34:03 +0000.

1) We shouldn't do features for testing. That's bad.

2) I remain convinced that in the open case we should provide a myriad of features that cross the "deep" to aid with selection, event delegation, composition, etc.


comment: 2 comment_url: https://www.w3.org/Bugs/Public/show_bug.cgi?id=28591#c2 Elliott Sprehn wrote on 2015-05-03 00:41:03 +0000.

(In reply to Anne from comment #1)

1) We shouldn't do features for testing. That's bad.

2) I remain convinced that in the open case we should provide a myriad of features that cross the "deep" to aid with selection, event delegation, composition, etc.

+1, we should keep /deep/ in the static profile for querySelector. Before we had it authors kept rolling their own (we saw this on multiple occasions).


comment: 3 comment_url: https://www.w3.org/Bugs/Public/show_bug.cgi?id=28591#c3 Anne wrote on 2015-05-04 06:12:37 +0000.

Note that an alternative is that we introduce .deepQuery() or some such.


comment: 4 comment_url: https://www.w3.org/Bugs/Public/show_bug.cgi?id=28591#c4 Elliott Sprehn wrote on 2015-05-04 06:21:02 +0000.

(In reply to Anne from comment #3)

Note that an alternative is that we introduce .deepQuery() or some such.

deepQuery is not enough, you don't want to match a descendant selector across a ShadowRoot boundary since ".a .b" means something really different. You'd still need a special combinator to signal where the scope crossing should be in the selector expression.

ex. .panel .image

All images inside panels contained in a single scope.

.panel /deep/ .image

All images anywhere below a panel, even if they're inside a nested widget.

This is important because it maintains the "don't accidentally cross a boundary" principle.

We need something like ::shadow as well.
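Elliott's distinction can be made concrete with a hypothetical sketch. Note this is illustrative only: `>>>` (the renamed `/deep/`) is not a valid dynamic-profile selector in shipping browsers, and `.panel`/`.image` stand in for whatever an app actually uses.

```js
// Scoped: matches .image elements that live in the same tree scope as the
// .panel. An .image inside a nested widget's shadow tree is NOT matched.
document.querySelectorAll('.panel .image');

// Shadow-piercing (proposed for the static profile only): would also match
// .image elements inside open shadow trees nested anywhere below a .panel.
document.querySelectorAll('.panel >>> .image');
```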


comment: 5 comment_url: https://www.w3.org/Bugs/Public/show_bug.cgi?id=28591#c5 Tab Atkins Jr. wrote on 2015-05-05 00:12:16 +0000.

(In reply to Elliott Sprehn from comment #4)

(In reply to Anne from comment #3)

Note that an alternative is that we introduce .deepQuery() or some such.

deepQuery is not enough, you don't want to match a descendant selector across a ShadowRoot boundary since ".a .b" means something really different. You'd still need a special combinator to signal where the scope crossing should be in the selector expression.

ex. .panel .image

All images inside panels contained in a single scope.

.panel /deep/ .image

All images anywhere below a panel, even if they're inside a nested widget.

This is important because it maintains the "don't accidentally cross a boundary" principle.

Yeah, trying to move the shadow-crossing quality to the core of the method doesn't work. It's much less flexible, as you note, and doesn't compose with anything else similar. The correct approach is to just embrace the "static profile" of selectors http://dev.w3.org/csswg/selectors/#static-profile and leave /deep/ there. (Or >>>, as it's now called.)


comment: 6 comment_url: https://www.w3.org/Bugs/Public/show_bug.cgi?id=28591#c6 Hayato Ito wrote on 2015-05-07 08:43:56 +0000.

(In reply to Tab Atkins Jr. from comment #5)

(In reply to Elliott Sprehn from comment #4)

(In reply to Anne from comment #3)

Note that an alternative is that we introduce .deepQuery() or some such.

deepQuery is not enough, you don't want to match a descendant selector across a ShadowRoot boundary since ".a .b" means something really different. You'd still need a special combinator to signal where the scope crossing should be in the selector expression.

ex. .panel .image

All images inside panels contained in a single scope.

.panel /deep/ .image

All images anywhere below a panel, even if they're inside a nested widget.

This is important because it maintains the "don't accidentally cross a boundary" principle.

Yeah, trying to move the shadow-crossing quality to the core of the method doesn't work. It's much less flexible, as you note, and doesn't compose with anything else similar. The correct approach is to just embrace the "static profile" of selectors http://dev.w3.org/csswg/selectors/#static-profile and leave /deep/ there. (Or >>>, as it's now called.)

Are there any existing clients that use the static profile? Does it mean '/deep/' can be used in particular APIs?


comment: 7 comment_url: https://www.w3.org/Bugs/Public/show_bug.cgi?id=28591#c7 Tab Atkins Jr. wrote on 2015-05-07 15:55:04 +0000.

(In reply to Hayato Ito from comment #6)

Are there any existing clients that use the static profile? Does it mean '/deep/' can be used in particular APIs?

It's for querySelector()/etc.


comment: 8 comment_url: https://www.w3.org/Bugs/Public/show_bug.cgi?id=28591#c8 Hayato Ito wrote on 2015-05-08 02:25:11 +0000.

(In reply to Tab Atkins Jr. from comment #7)

(In reply to Hayato Ito from comment #6)

Are there any existing clients that use the static profile? Does it mean '/deep/' can be used in particular APIs?

It's for querySelector()/etc.

Thanks.

Can everyone agree that '/deep/' is okay to be used in querySelector()?

I think we are assuming that adding something to the static profile has zero overhead for the performance of the dynamic profile.


comment: 9 comment_url: https://www.w3.org/Bugs/Public/show_bug.cgi?id=28591#c9 Tab Atkins Jr. wrote on 2015-05-08 17:29:16 +0000.

(In reply to Hayato Ito from comment #8)

(In reply to Tab Atkins Jr. from comment #7)

(In reply to Hayato Ito from comment #6)

Are there any existing clients that use the static profile? Does it mean '/deep/' can be used in particular APIs?

It's for querySelector()/etc.

Thanks.

Can everyone agree that '/deep/' is okay to be used in querySelector()?

I think we are assuming that adding something to the static profile has zero overhead for the performance of the dynamic profile.

Correct. At worst, it's a check during grammar verification, to note that this isn't valid in the current context and so the selector should be considered grammar-violating.

pemrouz commented 9 years ago

This seems like quite a basic affordance, without which there is an unbridgeable gap between the Chrome and Firefox implementations (utilise/all#1 | doc). We rely quite heavily on this ability. It looks like everyone is on the same page too (i.e. good in JS, bad in CSS). Is there any chance /deep/ or >>> in querySelector/All could be included in v1?

hayatoito commented 8 years ago

I renamed the subject of this issue to a more appropriate one.

hayatoito commented 8 years ago

AFAIK,

Please feel free to correct me if I am wrong.

pemrouz commented 8 years ago

Thanks, @hayatoito.

Are there any more updates on this issue?

For anyone opposing this: is there a sensible alternative solution/proposal (perhaps something like :has-shadow-root?) so we can recursively crawl (at least open) shadow roots, rather than naively checking every element?

/cc @annevk @rniwa @travisleithead

rniwa commented 8 years ago

is there a sensible alternative solution/proposal (perhaps something like :has-shadow-root?) so we can recursively crawl (at least open) shadow roots, rather than naively checking every element?

What are use cases that require walking across all those shadow trees?

pemrouz commented 8 years ago

Essentially, we have a small core that stores (browser-like) resources and emits an event when they change.

ripple(name)       // getter
ripple(name, body) // setter
ripple.on('change', (name[, change]) => {})

Other modules like the view layer simply listen for changes and update anything on the page that is affected. For example, components are simply transformation functions of data, so registering a new version of a component would redraw all instances of that custom element. This amounts to the following (spreading the NodeList into an array, since NodeList has no .map):

ripple.on('change', name => [...document.querySelectorAll(name)].map(d => d.draw()))

Other types of resources may also change, such as stylesheets or data, and the view layer would use different selectors ([css~="${name}"] and [data~="${name}"]) to find all the instances that need updating.

Now, since there may be relevant custom elements to update in the shadows of other custom elements (majority case), instead of document.querySelectorAll(name) a small utility is used to search across all shadow roots if the browser supports it.

Being able to use querySelectorAll to identify matching elements to update across the application results in a super-eloquent solution that lets the browser handle this robustly. Without this basic affordance, you would need to spawn your own DOM-like structure to keep track of things which would be very painful and kill interoperability.
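The thread doesn't show the "small utility" mentioned above, but a minimal sketch of such a recursive search over open shadow roots might look like the following. It is an assumption, not pemrouz's actual code, and `queryDeepAll` is a made-up name; the function duck-types its argument (anything with `querySelectorAll()`, whose elements may carry a `shadowRoot`), so it runs against `document` as well as plain test doubles.

```javascript
// Assumed sketch of a recursive deep query: collect every element matching
// `selector` in `root`'s tree and, recursively, in every open shadow tree
// beneath it. Closed shadow roots are invisible, since their hosts expose
// shadowRoot === null.
function queryDeepAll(root, selector, results = []) {
  // Matches in the current tree scope.
  for (const el of root.querySelectorAll(selector)) {
    results.push(el);
  }
  // Recurse into each open shadow root below this scope. Enumerating every
  // element with '*' is what makes this polyfill approach expensive.
  for (const el of root.querySelectorAll('*')) {
    if (el.shadowRoot) {
      queryDeepAll(el.shadowRoot, selector, results);
    }
  }
  return results;
}
```

Note that results come out grouped by scope rather than in strict composed-tree order, and that walking `'*'` in every scope is exactly the cost rniwa objects to later in the thread.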

rniwa commented 8 years ago

Now, since there may be relevant custom elements to update in the shadows of other custom elements (majority case), instead of document.querySelectorAll(name) a small utility is used to search across all shadow roots if the browser supports it.

But anyone could add a new instance of a custom element dynamically to any shadow tree so you'd end up continuously monitoring mutations to every shadow tree. This is extremely inefficient. Just walking the entire composed tree to find all instances of a given custom element alone is too expensive.

If you're concerned about tracking all instances of custom elements, wrapping custom elements' lifecycle callbacks might be a better approach (i.e. track lifecycle callbacks in your framework, and then delegate back to each custom element later).

pemrouz commented 8 years ago

There is no need to watch for mutations. You just call .draw on the newly added element. For browsers that support the Custom Element lifecycle callbacks, the attachedCallback points to the same function, so even that isn't necessary (draw is idempotent, so there's no harm if it's called multiple times anyway).

The lifecycle callbacks are wrapped, which is what allows for the dynamic registry of components in the first place. The problem is not with the lifecycle callbacks however. It's when something else changes, like receiving new data/stylesheets/components from the server, and then you need to find the elements to rerender. If you are suggesting to use the {attached,detached}Callbacks to build some sort of internal map and keep track of every instance, then that is exactly what would be nice to avoid and rely on querySelectorAll for.

rniwa commented 8 years ago

If you are suggesting to use the {attached,detached}Callbacks to build some sort of internal map and keep track of every instance, then that is exactly what would be nice to avoid and rely on querySelectorAll for.

I'm saying that providing such an API is undesirable because looking for all elements in the composed tree with querySelectorAll would be extremely inefficient and expensive.

A better approach would be for each component to keep track of its subcomponents within its shadow tree (e.g. at construction time) and recursively call draw on them. Alternatively, adding some sort of option to MutationObserver so that it can track new elements of a given type being added to a shadow tree might work.

pemrouz commented 8 years ago

I understand, but either way we are searching for instances of elements across the composed tree. The suggestion to implement this in JS would be (a) orders of magnitude more expensive/inefficient than if done natively (b) a lot of unnecessary management overhead in all hot code paths (c) hard to do reliably (d) more assumptions, worse for interoperability and the different contexts components can work in (e) requires redrawing everything from the top-level component down for every change.

One of the reasons it's currently so fast (2x optimised React on DBMonster) is because (a) there are no intermediate data structures to manage and (b) we can jump immediately to a lower part of the tree and then recursively draw from there, rather than always starting right at the top.

This solution is also more in the spirit of progressive enhancement (in contrast to e.g. extending MutationObservers) and even if every framework built their own shadow-dom-walker-tracker (itself undesirable), being able to easily walk shadows is a more generally common use case, like the testing issue in the original thread, hence why it makes sense to favour the more generic solution.

rniwa commented 8 years ago

[B]eing able to easily walk shadows is a more generally common use case, like the testing issue in the original thread, hence why it makes sense to favor the more generic solution.

The whole point of shadow DOM API is to provide encapsulation. If your app needs to constantly break that encapsulation, then it's probably better not to use shadow DOM API in the first place.

In terms of testing, I don't think reaching into shadow trees of a component while testing a web page that uses a given component is a common scenario. You'd typically unit-test each component separately, and the integration test would be testing the page using public API of each component.

In fact, I'd argue it's an anti-pattern. For example, if a library author published a component and various websites that use that component started poking into its shadow tree in websites' tests then the library author may no longer be able to change its shadow tree structure at his/her will because those changes may break those websites.

pemrouz commented 8 years ago

I expected this more ideological argument and hence tried to sidestep (with "at least open"). Let's clarify two general use cases for Web Components:

(1) Componentisation of applications: For example, using a fractal architecture where each component composes other subcomponents structured in the same way

(2) Embedding third-party leaf components (e.g. google-maps, facebook-like, twitter-counts, etc)

Open shadows are suitable for the former and closed shadows for the latter (completely bullet-proof encapsulation). You still want to be able to benefit from things like style encapsulation, but there is more involved with application-level components, like state management.

It appears you are being dismissive of the former use case, in which case Web Components becomes relegated to just being a poor man's iframe (iframe is arguably better for the latter use case, since they are far more "strongly encapsulated": e.g. separate component registries, @font-face, etc).

For testing, you are again extrapolating your conclusion from one narrow case: I agree that relying on the implementation of a third-party component would be brittle. But this isn't the only use case. It's reasonable to dispatch an event in one part of the application and make assertions about what changed on the other side. The view that everything should be controllable and observable from the top-level component's public API is obviously impractical and brittle (every subcomponent would have to proxy everything).

Hence I think it is worth first explicitly clarifying your view on open shadows and the former use case. Is the design goal of shadows to simply maximise encapsulation so that they are not useful throughout an application and are only useful as third-party leaf nodes? If yes, then open shadows should be dropped and this purpose made more clear. If not, then the need for some sensible way to recursively walk shadows (until hitting a closed one) becomes evident and we can go back to discussing whether APIs like this are more suitable in JS or native.

rniwa commented 8 years ago

You still want to be able to benefit from things like style encapsulation, but there is more involved with application-level components, like state management.

I've been using shadow DOM to write our performance tracking tool, and many of its "components" have to track dozens of states. Yet I've never found it useful to reach into some other component's shadow tree in order to get access to those states. In practice, all those states are stored in JavaScript objects and properties, and components should expose some public API to access those states.

It's reasonable to dispatch an event in one part of the application and make assertions about what changed on the other side.

In such tests, there should be a test that verifies a given component dispatches an event, and a separate test which verifies that the given component responds to the event. You could even manually inspect those components' shadow trees during testing but that doesn't require anything akin to querySelectorAll since you're unlikely to be testing every single component simultaneously.

The view that everything should be controllable and observable from the top-level component's public API is obviously impractical and brittle (every subcomponent would have to proxy everything).

There is a composed option in EventInit that makes an event propagate across shadow boundaries. That should be sufficient for that kind of inter-component communication. Using /deep/ for that kind of inter-component communication is precisely the anti-pattern we want to discourage.

Is the design goal of shadows to simply maximize encapsulation so that they are not useful throughout an application and are only useful as third-party leaf nodes?

I disagree with your premise that maximizing encapsulation is not useful for components within an application, given my experience of having successfully written a web app that's made up of dozens of closed shadow tree components.

Having said that, Apple's position has always been that open mode of shadow DOM is a mistake. The only reason we have "open mode" shadow tree is because we made a compromise to keep both modes without a default during April 2015 F2F meeting for the sake of moving shadow DOM API as a whole forward.

pemrouz commented 8 years ago

I'm not sure what exactly your perf tools do: I agree that explicitly proxying and transforming state is a good thing. However, it's not the only use case. Having built multiple apps in this manner, I've found that this approach does not scale. When you have a component 10 layers deep and you need to change its requirements, every component in the path in between now has to explicitly proxy the new data down, which makes this approach very brittle.

This is not about "inter-component communication" (?). Only events are used for that in a "data down, actions up" paradigm. This is about either a deep or recursive API being necessary for the framework layer to manage updates. It might be beneficial to read more around things like Redux + React (but as aforementioned, it's not strictly just data, see for example Om Next (30:36 onwards) where they maintain a live index of all instances on the page).

The composed event is very relevant here: it's the symmetrical counterpart to this feature. Specifically, it enables dispatching an event which may pass through many layers, none of which needs to explicitly re-emit the event. It's "upward piercing".

In such tests, there should be a test that verifies a given component dispatches an event, and a separate test which verifies that the given component responds to the event. You could even manually inspect those components' shadow trees during testing but that doesn't require anything akin to querySelectorAll since you're unlikely to be testing every single component simultaneously.

It appears you are defining the problem away by suggesting we abandon integration tests in favour of (two separate) unit tests. Testing end-to-end flows in an application happens across components. The introspection required by the testing layer, like that required by the framework layer, transcends the component tree, and it can't be expected to use components' public APIs for this purpose.

It's good to know Apple is opposed to open shadows. However, I think going forward, unless your goal is to sabotage the usability of open shadows, it's better to not mix the two problems. There's very little reason to oppose this hook for open shadows (especially given existence of upward piercing), and once you all work out how to make closed shadows more usable, then you can discuss deprecating open shadows altogether.

rniwa commented 8 years ago

This is not about "inter-component communication" (?). Only events are used for that in a "data down, actions up" paradigm. This is about either a deep or recursive API being necessary for the framework layer to manage updates. It might be beneficial to read more around things like Redux + React (but as aforementioned, it's not strictly just data, see for example Om Next (30:36 onwards) where they maintain a live index of all instances on the page).

The composed event is very relevant here: it's the symmetrical counterpart to this feature. Specifically, it enables dispatching an event which may pass through many layers, none of which needs to explicitly re-emit the event. It's "upward piercing".

Oh I see. This is exactly one weakness I see with the current web components as well. But I don't think exposing shadow roots is the right solution for this. What we need is an event-like system that propagates things downwards instead of upwards. e.g. dispatchEventInSubtree which would fire an event on all elements under a given element. But I don't think using an event is the right construct here. I think a better approach would be adding a new custom element callback which gets some sort of a message, and that can then pass it down to other custom elements within its shadow tree with some sort of default delegation mechanism.
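rniwa's delegation idea can be sketched, with heavy caveats: no such API exists, and `deliverDown`/`receiveMessage` are invented names. The sketch duck-types nodes (`children` arrays plus an optional `shadowRoot`) so the delegation logic is testable without a DOM.

```javascript
// Hypothetical top-down counterpart to composed events: walk downward from
// a node, invoking a (made-up) receiveMessage callback on each component,
// descending into shadow trees by default instead of stopping at them.
function deliverDown(node, message) {
  if (typeof node.receiveMessage === 'function') {
    node.receiveMessage(message);
  }
  // Default delegation: first into the node's own shadow tree, if any...
  if (node.shadowRoot) {
    for (const child of node.shadowRoot.children || []) {
      deliverDown(child, message);
    }
  }
  // ...then into its light-tree children.
  for (const child of node.children || []) {
    deliverDown(child, message);
  }
}
```

A real design would presumably let a component override the default delegation, which is the part rniwa leaves open.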

pemrouz commented 8 years ago

Both are great alternatives I hadn't thought about before :+1:. If you have a small example of how their APIs might look, I can try to give some feedback from the perspective of my use cases.

rniwa commented 8 years ago

Sorry, I don't have a time to design new API like that at the moment but I'm more than happy to revisit this in a couple of months.

TakayoshiKochi commented 8 years ago

Let me continue discussion on this thread, here's the summary so far.

The original proposal was, to allow '>>>' (shadow-piercing combinator) in the static profile only.

Chrome had the /deep/ combinator for Shadow DOM v0, which could be used in both the static and dynamic profiles, but found that using it in CSS stylesheets (i.e. the dynamic profile) caused significant performance issues: because it was often used for theme-like rules (e.g. body >>> .fancycheckbox { ... }), the engine had to match the selector against all nodes at every style recalculation.

The idea was to limit this combinator to querySelector/querySelectorAll, which run one-off. The answer to theme-like styling as of now is to use CSS custom properties, but there is no good solution for JS other than recursively applying querySelector on each shadow root (example).

The concerns so far are

If Shadow DOM were closed mode-only, this combinator of course wouldn't make sense, but for open shadow trees, providing a way to select nodes in a shadow tree should make sense.

One alternative idea from @annevk was to have .deepQuery() instead of the combinator, but what it matches, e.g. .deepQuery('.a .b') vs. .querySelector('.a >>> .b'), is different. In .deepQuery(), would every normal descendant combinator mean deep descendant, or would .a and .b have to be in the same tree? It could be either way, but I guess an explicit distinction is better (normal descendant vs. deep descendant).

TakayoshiKochi commented 8 years ago

I think the current situation is like when we had to include jQuery because the platform didn't provide querySelector; wouldn't it be nice if the platform provided an API to select nodes deep inside open Shadow DOM, without the recursive polyfill?

Any comments and opinions are welcome!

hayatoito commented 8 years ago

Please remove "breaks encapsulation of Shadow DOM" from the concerns because >>> must not cross closed shadow roots in any sense.

An open shadow root already provides Element.shadowRoot. So the use cases for >>> should be basically the same as those for Element.shadowRoot.

The point is that >>> in the static profile is just a utility feature which can be polyfilled in JavaScript. However, having >>> in the Web Platform makes the Web faster and reduces the burden on web developers.

>>> does not provide any value for closed shadow roots.
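Hayato's "sugar" framing can be made concrete. The sketch below is an assumption about what a single `>>>` hop could desugar to, with `pierceOnce` an invented name; it duck-types its inputs so it runs against plain objects as well as real nodes, and it returns null for closed roots, matching the rule that `>>>` must not cross them.

```javascript
// What `hostSelector >>> innerSelector` would desugar to for one hop:
// find the host in the current scope, then query inside its shadow root.
// A closed (or absent) shadow root yields null, since closed-mode hosts
// expose shadowRoot === null.
function pierceOnce(scope, hostSelector, innerSelector) {
  const host = scope.querySelector(hostSelector);
  if (!host || !host.shadowRoot) {
    return null; // closed shadow root, or no such host
  }
  return host.shadowRoot.querySelector(innerSelector);
}
```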

TakayoshiKochi commented 8 years ago

"The concerns so far" is the summary of this thread, and IIUC the main point of Apple's opposition is that this breaks encapsulation, as in https://github.com/w3c/webcomponents/issues/78#issuecomment-222884679.

Thanks for the clarification.

hayatoito commented 8 years ago

Yeah, I would like to clarify that >>> in the static profile is just syntactic sugar for the Element.shadowRoot and querySelector APIs.

The point is whether having this syntactic sugar in the Web Platform is worthwhile or not.

tabatkins commented 8 years ago

Yes, @hayatoito is exactly right. >>> is nothing more nor less than sugar for the .shadowRoot. Anne's older .deepQuery() idea is strictly weaker, since you can't control when you pierce through roots, plus

rniwa commented 8 years ago

Given we have a fundamental disagreement about the usefulness of shadowRoot in the first place, I don't find that argument convincing at all. What we need is a concrete list of use cases (there were a couple of good ones earlier in the thread), and then we need to see whether /deep/ is the right API for those use cases or not.

For example, propagating React-render-like states to subcomponents is NOT an appropriate use case for /deep/ since it won't be a one-off case of bypassing encapsulation. This use case is better met by a new custom element callback designed to propagate states top-down.

Writing integration tests for a Web app consisting of hundreds of components is another use case that came up. I'd argue that such an integration test should not be relying on internal states of each component. Instead, they should be testing based on each component's public interface. However, if authors really wanted to do white-box testing across components, they could simply write an equivalent of /deep/ in JavaScript. And this would be fine for testing scenarios because an extra runtime during testing wouldn't hurt end users.

matthewp commented 8 years ago

@rniwa I don't know specifically about /deep/ or equivalents, but one use case that I have for the shadowRoot property is to server-render a component (using a headless browser or something similar) so that the shadow DOM is part of the light DOM initially and then re-inserted into the shadowRoot when components are upgraded.

TakayoshiKochi commented 8 years ago

Well, deciding whether to use a closed or open shadow root is up to the component author, and this is strictly for open shadow components. There is a high chance of needing to tweak a third-party open component.

@annevk @travisleithead any opinions?

rniwa commented 8 years ago

There is a high chance of needing to tweak a third-party open component.

How does tweaking a third-party open component require using /deep/? What's a concrete example of such a scenario?

annevk commented 8 years ago

@TakayoshiKochi I think I agree with @rniwa that it's better to stay a bit conservative until the feature is really justified.

pemrouz commented 8 years ago

@TakayoshiKochi I think I agree with @rniwa that it's better to stay a bit conservative until the feature is really justified.

What's the reason for needing to err on the side of caution here? Like, what's the worst that might happen if developers use this API? Since it's already been trialled, are there any actual negative examples? For open shadows, everybody agrees the API is fairly trivial (sugar), so it seems the default should be to keep it. On top of that, all the practical use cases raised here are pro-deep in the static profile.

100% of @rniwa's comments are based on a conflation of open and closed shadows. Although I'm 100% agreed with experimenting and building something better for closed shadows, there's apparently no legitimate reason to block this for open shadows. A sharp distinction needs to be made between these two.

I think the only thing that will happen by excluding this is that the usability of open shadows and adoption of Web Components will be hampered. The Web Component spec does not currently have a great story for applications. Until all the pieces are there for hard encapsulation, it would be nice to at least have the low-level primitives to allow wider usage of softer encapsulation in v1. Extrapolating reasoning only valid for closed shadows ("bypasses encapsulation") to open shadows would be purely ideological, especially without offering any alternative.

rniwa commented 8 years ago

What's the reason for needing to err on the side of caution here?

One reason is the implementation cost. Implementing a shadow-piercing combinator requires a significant implementation effort, which could instead be directed towards other efforts like adding a light style mechanism to custom elements and/or one of the thousands of other features being proposed.

100% of @rniwa's comments are based on a conflation of open and closed shadows.

Have you even read my comments? Nothing but the first paragraph in my earlier comment is related to shadow tree modes:

For example, propagating React-render-like states to subcomponents is NOT an appropriate use case for /deep/ since it won't be a one-off case of bypassing encapsulation. This use case is better met by a new custom element callback designed to propagate states top-down.

Writing integration tests for a Web app consisting of hundreds of components is another use case that came up. I'd argue that such an integration test should not be relying on internal states of each component. Instead, they should be testing based on each component's public interface. However, if authors really wanted to do white-box testing across components, they could simply write an equivalent of /deep/ in JavaScript. And this would be fine for testing scenarios because an extra runtime during testing wouldn't hurt end users.

I think the only thing that will happen by excluding this is that the usability of open shadows and adoption of Web Components will be hampered.

Why would the lack of shadow piercing combinator hamper the usefulness of shadow trees if it were only used during testing and other kinds of extraordinary situations in which one has to bypass the encapsulation provided by shadow trees? What exactly are use cases in which this feature is required?

hayatoito commented 8 years ago

Thank you for the feedback, folks. I really appreciate that. I agree with @annevk 's comment. It's better to stay a bit conservative until the feature is really justified.

It looks like we need an experiment and data so that this feature can be justified. Let me implement this feature in Blink behind a flag. I would like to know how this feature would help web developers if it is implemented behind the flag.

I will not ship this feature in Blink until we reach consensus.

ChadKillingsworth commented 8 years ago

I've come to the conclusion that the testing frameworks need to change - not the spec. This is after hours of research and work. Here's a simple script for webdriver.io which allows full traversal of open shadow roots to select elements:

/**
 * Add a command to return an element within a shadow dom.
 * The command takes an array of selectors. Each subsequent
 * array member is within the preceding element's shadow dom.
 *
 * Example:
 *
 *     const elem = browser.shadowDomElement(['foo-bar', 'bar-baz', 'baz-foo']);
 *
 * Browsers which do not have native ShadowDOM support assume each selector is a direct
 * descendant of the parent.
 */
browser.addCommand("shadowDomElement", function(selector) {
  return this.execute(function(selectors) {
    if (!Array.isArray(selectors)) {
      selectors = [selectors];
    }

    function findElement(selectors) {
      var currentElement = document;
      for (var i = 0; i < selectors.length; i++) {
        if (i > 0) {
          currentElement = currentElement.shadowRoot;
          // Guard against hosts without an open shadow root, which
          // would otherwise throw on the querySelector call below.
          if (!currentElement) {
            break;
          }
        }

        currentElement = currentElement.querySelector(selectors[i]);

        if (!currentElement) {
          break;
        }
      }

      return currentElement;
    }

    if (!(document.body.createShadowRoot || document.body.attachShadow)) {
      selectors = [selectors.join(' ')];
    }

    return findElement(selectors);
  }, selector);
});

The problem is that testing frameworks are designed around the premise that you can use a query selector to match any element. With shadow DOM this is simply no longer true. An alternative strategy is needed.

With webdriver.io, the lower level protocol bindings are still accessible. Using the above custom command I can fully test in browsers with native ShadowDOM support as well as polyfilled.

Closed shadow roots, however, will require an entirely new testing strategy.

TakayoshiKochi commented 8 years ago

(edited) Discussion at TPAC 2016 f2f: although Google had some cases where people wanted to use '>>>' with querySelector/querySelectorAll, there was not enough evidence to convince every browser vendor to implement the feature. Google will implement this feature experimentally to collect more data.

minutes: https://www.w3.org/2016/09/19-webapps-minutes.html (search for '/deep/')

Pros:

Cons:

hayatoito commented 8 years ago

No. "The proposal is abandoned" is not the conclusion from TPAC. I have not abandoned it. The situation has not changed.

We couldn't reach consensus there, however, Google has not abandoned the proposal, and will implement this feature behind a flag to get more feedback, as I said in https://github.com/w3c/webcomponents/issues/78#issuecomment-239724568.

TakayoshiKochi commented 8 years ago

Thanks. Updated the summary above.

TakayoshiKochi commented 7 years ago

Blink implemented '>>>' under the experimental web platform features flag.

I tried to get a rough number for how fast or slow the native implementation is compared to emulation in JS: https://gist.github.com/TakayoshiKochi/04fe33f40543ce07bdc53ecdfb6cc2e0

The code creates a tree with 64 shadow roots (4x4x4) and 2048 leaf elements, and searches for <span> elements in it. Over 100 iterations, the native implementation ran in ~30ms, while the emulated code required ~900ms on my local workstation, so it is roughly 30x faster.

But it seems the slowness comes from the fact that querySelectorAll() cannot find shadow hosts, so the code has to iterate over all elements. If we had a fast way of looking up shadow hosts (e.g. querySelectorAll(':shadowhost')), the emulation could be much faster.

I added a class to each shadow host and used that class in querySelectorAll to match them: https://gist.github.com/TakayoshiKochi/a4385510e4f483a02dd0027eb215cfe1. The emulated code then ran in ~120ms, a significant speedup, but still 4x slower than native.
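The class-based trick can be sketched like this (the `shadow-host` class and function names are illustrative, not taken from the gist):

```javascript
// Sketch: tag every shadow host with a marker class at attach time so
// that hosts can later be found with a single querySelectorAll call,
// instead of walking every element and checking its .shadowRoot.
// (Class and function names are illustrative.)
function attachTaggedShadow(host, init) {
  host.classList.add('shadow-host');
  return host.attachShadow(init);
}

// Fast host lookup within one tree:
function findShadowHosts(root) {
  return root.querySelectorAll('.shadow-host');
}
```

The trade-off is that every component must cooperate by going through the tagging helper, so this works for an app's own components but not as a generic polyfill.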

My takeaway (in terms of performance only) is that the native implementation gives a substantial speedup over polyfilled JS code. If a very fast lookup method for finding shadow hosts were invented, a generic polyfill might be tolerable; otherwise it is too slow to use.

annevk commented 7 years ago

I thought we were planning to make discovery of open shadow roots easier through a pseudo-class. Did that change?

rniwa commented 7 years ago

We’re experimenting with collectMatchingElementsInFlatTree (implemented in https://trac.webkit.org/changeset/208878), which finds all elements matching a given selector in each tree, instead of an explicit >>> to cross a shadow boundary. We found that most use cases for this API involved looking for an element of a specific type (e.g. a custom element, an element with a specific attribute, etc…) and didn’t warrant matching a specific tree structure across shadow boundaries.
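As a rough mental model in JavaScript (my assumption, not WebKit's actual C++ implementation, and ignoring slot reprojection for brevity), the API behaves something like:

```javascript
// Rough JS model of the idea (an assumption, not the real WebKit API):
// walk the flat tree, testing each element with matches() within its
// own tree, rather than evaluating one selector string that itself
// crosses shadow boundaries.
function collectMatchingElementsInFlatTree(root, selector) {
  const results = [];
  const walk = (node) => {
    for (const child of node.children || []) {
      if (child.matches(selector)) {
        results.push(child);
      }
      // Descend into the child's shadow tree if it has one,
      // otherwise into its light-DOM children.
      walk(child.shadowRoot || child);
    }
  };
  walk(root);
  return results;
}
```

This matches rniwa's description: each element is checked independently, so the selector never needs tree-structure relationships that span a shadow boundary.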

hayatoito commented 7 years ago

Ah, that is an interesting idea. Do you have a plan to have a concrete proposal?

It would be like element.querySelectorAll(simple-selector, { good_option_name: true }), wouldn't it?

rniwa commented 7 years ago

It would be like element.querySelectorAll(simple-selector, { good_option_name: true }), wouldn't it?

Well, there is a subtle semantic difference. We’re essentially checking whether :matches(selector) is true for each element and collecting the elements that do match. So we’re not scope-matching a selector string; instead we’re matching a selector against an element, which uses a different scoping object. This is because using a node from a different tree as the scope doesn’t make sense when evaluating a selector, and it also keeps the implementation simple.

We’re adopting this API in some Safari code for now, and if this turns out to be useful, we’d make a more refined proposal.

TakayoshiKochi commented 7 years ago

I thought we were planning to make discovery of open shadow roots easier through a pseudo-class. Did that change?

It seems the bottom half of https://github.com/w3c/webcomponents/issues/426 contains discussions of such a pseudo-class, but the issue was closed with "If #468 is going well, we do not need additional pseudo classes" and "We can file a separate issue if we want new pseudo classes". No one has filed such an issue yet.

For now, let's continue here on the fate of >>>, collectMatchingElementsInFlatTree, a pseudo-class, or something new.

@rniwa any updates on collectMatchingElementsInFlatTree?

rniwa commented 7 years ago

Unfortunately, we haven't had time to fully adopt collectMatchingElementsInFlatTree in Safari code yet, but we've just started working on this again this week. We'll let you know.

We think we need some mechanism to walk the flat tree in browser extension code and in scripts injected by WebDriver tests, since many of these scripts already have to walk across cross-origin iframes to be useful in some cases, and they are, in a sense, privileged scripts that can ignore other security constraints.

We're still extremely cautious and skeptical that such an API is needed in production website code, since most of the use cases we're aware of involve violating the very encapsulation layer shadow DOM aims to provide. While authors can always bypass such a boundary in JS (e.g. each component can eagerly expose its ShadowRoot as a JS property, even for a closed-mode shadow tree), we're very much afraid that lowering the barrier to such a violation by providing a selector-based API would undermine the very purpose of shadow DOM.
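The JS bypass described above can be sketched as follows (the `_root` property name is illustrative; this deliberately defeats closed-mode encapsulation):

```javascript
// Sketch: a component eagerly exposes its closed shadow root on a
// plain JS property. attachShadow() returns the root even in closed
// mode; only the element.shadowRoot accessor is null for closed trees.
// (The `_root` property name is illustrative.)
function attachExposedClosedShadow(host) {
  const root = host.attachShadow({ mode: 'closed' });
  host._root = root; // deliberate encapsulation bypass for tests/tools
  return root;
}
```

Tests or privileged tooling can then reach inside via `el._root.querySelector(...)`, while ordinary consumers still see `el.shadowRoot === null`.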

TakayoshiKochi commented 7 years ago

Thanks for the status update. The idea of privileged scripts sounds new to this thread. I'm afraid that if such an API is only for non-open-web product requirements (e.g. testing, inspection in a debugger, extension code), it may fall outside the scope of this discussion, though.

rniwa commented 7 years ago

I think it DOES make sense to spec an API like this for WebDriver. Also, there might be a compelling use case for open shadow trees. The problem is that we haven't had a really compelling use case for an API like this in a non-testing environment.

equinusocio commented 7 years ago

/deep/ is deprecated in shadow DOM v1 (which ships the standard ::slotted() and CSS style hooks). This issue can be closed.

TakayoshiKochi commented 7 years ago

@equinusocio no, /deep/ is deprecated for the "dynamic profile" (i.e. use in CSS styles), while here we are discussing options for the "static profile" (use from the querySelector() API).

So far some options:

  1. Keep /deep/ and >>> for static profile
  2. Introduce an alternative API (e.g. collectMatchingElementsInFlatTree)
  3. Other?

The use case is to find some element in the entire flat tree, but the controversial point is that /deep/ is by nature too powerful and works against Shadow DOM's encapsulation.

The problem with not having this feature is that finding such an element in the flat tree is inefficient with the currently available APIs.

pemrouz commented 7 years ago

Well, there is a subtle semantic difference. We’re essentially checking whether :matches(selector) is true for each element and collecting the elements that do match. So we’re not scope-matching a selector string; instead we’re matching a selector against an element, which uses a different scoping object. This is because using a node from a different tree as the scope doesn’t make sense when evaluating a selector, and it also keeps the implementation simple.

We’re adopting this API in some Safari code for now, and if this turns out to be useful, we’d make a more refined proposal.

@rniwa do you have any updates of how this is going in Safari?

phistuck commented 7 years ago

The use case is to find some element in the entire flat tree, but the controversial point is that /deep/ is by nature too powerful and works against Shadow DOM's encapsulation.

That sounds less like a use case to me and more like a summary of how you would find an element. It would be better to provide a concrete example of a non-testing scenario where this would be needed, I think.

tabatkins commented 7 years ago

Look at any non-WC page that searches for elements in the page with querySelector/All(), for any reason.

Now, imagine they switched to make heavy use of WC, such that there was a nice hierarchy of shadow trees.

Boom, that's a use-case. (Unless you can reasonably argue that, in the new version of the page, they'll have no need to search for elements from the top level of the page anymore, and can rely on specific elements to do all the searching locally within their own tree. That would be a difficult argument to make, I think.)

And as a reminder, not having >>> in qSA() has nothing to do with safety or encapsulation. You can emulate its effects perfectly, if with some effort and cost, by manually parsing the selector, then tree-walking the open shadow trees and running selectors yourself. It only exposes information that is already freely exposed via DOM APIs.
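That manual emulation can be sketched for the simple case (assuming '>>>' appears only at the top level of the selector, never inside quotes, and descending one open shadow root per '>>>'; this is an illustration, not a faithful implementation of the proposed combinator):

```javascript
// Sketch of emulating `a >>> b` with standard APIs: split the selector
// at '>>>', match the first segment normally, then match each later
// segment inside the open shadow roots of the previous matches.
// (Assumes '>>>' never appears inside quotes in the selector.)
function deepCombinatorQueryAll(root, selector) {
  const segments = selector.split('>>>').map((s) => s.trim());
  let contexts = [root];
  segments.forEach((segment, i) => {
    const matched = [];
    for (const ctx of contexts) {
      matched.push(...ctx.querySelectorAll(segment));
    }
    // Between segments, descend into each match's open shadow root.
    contexts = i < segments.length - 1
      ? matched.map((el) => el.shadowRoot).filter(Boolean)
      : matched;
  });
  return contexts;
}
```

As the comment says, this only reaches information that open shadow trees already expose via DOM APIs; closed trees remain out of reach either way.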

And the alternative API proposal (querySelectorAgainstFlatTree) is weaker than >>>, because you can't control when you walk down a tree, and it can't reproduce the power of ::slotted().

rniwa commented 7 years ago

A use case is a concrete scenario or story. Imagining that any page which doesn't use web components but uses querySelectorAll might start using web components is too abstract and generic to be called a use case.

As a counterexample, I have a website (perf.webkit.org) with ~20k lines of JS/HTML/CSS code that didn't use web components and used querySelector and querySelectorAll. I successfully switched it to shadow DOM & custom elements without ever needing to find elements across shadow boundaries with querySelector or querySelectorAll.

As far as I can tell, using >>> to find elements across component/shadow boundaries is an anti-pattern, because each component tends to have separate internal state that needs to stay in sync with its DOM tree. Mutating those nodes directly tends to make each component's internal state drift out of sync with its DOM nodes. For example, in apps that use React-like top-down reconstruction of DOM nodes and diffing, mutated DOM nodes may get thrown away and reconstructed the next time the component renders itself. So for any DOM mutation made across component/shadow boundaries to make sense, the mutator must observe all DOM mutations across shadow boundaries and re-apply its changes as needed. This leads to very inefficient, intrusive, and fragile code. That's exactly the kind of problem encapsulation should prevent.

Even for testing, the need for querySelectorAll never came up, because we could just use JS functions to find the right element for the necessary white-box testing.

And the alternative API proposal (querySelectorAgainstFlatTree) is weaker than >>>, because you can't control when you walk down a tree, and it can't reproduce the power of ::slotted().

This is precisely why we like that approach better. The biggest problem we have with the proposed API is that we'd have to introduce very complicated shadow-boundary-transitioning code into our selector-matching engine.