w3c / csswg-drafts

CSS Working Group Editor Drafts
https://drafts.csswg.org/

[css-nesting] Problem with mixing properties and selectors #8249

Closed plinss closed 1 year ago

plinss commented 1 year ago

As mentioned I'm opening a new issue to outline the objections I raised in the call today and to discuss paths forward.

My primary objection to the current syntax is that mixing properties and selectors without a clear distinction places unacceptable (to me) restrictions on the future design of the CSS language.

An implementation parsing a style declaration will need to presume anything that doesn't start with an identifier or function is a selector, which restricts our ability to ever extend property declarations with anything that doesn't start with an identifier or function. Furthermore, this restricts our ability to ever extend the definition of an identifier or a function. As an example, if this had been implemented first, we could never have added custom properties with their current syntax (which redefined identifiers).

Alternatively, we could limit selector combinators to the current set plus a limited extension path, like /&lt;name&gt;/. This would place restrictions on future selector syntax and potentially add more confusion as to the rules of when a & or the :is() hack is required. Not a fan of this.

I see two paths forward (and welcome other suggestions):

1) We remove the lookahead restrictions on the parser and 'simply' adopt the SASS syntax. The lookahead restriction came about 25 years ago when there were real concerns that CSS would be too slow to ever implement in a browser and everything was focused on performance. I'd like to see some experimentation and real-world data to check that assumption and see if advancements in processor speed and RAM availability allow us to relax that.

2) We add something that clearly distinguishes selectors from properties within a style declaration.

Something like:

div {
  color: red;
  @nest span {
    color: blue;
  }
}

is fine by me, but I accept that this has been proposed in the past and rejected.

A compromise I'd be OK with would be treating a bare @ inside a declaration as the equivalent to @nest (and possibly allowing @nest to be optional for those wanting clarity). This is functionally equivalent to requiring the & (which many people in the polls preferred), but also handles cases where the & isn't the start of the selector without adding lookahead. e.g.:

div {
  color: red;
  @ span {
    color: blue;
  }
  @ &:hover {
    color: green;
  }
  @ section & {
    color: yellow;
  }
}

This leverages the fact that @ is already CSS's universal escape hatch and clearly distinguishes properties and selectors, allowing unrestricted extensions of either in the future. It also minimizes verbosity as the majority of nested selectors can simply start with an @ and requires no other changes or special rules to learn.

tabatkins commented 1 year ago

I think requiring authors to type &:is(div *) is dramatically worse than :is(div) &, fwiw. ^_^ It's certainly not easier to explain the transform necessary there. It's also certainly a worse interim option than :is(div) & in terms of "weird cruft that'll get leftover in code during the interim", which Alan wanted to avoid in the first place.
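A sketch of the two rewrites being compared, assuming the author's goal is the natural (but lookahead-requiring) selector `div &` and a hypothetical `.card` parent:

```css
/* Goal: style .card, but only when it sits inside a div.
   The natural nested form 'div &' starts with an identifier,
   so without lookahead it needs a workaround. */
.card {
  /* Option 3's :is() hack -- keeps the selector's original shape: */
  :is(div) & { color: red; }
  /* the rewrite required if every nested selector must start with '&': */
  &:is(div *) { color: red; }
}
```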

And again, the only reason to avoid things like .foo &, which the current spec allows and which has zero cruft in our ideal future, is if we think that we're going to throw out the last year of discussions and start with some new approach. We don't have any reason to believe that's the case (and I'm strongly opposed to it).

astearns commented 1 year ago

The other reason to postpone .foo & for now is so we do not have to explain that you can use that simple form for .foo but not for div.
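A sketch of the asymmetry being described, under Option 3's rule that a nested selector may not begin with an identifier (selectors here are hypothetical):

```css
.parent {
  .foo { color: red; }       /* allowed: '.' cannot begin a declaration */
  /* div { color: red; } */  /* disallowed: 'div' reads as a property name */
  :is(div) { color: red; }   /* or '& div' -- the workarounds for type selectors */
}
```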

As I have said before, my preference would be not to ship anything yet. But if something must be shipped I think it is reasonable to explore what subset would work best.

Alohci commented 1 year ago

I find it incomprehensible that :is(div) ... and .foo & should be disallowed (or postponed) on the grounds that we don't know whether infinite lookahead is possible or not, because it makes no difference either way.

If infinite lookahead is impossible, then :is(div) ... and .foo & are viable indicators of a selector. If infinite lookahead is possible, then :is(div) ..., .foo &, and div would all be viable indicators of a selector.

What is gained by forcing authors to apply a prefix onto all nested selectors?

plinss commented 1 year ago

Given the speculation about my position it's probably worthwhile for me to be a bit more clear where I stand.

All of the work on nesting up until recently has been done under the presumption that adding lookahead to the parser couldn't be done. To this end there has been a lot of work trying to find a compromise syntax that serves everyone's needs. There has obviously been a lot of work and a lot of creativity put into this effort. I'm not belittling that in the least, but at the end of the day the outcome is that we've fairly well determined that there isn't a solution without lookahead that meets all the requirements. At best we have a compromise solution that I've heard described as "the least worst solution" (which I actually disagree with, but more on that later).

The one thing we haven't spent any energy on (until the last couple of weeks), is exploring the solution of 'simply' relaxing the lookahead restriction. I feel there's broad consensus that doing so does allow us to deliver a nesting syntax that is an optimal solution and makes everyone happy, especially authors who are used to SASS.

The lookahead restriction is over 25 years old at this point, and serves no purpose except parsing performance. I worked on Gecko's first CSS parser and frankly I wondered at the time if that restriction was actually necessary. However, it also wasn't in the way, so there was no need to challenge the assumption. There were also very strong concerns about CSS as a whole being implementable at the time. No browser had yet shipped a conformant implementation of CSS1, and remember, we were working on ~100MHz 486 processors. The consensus within Netscape at the time was that CSS1 selector matching could never be made performant, so anything that impacted performance in CSS was treated with extreme suspicion. We obviously proved that consensus wrong. And since then, other 'performance-killing' features, like :has(), have also turned out to be doable after all.

I strongly believe that lookahead will prove viable, and the work done on #7961 has identified several optimizations, making it even more likely.

My threat of a Formal Objection stems from the current desire to adopt and ship the compromise solution, without having properly explored and ruled out the optimal solution. I feel this would be a mistake and is not an example of the WG using the proper process to find the ideal solution.

So yes, if we try to finalize on option 3 as it stands, before knowing if the lookahead solution is actually possible or not, I will file an FO.

Secondly, I have issues with option 3 as currently defined, specifically the feature where prefixes to nested rules are optional if the rule doesn't start with an identifier or function.

If the lookahead approach proves viable, these issues are moot.

I have two technical reasons to object to that feature:

1) It is an author foot-gun. The intent of the feature is to allow authors to copy/paste rules into a nested context without modification. However, authors will have to carefully review those rules for the ones needing modification, either by adding a prefix or by adding an otherwise unnecessary pseudo-class (which by any definition is a hack). I believe this feature causes more harm than good for this reason alone. Backing up that opinion, we have poll data from authors stating that over 50% of them will not use this feature. This is not an author-friendly feature and violates the priority of constituencies.

2) It narrows the scope of possible future extensions to the CSS language. I'm well aware that some people think the narrowed scope is less of a problem than I do, but regardless, it is something that needs to be taken into account, and I do not believe we have done so properly. Furthermore, given the limited utility of the feature, I feel the benefit is too small to justify the cost.

I also have a process reason to object. The path to getting us where we are has been to narrow down a broad set of possible solutions. I believe that each decision along that path was made in good faith and with good intentions; however, not every one of those decisions was made with a complete understanding of the big picture. Specifically, the language restrictions were not made clear to many of the WG participants until very late in the process, and were not part of any of the public polls. Also, many approaches were ruled out based on opinions, without having fully explored the alternatives, so many of these decisions were made with incomplete information.

Furthermore, the last two times we made major decisions about nesting, it wasn't on the agenda. Adding last-minute discussions is fine, but it's not a good way to make major decisions. People did not have time to properly review information, and we have no idea how many people weren't present because they simply didn't know we were going to discuss this. For example, I looked at the agendas and almost didn't attend either meeting.

To this end, should lookahead prove to not be viable, I feel it is necessary to revisit the alternative approaches, taking the larger picture into account, and see if one of them isn't a better solution overall. I have heard too many times, "we already ruled that out" as an excuse to close down discussions that should have been had.

So, yes, if lookahead proves to not be viable, and the WG resolves to ship the current syntax as-is without re-opening discussions on the alternatives and presenting all the possible options with a list of all of their costs and benefits, I will file an FO.

Given these explanations, I hope it's obvious that my threat of an FO isn't "stop energy" or me simply trying to bully the group into accepting my preferred solution because I dislike where we are. I genuinely feel that our normal process and work mode has broken down and we need to take a breath and reassess.

There are several paths forward that will not result in me filing an FO.

1) We simply wait a few weeks and get real-world data about the feasibility of adding lookahead. Should it prove viable, we're done (modulo the orthogonal issues, which I hope we can resolve in the meanwhile). This is my preferred approach.

2) Should the timeframe to evaluate lookahead prove to take "too long" (with some justification of what that means), we can ship a limited version of nesting without the contentious feature. To this end there have been a few proposals (note that the last two of these are compatible with being morphed into other approaches by making the prefix optional):

a) Require a `&` somewhere in the selector. I feel this is a non-starter because it requires lookahead.

b) Require a `&` prefix on all nested rules and no other use of `&` within the rule. I'm not thrilled about this, but it wouldn't trigger an FO on my part. It can be fixed later if needed.

c) Require some other prefix, like `@nest` or simply `@`. This would be my preferred interim approach as the prefix can be made optional if lookahead ships later, and avoids the other issues if not. I really don't see `@ div` as any worse than `& div`.

3) Should lookahead prove non-viable, we reassess and make a proper consensus decision, reevaluating alternative approaches. If we do this, and everyone acts in good faith, and we still come back to the current version of option 3, I will not file an FO.
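A sketch of how the same nested rules would look under the interim proposals (a)-(c) above (the selectors are hypothetical, and none of these are final syntax):

```css
div {
  color: red;
  /* (a) '&' anywhere in the selector -- needs lookahead to detect: */
  section & { color: yellow; }
  /* (b) '&' required as the first thing in every nested selector: */
  & span { color: blue; }
  /* (c) an at-rule prefix, '@nest' or a bare '@': */
  @nest span { color: green; }
}
```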

So I'm adding an Agenda+ label and request we take up the following on a near-term call:

1) Rescind the resolution made on January 11. I argued against that resolution at the time and still believe it serves no purpose other than to act as fuel for bad-faith arguments and shenanigans like this.

2) Resolve to wait for real data about the viability of lookahead before making decisions to adopt a compromise syntax.

I also want to make clear that the TAG has begun a review of Nesting. The TAG has had discussions, but due to scheduling constraints has not yet had a quorum to make a consensus decision. However, all of the TAG members who have discussed this so far are in agreement that the best path forward is to explore lookahead. There has also been no dissent on the viewpoint that the current approach is an author foot-gun and that some kind of mandatory prefix, or alternative scope for nested rules, would be a superior approach for authors.

So in addition to my FO, be aware that it's likely the TAG will not have a favorable review of the design as it stands.

romainmenke commented 1 year ago

Chrome is shipping nesting in version 112 in ±3 weeks. Are there any updates on this issue, the research in Gecko, ...

romainmenke commented 1 year ago

Chrome is shipping nesting in ±2 weeks. Any chance of getting an update on this?

astearns commented 1 year ago

@plinss given https://github.com/w3c/csswg-drafts/issues/7961#issuecomment-1489883575 and the upcoming breakout session on nesting next week, can we reduce the scope of this issue before the call? I’d like to hash out as much as we can here in the issue ahead of time.

tabatkins commented 1 year ago

I'll let @plinss have the final say, but as far as I can tell the objections raised in the original comment are totally resolved by #7961. (This is the "Option 1" listed in the original comment.)

plinss commented 1 year ago

I can't say #7961 'totally' resolves my objections. But it does remove many of my issues with option 3, and I hope that option 3 is now formally off the table.

The look-ahead approach, while much better than option 3, still has some issues, namely:

While none of these are show-stoppers in and of themselves, we still have the alternative of using an at-rule (several of the proposals so far can even be supported in parallel). The advantages of that approach are:

The disadvantages are:

I still want the WG to take a step back and consider both approaches with fresh eyes, evaluating all the advantages and disadvantages of each. Provided we do that and come to a clear conclusion, I won't object to either path.

FWIW, my vote is still for an at-rule approach. It's not as pretty, but it's safe and easy to explain with no surprises down the road.

plinss commented 1 year ago

Also, regarding the lookahead approach, I just have to say: https://www.youtube.com/watch?v=xSTN3mHEAOA

astearns commented 1 year ago

@plinss while I have no love for the warts in option 3, I don’t think we are in a situation where we can take it off the table. I think it is much more likely that we have to accept the reality that it shipped (too hastily, before we had clear answers for feature detection among other issues, and I think this will need its own section on the CSS Mistakes wiki) and that the restart look-ahead plan merely allows us to improve the syntax in the near future.

You said above that

I'm also fine with shipping option 3 as-is, as soon as we know for a fact that look-ahead is viable, as an interim step until full lookahead can be implemented, because all the down-sides are temporary. (I personally don't think it would be worth shipping, but I wouldn't object to it.)

Are you now walking that back?

plinss commented 1 year ago

No, I'm not walking that back. The point of that remark was that if we found out that look-ahead was viable, but would take time to implement, then I'd be ok with option 3 as a short-term, interim step.

Now that we not only know that look-ahead is viable, but we already have an implementation of it, I see no reason for option 3 to remain in the spec. The fact that vendors have actually prematurely shipped a non-standard implementation is irrelevant. There's no reason to spec option 3 behavior, their existing implementations can remain non-standard.

So, yeah, I believe we can, and should take option 3 off the table and remove it from the spec now, one way or another. It can be replaced with either the look-ahead approach or an at-rule based approach pending the above mentioned discussion we need to have.

tabatkins commented 1 year ago

I agree, we should just switch the spec to the new strategy. I don't see any reason to keep Option 3 in there; it'll just be a speedbump in a few browsers' releases, like many other APIs that had some minor changes after their initial shipping.

css-meeting-bot commented 1 year ago

The CSS Working Group just discussed Unknown.

The full IRC log of that discussion:
<fantasai> Topic: Nesting Syntax
<fantasai> astearns: plinss's suggestion is we put the lookahead version into the spec
<fantasai> ... and remove references that we had to parts that we had based on old parsing mode
<fantasai> ... is that a way forward?
<fantasai> TabAtkins: that's exactly what I want to do
<fantasai> astearns: so can we resolve the problem by saying that we'll specify the new parsing-enabled version only?
<fantasai> plinss: I'm uncomfortable with that, depends how we phrase it
<fantasai> plinss: I'm thrilled that lookahead proved viable. I think this should take Option 3 off the table.
<fantasai> ... we should ship with lookahead, or something else entirely
<fantasai> ... I don't care if there's one STP or whatever that shipped differently, but what we spec
<fantasai> TabAtkins: I agree completely
<fantasai> jensimmons: I don't
<fantasai> plinss: I still think there are issues with lookahead, but it's far better than Option 3
<fantasai> ... I won't FO provided that we take a step back and review all the advantages/disadvantages
<fantasai> ... and reconsider all the options
<fremy> fwiw, the lookahead option would satisfy me
<fantasai> ... and look at the big picture
<fantasai> ... and whatever we decide as a group, I'll be fine with
<fremy> fremy: my concern was that copy and paste would not be viable, but with lookahead it is
<fantasai> jensimmons: I am disappointed that plinss is once again making demands that we do what he wants, and assures us that if we do he'll be OK
<fantasai> plinss: I haven't changed my position
<fantasai> ... I'm demanding that we follow the process and make rational decisions
<fantasai> jensimmons: What is happening is that browsers have implemented what was specified, and are moving forward despite the situation we've found ourselves in in the past several months
<fantasai> ... and I'm a believer that as we evolve CSS and take an idea, work through it, implement and ship it, evolve it, continue to add more to it or make it better, and implement and ship the newer version
<fantasai> ... we have a method to do this, we have levels for specifications
<fantasai> ... and we keep a trail as we go so browsers can make interoperable implementations
<fantasai> ... I don't want to obliterate the spec, because it is shipping. I would like to write up Level 2
<fantasai> ... and browsers can ship that as it's possible to do
<fremy> -1 because they are not compatible
<TabAtkins> (we have a working draft https://www.w3.org/TR/css-nesting-1/)
<fantasai> plinss: we don't have a spec for that, browser shipped outside the W3C process
<fantasai> jensimmons: that's your opinion, don't want to fight about it
<fantasai> plinss: we had contention and two browsers shipped anyway
<fantasai> ... we have no obligation to spec retroactively
<fantasai> astearns: We are out of time
<fantasai> astearns: I would point out that we do have a process for asking the WG "are you ready to ship", and that was not followed in this case
<TabAtkins> We do in fact have an obligation to spec reality. We can try to *nudge* reality, but we're not writing fiction.
<fantasai> ... but we'll have to leave this open. We can have another breakout next week.
<TabAtkins> And we *did* try to follow process, the process was dragged out over several browser's explicit requests to resolve it.
<fantasai> ... Would like ppl to add comments to the issues so we can hash this out and not have new arguments on the call
<fantasai> Meeting closed.
<TabAtkins> The process (the WG) failed to work effectively, and was routed around.
<TabAtkins> And now we're dealing with the consequences. Luckily I made sure that what we went with was compatible with the ideal future, so now moving to the ideal future is easy.
<plinss> And sometimes the process takes longer than people want, but that's still the process, routing around it does not supersede the process
<fremy> To be fair, I will argue the process worked properly here. Without objections, we would not have the lookahead version. It was deemed impossible. Only when so much pushback happened was it properly tested.
<astearns> +1 fremy
<fremy> I think the process yielded a better outcome, but unfortunately some lost faith before it reached its course
<fremy> It's understandable, I would have wanted to ship this too if I was the PM
<fremy> No blame handed out
<fremy> But I think the "work around" was not optimal, here
<TabAtkins> fremy: The "workaround" was required, because the chairs allowed one threatened objection to indefinitely delay resolution. That was, itself, a violation of process, as Jen pointed out in an earlier call when she *cited* the Process and asked for a vote to resolve the logjam, and was rejected by the chairs.
<TabAtkins> More generally, the Process is a way of working toward interop, not a suicide pact. We abide by it because it generally produces good results, and is a useful Schelling fence to ground discussions with. But we very intentionally do not *require* W3C signoff on shipping code in Chrome.
<fantasai> Dude, not only do you not require it, you don't even *ask* for it, even when it would be useful for the CSSWG to have that discussion and even when you would get it. It's really obnoxious.
<fantasai> You ask for TAG signoff, but don't care to ask the CSSWG.
<fantasai> Even when the CSSWG would actually have more useful input.
astearns commented 1 year ago

@jensimmons on the call you expressed concern about the "just switch the spec to the new strategy" option, preferring to keep option 3 in level 1 and creating a new level for the lookahead version. I am wondering what we would gain from that.

As far as I understand, the additional syntax requirements in option 3 would still work with the look-ahead approach (they would just no longer be required). So an option 3 implementation would still conform, just not be complete. This seems good to me - ending up with a specification that addresses more concerns than the current draft spec does.

We do at times have to spec a reverse-engineered version of what is required for web compatibility, but this should be a last resort. I do not think we are at that point yet with this very new feature.

Could you go into more detail about your concerns here in the issue before next week’s breakout?

tabatkins commented 1 year ago

Has issues with error recovery. Since declaration recovery and rule recovery differ, unrecognized rules in declaration blocks will eat the following declaration. This will surprise authors.

This is not the case. Unrecognized rules will stop at the end of the rule (the }), so a following declaration (or rule) will be parsed as normal.

plinss commented 1 year ago

This is not the case. Unrecognized rules will stop at the end of the rule (the }), so a following declaration (or rule) will be parsed as normal.

That might be true in a browser that implements nesting, and manages to detect the rule as a rule and not fall back to treating it as a declaration.

Try this in a browser that doesn't support nesting:

<style>
p {
    color: red;
    span { color: yellow; }
    color: green;
}
</style>
<p>This should be GREEN</p>

or this in a browser that does:

<style>
p {
    color: red;
    custom-element:unknown { color: yellow; }
    color: green;
}
</style>
<p>This should be GREEN</p>

(and if that works, imagine custom-element conflicts with a future property name)

tabatkins commented 1 year ago

Try this in a browser that doesn't support nesting:

Yes, your stylesheet will be incredibly broken if you try and use Nesting in an older browser. Eating the following declaration is the least of your worries, large portions of your sheet simply won't parse.

or this in a browser that does:

Yeah that's fine, it'll parse the rule and stop. (In older browsers it'll eat the next declaration.)

If the element name was a valid property name, that's also fine. We'll try to parse it as a declaration, declare it invalid, then restart and parse as a rule, stopping at the correct place.
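A sketch of the restart behavior described, assuming a parser that retries a failed declaration as a rule (the element name here is the hypothetical one from the earlier example):

```css
p {
  /* 'custom-element' is a valid <ident>, so the parser first attempts
     this as a declaration; the value fails to parse, so it restarts
     and consumes the whole construct as a nested rule instead: */
  custom-element:unknown { color: yellow; }
  color: green; /* still parsed normally, after the rule's closing } */
}
```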

The only problem is if we somehow create a property whose name+value looks like a valid rule followed by a different declaration. This is trivial to avoid with a completely reasonable design restriction (if we ever allow a top-level {} in a property value, only allow it at the whole property value; if we're careful with the grammar we can allow stuff before it), but even without that restriction it's honestly hard to imagine a syntax we could ever design that would actually confuse the parser here.

The one case we have to be careful is with custom properties, since we can't control what they're doing, but that's also fine in practice since custom elements can't have dashed-ident names.

I documented all of this in https://github.com/w3c/csswg-drafts/issues/7961#issuecomment-1490495722. The strategy we're advocating here is very robust, actually.

plinss commented 1 year ago

Yes, your stylesheet will be incredibly broken if you try and use Nesting in an older browser

This is likely true if you blindly refactor an existing stylesheet to use nesting, or try to simply serve an existing SASS stylesheet. However, this 'all-or-nothing' attitude is not necessary and breaks with pretty much every other CSS feature ever added. I'm not saying this is an absolute deal-breaker, but it's not something we should YOLO and take lightly.

There are several situations (think small stylesheets in custom elements) where additional nested rules could absolutely be considered progressive enhancement. We are doing a disservice to users if we ignore those use cases.

The only problem is... / The one case we have to be careful is...

You're making my point for me. I accept that the risk is small, but it's not zero. If we use an at-rule, the risk actually is zero. Yes, we should factor the risk against the likelihood, but even a small percentage is still a lot of web pages as you well know. Furthermore, any risk of conflict is a risk that will narrow future language design choices. My point is that this is a factor and a decision the group should carefully consider, not just blow off because one person thinks it's no big deal.

This issue (like the others) is likely small enough we can consider it a paper-cut, but paper-cuts add up and eventually the bleeding becomes significant. I'm fine with the WG as a whole deciding to accept these issues; I'm not fine with being saddled with these issues without even a rational group discussion.

If we had no alternatives, yeah, we should just proceed with look-ahead, but we do have alternatives. Let's give them a fair trial and not just ignore them because some people slightly preferred something else way back when before we fully understood all the costs/benefits. That's fair, no?

tabatkins commented 1 year ago

This is likely true if you blindly refactor an existing stylesheet to use nesting, or try to simply serve an existing SASS stylesheet. However, this 'all-or-nothing' attitude is not necessary and breaks with pretty much every other CSS feature ever added. I'm not saying this is an absolute deal-breaker, but it's not something we should YOLO and take lightly.

We've done it several times, recently: @layer, @supports, etc. CSS's forward-compatible parsing helps us deal with individual properties upgrading their values over time, and independent at-rules being added to the language, but it fundamentally cannot deal with a new wrapping context. Every single time we add something like this, it means new stylesheets will be inherently broken in old browsers.

(Note tho that this has nothing to do with the precise syntax we choose. Any new syntax for style rules, whether we spell it as a plain rule, as a @nest, or something else, means all such rules will be ignored in older browsers, and your stylesheets will be broken. .foo { color: red; @nest .bar {...} background: blue; } is also broken in exactly the same way; @nest .foo { color: red; .bar {...} background: blue; } is broken even worse since the whole rule gets discarded. There is no reasonable syntax that doesn't significantly break your stylesheet in an older parser.)

Also, if an author does want to limit the theoretical damage, they can either (a) put their properties before the nested rules, as the spec recommends, or (b) put a ; after their rules (the spec doesn't recommend this because it's completely unnecessary if you just put your properties first).
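A sketch of the two defensive patterns mentioned, assuming an older parser whose declaration error recovery skips to the next top-level ';' (selectors are hypothetical):

```css
/* (a) properties before nested rules: an old parser drops only the
   nested rule, since nothing follows it to be eaten: */
.foo {
  color: red;
  background: blue;
  .bar { color: green; }
}

/* (b) a ';' after the nested rule: recovery stops at the semicolon,
   so 'background' survives even though it comes after the rule: */
.foo {
  color: red;
  .bar { color: green; };
  background: blue;
}
```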

There are several situations (think small stylesheets in custom elements) where additional nested rules could absolutely be considered progressive enhancement. We are doing a disservice to users if we ignore those use cases.

I do not understand how this could be the case. Can you give an example of a stylesheet where an author might add nested styles that they are okay with not applying? Why would they write such a stylesheet rather than writing them as unnested rules? Assuming they have such a reason, why would they, specifically, write these with the unnested properties after the nested rules? (If they put the nested rules after, as the spec recommends and common practice follows, then only the nested rules are dropped; the properties in the top-level rule parse as normal.)

My point is that this is a factor and a decision the group should carefully consider, not just blow off because one person thinks it's no big deal.

We did consider it, a year ago. The entire WG discussed it, we resolved that Option 1 was fine with its design constraints, then later that we preferred Option 3's even looser syntax, with attendant slightly higher design constraints. Now we have an even better approach that is both more flexible for authors and less restrictive on future design. Assuming that our previous resolutions weren't an affirmative vote for heavier design restrictions, the previous resolutions hold here; this is a better result that would have satisfied the past WG as well, had we known it was possible at the time.

This WG's work cannot be indefinitely held up by endlessly relitigating existing resolutions without new data. Continuing to imply that this is me, personally, ramming things thru the WG, rather than the entire WG reaching consensus (with a threatened objection) is both incorrect and offensive.

plinss commented 1 year ago

.foo { color: red; @nest .bar {...} background: blue; }

Is not broken, old browsers see the @nest as a rule and accept the background property. Try it.

Also, if an author does want to limit the theoretical damage, they can either (a) put their properties before the nested rules, as the spec recommends

Seriously, how many authors actually read specs? There are all sorts of best practices listed in specs that hardly anyone uses. They mostly cargo-cult things they see elsewhere and there have been tons of common practices in the past that were exactly the wrong thing to do.

It's not about whether knowledgeable authors can limit the damage, it's about the rest of the authors that will be surprised by the unexpected behavior.

Can you give an example of a stylesheet where an author might add nested styles that they are okay with not applying?

div[this=that].something:open {
    span { box-shadow: 5px 5px 5px #888; } /* This is decorative and I don't care if this applies or not, but I don't want to repeat the long parent selector because people said DRY and nesting is cool */
    background: black; /* I didn't read the spec and I like to put nested rules first because I find it easier to read */
    color: white;  /* surprise! white on white text for several billion users on cheap devices that can't upgrade yet that I never considered or tested on */
    ....
}

both more flexible for authors and less restrictive on future design

The look-ahead approach is not more flexible than an at-rule, just slightly less typing, and an at-rule is less restrictive, see above.

This WG's work cannot be indefinitely held up by endlessly relitigating existing resolutions without new data

My point is that there is new data. Several of the points I've brought up recently about language design restrictions and error recovery came as a surprise to several members on recent calls, including those who participated in past decisions. That merits a re-evaluation. Several of the early decisions on syntax were mostly based on preferences and did not weigh all the consequences, because they weren't all well known by those expressing their preference.

Also, we can spend far more time debating whether we should be debating this than simply laying out all the options, the plusses and minuses of each, and just making a call (and not considering inertia or 'shipped' implementations which were never signed off on by the WG). So let's just do that. I'm willing to live with the outcome of that, whatever it is; are you?

Continuing to imply that this is me, personally, ramming things thru the WG, ...

I'm not intending to imply that it's you personally ramming things through the WG, apologies if I came across that way. However several people have been trying to force resolutions on nesting without proper notice or time for discussion. Several decisions have been rushed and several times people were unwilling to revisit past decisions even after new information was brought forward. I'm not the only one who's observed 'go-fever'.

You have been expressing your personal opinions about the relative importance of issues, which is totally fine, but those don't necessarily represent WG consensus (neither do mine), so they can't settle a disagreement on their own. I accept I wasn't part of every historical conversation about nesting, so if I'm interpreting something as a personal opinion where there is WG consensus, please feel free to point me to the minutes and resolutions.

LeaVerou commented 1 year ago

@plinss after reading the recent developments in this thread, I'm a little confused about what you still see as an issue with the currently proposed syntax, apologies if I have missed something. When you were pushing back against option 3 and urging implementers to really research backtracking, I supported this and opened #7961 with a proposed algorithm to minimize perf impact, which served as a starting point, @tabatkins improved on it further, and eventually that discussion resulted in backtracking proving viable, thanks to the incredible efforts of @andruud et al, and we all rejoiced.

But at this point, I just …don't understand what the further pushback is for, or what you are proposing. IMO the current algorithm is better than we could have ever hoped, it's easy to understand for authors, concise to type, fully compatible with @scope, and as compatible with Sass as possible without introducing undesirable warts into the language. Since this is Agenda+, could you please summarize your concerns and your proposed solution to them? Thank you!

plinss commented 1 year ago

@plinss after reading the recent developments in this thread, I'm a little confused about what you still see as an issue with the currently proposed syntax, apologies if I have missed something.

Ok, let me try to be clear. I'm very grateful for all the work that went into the current look-ahead proposal. I believe it's a large improvement over our previous option 3, and I don't see any room for improvement at the moment. I'm prepared to live with it.

However, it still does have issues, see my list above. These are far less severe than option 3's issues, but remain issues.

We also have the option of an alternate approach, using an at-rule, in one (or even several) of the previously proposed forms. That approach eliminates all the issues with the look-ahead approach at the cost of a small amount of typing for authors.

All I'm asking for at this point is a sanity check. If we pull the trigger on the look-ahead approach, there are ramifications that we're going to have to live with for a very long time. It is a parser change at the end of the day, and as I have said previously, those always prove dangerous.

I want the WG to take a beat, look at all the options, consider all the advantages and disadvantages of each, and make a final consensus-driven call, rather than just move ahead due to inertia and exhaustion.

LeaVerou commented 1 year ago

@plinss Thank you for clarifying. I seem to recall several times when the WG debated these at-rule-based syntaxes, and we always resolved against them, even when the alternative was option 3, which is significantly worse than what is currently possible. Why do you think another discussion will resolve differently, now that the alternative is better? Is there new data that might contribute to a different decision?

plinss commented 1 year ago

I'm not asserting that the WG will resolve in favor of an at-rule based approach, or even seriously trying to bring about that outcome (though I do currently favor it slightly). I did propose the look-ahead approach after all, and fought hard to make it happen, so I am obviously fine with it.

My concern is that a non-at-rule based approach has long-term ramifications, there are language constraints and author foot-guns still (though much smaller ones than we had before). And I'm not convinced that all the WG members fully understood those during the previous decisions. As I said above, when I mentioned some of them in calls, there was surprise, so it's obviously new data to some that may have changed the prior outcomes, and is therefore cause to at least re-confirm the prior decisions in light of this new data.

I really just want affirmative consent from the WG, given the current understanding of all the costs, before we continue down the current path. When judging those costs, I feel it's essential to consider them in light of an alternative approach, because this isn't a case of 'look-ahead or nothing'; we do have another option and nothing is stopping us from taking it except ourselves.

At the end of the day, if we do an at-rule based approach now, we can always add the look-ahead approach later and make the at-rule optional. (I'd also feel a lot more comfortable making that call later when things have settled down and emotions and stakes aren't running as high as they are now, making decisions under pressure tends to run towards bad outcomes.) Once look-ahead ships (for real, with WG consent), we can never walk it back if we later find the costs are too high and will be stuck with them for the foreseeable future.

As I said, once again, parser changes are dangerous, we need to treat them with care. While an at-rule solution won't make as many people happy today, it's a lot safer, won't potentially cause problems down the road, and doesn't preclude us taking the next step later.

FremyCompany commented 1 year ago

FWIW, I also do not think an at-rule approach has any future. The fact that it was rejected in the working group several times is not a cause of this, but just a symptom of the real cause. The whole point of the nesting feature is to be easier to type and maintain than repetition, so adding an extra indentation level and a sizeable prefix will meet strong opposition from authors, because it defeats the point of the feature. They will continue to use a preprocessor like Sass, Less, or SCSS to compile their code because those will offer a better alternative. If we do not provide authors with a solution at least as elegant as these engines, we might as well not provide anything.

The look-ahead approach, while much better than option 3, still has some issues, namely:

  • Requires parser changes - these have bitten us in the past more often than not and must not be taken lightly.

I would argue that the parser changes proposed here are marginal, and will not affect current authors at all.

  • Still has some restrictions on future syntax, they are much smaller than option 3, but they exist.

Not being able to add a top-level bracket in a CSS property value in a position other than the first is not a real restriction; we would not consider this at all. I would argue that we should never introduce top-level brackets in CSS property values at all: it is un-CSS-like and confusing. We should at the very least always wrap them in a function.

  • Has issues with error recovery. Since declaration recovery and rule recovery differ, unrecognized rules in declaration blocks will eat the following declaration. This will surprise authors.

Do you have an example?

  • We still need to define feature detection somehow.

I agree, but this is orthogonal. There is currently no way to detect the support of an at-rule either.

  • The syntax, while SASS-like, doesn't have quite the same behavior.

I am not that familiar with SASS, but can someone explain what those differences are?

plinss commented 1 year ago

The whole point of the nesting feature is to be easier to type and maintain than repetition, so adding an extra indentation level and a sizeable prefix will get a strong opposition from authors, because it's defeating the point of the feature

The repetition we save is in the selector list, which can be huge, that doesn't change with an at-rule or any other required prefix. Adding an & wasn't seen as an unacceptable burden, I don't see why 4 extra characters would make it so. Extra indentation isn't necessary (may be desired in some cases) but hardly defeats the point of nesting. That's a bit hyperbolic. Also, @nest isn't all that large as far as at-rule names go, we could even make it shorter if 5 characters is an untenable burden at the cost of readability. (I wish we could use a bare @ but older browsers don't treat that as an at-rule.)

There are also multiple forms an at-rule approach can take, and as I said, we can support several or all of them simultaneously.

For example: 1) @nest [selector-list] { [declaration-block] }, allowed inside declaration blocks. No extra indenting.

div {
  background: blue;
  @nest span {
    background: green;
  }
  @nest b {
    background: yellow;
  }
}

2) @nest [selector-list] { [list-of-nested-rules-only] }, allowed at top level only. Adds potential indenting for primary rule declarations.

@nest div {
  {
    background: blue;
  }
  span {
    background: green;
  }
  b {
    background: yellow;
  }
}

3) @nest { [list-of-nested-rules-only] }, allowed inside declaration blocks only. Adds potential indenting for nested rules.

div {
  background: blue;
  @nest {
    span {
      background: green;
    }
    b {
      background: yellow;
    }
  }
}

All of the above provide author flexibility and avoid the mixed rules and declarations problem.

I would argue that the parser changes proposed here are marginal, and will not affect current authors at all.

The overall concern about parser changes isn't about impact to authors, it's about impact to the language. The real risk is always things we're not foreseeing now that will come around and bite us later when we try to add something new.

Do you have an example?

See above and several other conversations about how declaration error recovery and rule error recovery differ. By putting rules in a place where declarations are expected, we run the risk of using the wrong recovery mode, which can eat valid following declarations and lead to site breakage in ways that will surprise authors.

The primary benefit of an at-rule approach is that all browsers that ever supported CSS drop into rule recovery mode when they see an at-rule in a declaration block, so there are no side-effects or surprises to authors. It's been the extension mechanism from day 1, and we're proposing to not use it.
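To make the failure mode concrete, here is a toy sketch in JavaScript (not a real CSS tokenizer or parser; the token shapes and names are invented purely for illustration). Declaration-style error recovery skips ahead to the next semicolon at brace depth zero, and a {} block is swallowed whole as a single component value, so a misparsed nested rule can consume the valid declaration that follows it:

```javascript
// Hypothetical sketch of declaration-style error recovery (NOT a real CSS
// parser): on error, skip to the next semicolon at brace depth 0; a {}
// block is consumed as one unit, so its inner semicolons don't end the skip.
function skipAfterDeclarationError(tokens, start) {
  let depth = 0;
  for (let i = start; i < tokens.length; i++) {
    const t = tokens[i];
    if (t === "{") depth++;
    else if (t === "}") depth--;
    else if (t === ";" && depth === 0) return i + 1; // resume parsing here
  }
  return tokens.length; // hit end of block without finding a recovery point
}

// Rough token stream for:  span { color: blue; } background: green;
// A non-nesting parser errors at "span" (not a valid declaration start)...
const tokens = ["span", "{", "color:blue", ";", "}", "background:green", ";"];
const resume = skipAfterDeclarationError(tokens, 0);
// ...and recovery consumes everything through the ';' after background:green,
// so the valid background declaration is eaten along with the dropped rule.
```

This is only a model of the skipping behavior being discussed; real engines operate on proper CSS tokens and component values.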

I agree, but this is orthogonal. There is currently no way to detect the support of an at-rule either.

It's orthogonal, but shipping nesting without feature detection makes the nesting feature useless for anyone who cares about supporting older browsers, especially during the transition phase as the feature rolls out and isn't universally available (we really don't want to encourage 'best viewed in X' again, do we?). Authors will either ship only nested stylesheets and break their sites for older browsers; ship both, making the user pay unnecessary download costs (violating one of the TAG's ethical principles); or just use a transpiler and ship un-nested stylesheets like they already do today, and we don't need to bother with the feature.

@supports (@nest) or @supports at-rule(nest) would be trivial to add and obvious to authors. We currently don't have a good answer for how to feature-detect the look-ahead version. So this is a factor to consider.

FremyCompany commented 1 year ago

Authors will either ship only nested stylesheets and break their sites for older browsers; ship both, making the user pay unnecessary download costs (violating one of the TAG's ethical principles); or just use a transpiler and ship un-nested stylesheets like they already do today, and we don't need to bother with the feature.

Even if you could detect support client-side with @supports (@nest), it does not change this equilibrium. You either ship the code twice, or you don't ship either version at all. Or you ship the legacy one in an @import, which makes legacy browsers download more and have more latency. There is no way to win here.

But this is nothing new. JavaScript has had breaking syntactic changes over the years. And, yes, for a while, you have to rely on UA sniffing to decide whether to ship the (shorter) modern code or its desugared version. Authors have been successfully doing this for years now. There is little downside to getting this wrong, too: your site will work in both modes, so at worst you are missing out on saving some network bytes in a browser that supports nesting without your knowing it.

At the current rate at which browsers ship, in less than 5 years, 99.9% of browsers will support nesting natively and you will no longer need the transpiler at all. Just like everyone feels free to use ES5 today.

astearns commented 1 year ago

With my chair hat off, I am not convinced that we have enough new information to reconsider a wide range of previous resolutions on nesting. The main new information we have is that look-ahead is viable, which to my mind only strengthens the option 1->3->look-ahead evolution. It was consistently preferred (flaws and all) over other proposals, and the look-ahead improvement is likely to increase that preference.

My impression is that we have discussed error recovery issues, restrictions on future syntax, differences between our nesting and SASS, and the perils of changing parsing. Aside from how long it took to get someone to investigate parsing changes, I think the current proposal has been adequately defended against those challenges. I am not aware of new information on any of these that would change my mind, at least.

I am concerned about adequate feature detection, and I am still unclear how authors are meant to use the current proposal in projects that support pre-nesting browsers. But I believe this is a problem for all of the alternatives I have seen.

All this to say I don’t yet see enough evidence in this issue that we should change course. That’s just my personal opinion, and of course I welcome attempts to change my mind.

With my chair hat back on, I am going to move the question of whether the current draft should be updated to the look-ahead-enabled syntax. Since both of those options do mix properties and selectors, that question is I think separable from this issue. I have added a comment to 7961 which seems like the right place for that resolution.

plinss commented 1 year ago

My impression is that we have discussed error recovery issues

If that's the consensus, then so be it, I'll stop complaining, but before we get there, let me point to evidence that error recovery at least hasn't been fully considered in past decisions. See in this thread alone: @FremyCompany asking for examples of the error recovery problem (indicating he's not intrinsically familiar with the issue), yet expressing a strong bias against at-rules; Tab (who I know fully understands the issue) and myself both still getting error recovery behavior wrong during conversations; several people expressing surprise during relatively recent calls about the error recovery and syntax restriction issues (well after at-rules were rejected).

My argument is that sufficient evidence exists that all the factors weren't taken into account during previous decisions to warrant at least a 5-minute overview of the relative costs and benefits of the two approaches and a simple sanity-check resolution, confirming we all know what we're buying into. Once again, I'm not advocating against adopting the look-ahead behavior, I just want us to do it with our eyes open and decisions made properly.

If the choice were simply a mandatory @nest prefix or not, with no other issues or consequences, I wouldn't even be raising this, of course we wouldn't require the prefix. But the choice is between look-ahead, which has other costs and issues, and @nest that's slightly more verbose, but has no other issues.

plinss commented 1 year ago

And, yes, for a while, you have to rely on UA sniffing to decide whether to ship the (shorter) modern code or its desugared version.

You're not seriously suggesting UA sniffing is the proper way to deploy this feature, are you? I'm somehow thinking that's not going to pass TAG review.

One could argue client hints would be appropriate, but then work needs to be done to specify and implement that.

LeaVerou commented 1 year ago

And, yes, for a while, you have to rely on UA sniffing to decide whether to ship the (shorter) modern code or its desugared version.

I suspect that rather than UA sniffing, authors will just wait a bit until Nesting is supported in a reasonable percentage of browsers, then detect it client side and load an additional stylesheet only for those browsers that don't support Nesting. I believe the styling you get from a stylesheet that uses Nesting in an older browser is a proper subset of all the rules, so that would be a largely smooth progression, it would just involve 2x the download for older browsers, which is a commonly acceptable tradeoff once the set of nonsupporting browsers for a feature becomes small enough. In fact, build tools may even evolve to only ship the nested rules rewritten, so they could work as supplementary and not load the root rules twice (though this may have different results due to rules being out of order, but perhaps the tools would be smart enough to detect that).
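One way that client-side detection might look is sketched below. This is a hypothetical illustration: the `"selector(&)"` probe and the fallback filename are assumptions, and the actual feature-detection mechanism was still an open question at this point in the thread. The `supportsFn` parameter is injectable so the decision logic can run outside a browser; in a browser it would default to CSS.supports.

```javascript
// Hypothetical sketch: load a fallback (un-nested) stylesheet only in
// browsers without native nesting. Probe string and file name are assumed.
function needsNestingFallback(supportsFn) {
  const supports =
    supportsFn ?? ((q) => typeof CSS !== "undefined" && CSS.supports(q));
  // "selector(&)" is one plausible probe: "&" only parses as a selector
  // in engines that implement the nesting syntax.
  return !supports("selector(&)");
}

// In a browser, usage might look like:
//   if (needsNestingFallback()) {
//     const link = document.createElement("link");
//     link.rel = "stylesheet";
//     link.href = "/styles-unnested.css"; // hypothetical transpiled fallback
//     document.head.append(link);
//   }
```

The design tradeoff Lea describes holds either way: nonsupporting browsers pay for a second download, which is only acceptable once that set is small.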

romainmenke commented 1 year ago

I don't want to take this thread even further off topic, because the focus here isn't how authors will deal with the transition period, but there are too many inaccuracies here :)


https://github.com/w3c/csswg-drafts/issues/8249#issuecomment-1501062075

But this is nothing new. Javascript has had breaking syntactic changes all over the years. And, yes, for a while, you have to rely on UA sniffing to decide whether to ship the (shorter) modern code or its desugared version. Authors have been successfully doing this for years now.

This isn't exactly true. Very few people roll their own UA sniffing. Most ship a single bundle targeted at the oldest browsers they need to support.

Others rely on services like polyfill.io, which abstract away the UA sniffing. UA sniffing is a very complex problem to solve because browsers lie, have bugs, and have recently made privacy-preserving changes.


https://github.com/w3c/csswg-drafts/issues/8249#issuecomment-1501062075

At the current rate at which browsers ship, in less than 5 years, 99.9% of browsers will support nesting natively and you will no longer need the transpiler at all. Just like everyone feels free to use ES5 today.

This is not true. Not even flex has 99.9% adoption and that shipped 10 years ago.


https://github.com/w3c/csswg-drafts/issues/8249#issuecomment-1500997885

just use a transpiler and ship un-nested stylesheets like they already do today, and we don't need to bother with the feature.

Authors don't want to use tools for anything. If nesting never ships it will also never reach the point when tools become unneeded.


https://github.com/w3c/csswg-drafts/issues/8249#issuecomment-1501737869

I suspect that rather than UA sniffing, authors will just wait a bit until Nesting is supported in a reasonable percentage of browsers, then detect it client side and load an additional stylesheet only for those browsers that don't support Nesting. I believe the styling you get from a stylesheet that uses Nesting in an older browser is a proper subset of all the rules, so that would be a largely smooth progression, it would just involve 2x the download for older browsers, which is a commonly acceptable tradeoff once the set of nonsupporting browsers for a feature becomes small enough.

Authors typically do not like a change that will both slow down their project for end users and add to the complexity of their stack. Even when it only causes perf issues for users on older browsers.


How authors will deal with the transition period is not the issue imho. This is a solved problem.

Nesting is just syntactic sugar, so authors don't lose out on much by transpiling.

They will simply continue to transpile with existing tools and it will be rare for anyone to actually ship nested css. They will continue to do this for a long time because the benefits of transpiling will continue to outweigh the drawbacks.

This is not a bad thing. There simply already is a trivial way for authors to write standard/native css and support the older browser version that their project requires.

LeaVerou commented 1 year ago

I feel we need a new thread to discuss how authors will handle the transition period, and what we can do to improve that experience.

But, some replies to @romainmenke above:

This is not true. Not even flex has 99.9% adoption and that shipped 10 years ago.

It has 99.23%, which is pretty close. And authors do use Flex (and grid) without shipping alternatives. They don't generally need features to reach 99.9% support to use them without backup, anything over ~93% or so tends to be considered good enough for that in my experience. (and let's not forget that even font-size only has 96% (!))

Authors don't want to use tools for anything. If nesting never ships it will also never reach the point when tools become unneeded.

I'm a bit confused at this. The rest of your comment seems to be making the point that authors will simply never use native Nesting and will just continue to preprocess until the end of time. Here you're saying authors don't want to use tools for anything. Will they, or won't they use tools to transpile Nesting?

Authors typically do not like a change that will both slow down their project for end users and add to the complexity of their stack. Even when it only causes perf issues for users on older browsers.

I disagree.

romainmenke commented 1 year ago

@LeaVerou I think the tone/intent of my comment got lost. I think we are saying similar things.

I feel we need a new thread to discuss how authors will handle the transition period, and what we can do to improve that experience.

Yes, that would be better than taking this further off topic :)


And authors do use Flex (and grid) without shipping alternatives. They don't generally need features to reach 99.9% support to use them without backup

The incorrect statement was that any feature would reach 99.9% in five years. If numbers are used to back things up, it is important that these are somewhat accurate.

I didn't state anything about what 99.9% or any other level of support means for CSS authors, only that 99.9% in five years will be extremely unlikely.


Authors don't want to use tools for anything. If nesting never ships it will also never reach the point when tools become unneeded.

I'm a bit confused at this. The rest of your comment seems to be making the point that authors will simply never use native Nesting and will just continue to preprocess until the end of time. Here you're saying authors don't want to use tools for anything. Will they, or won't they use tools to transpile Nesting?

This was in response to :

Authors will either ship only nested stylesheets and break their sites for older browsers; ship both, making the user pay unnecessary download costs (violating one of the TAG's ethical principles); or just use a transpiler and ship un-nested stylesheets like they already do today, and we don't need to bother with the feature.

Arguing that the tools required during the transition period make the entire feature redundant doesn't make sense, exactly for the reasons you listed.


Authors typically do not like a change that will both slow down their project for end users and add to the complexity of their stack. Even when it only causes perf issues for users on older browsers.

I should have elaborated a bit more here.

If we imagine that your proposal is supported by tooling, authors will have two choices :

  1. produce a single, transpiled stylesheet during the transition period
  2. produce two stylesheets, one for modern and one for older browsers during the transition period

Option 1 doesn't require CSS authors to make any other changes to their project. Simply pass the stylesheet to any tool that can desugar nesting, and what comes out will work everywhere. After the transition period the transpiling tool can be disabled/uninstalled.

Option 2 requires CSS Authors to:

Option 1 is simpler and given that nesting is purely syntactic sugar there are few drawbacks to transpiling during the transition period.

Most transpiling tools make it trivial to skip certain transforms during active/local development, which gives the benefits of rapid prototyping as you mentioned.


If this were true, nobody would be using polyfills, which are regularly much slower than writing code without the new technology that is being polyfilled in the first place. With polyfills one typically exchanges speed in modern browsers and codebase simplicity for slowness in older browsers, and this is a tradeoff I've seen authors make over and over.

The choices I am weighing are :

When the second option is also more complex to set up, it doesn't make much sense to go down that road purely for the transition period.

Some authors might choose to do so, and that is fine.

My argument was not that other strategies are invalid, only that a simpler alternative already exists : transpile for everyone for as long as required for a given project.


When I mention transpile or transpiler I am never talking about Sass, only about tools that aim to desugar standard css nesting.

astearns commented 1 year ago

Perhaps we can move the transition discussion to https://github.com/w3c/csswg-drafts/issues/8399? Or should it be a completely new issue?

css-meeting-bot commented 1 year ago

The CSS Working Group just discussed [css-nesting] Problem with mixing properties and selectors, and agreed to the following:

The full IRC log of that discussion:
<fantasai> scribenick: fantasai
<fantasai> astearns: We have this one issue, so I want to set aside the question of feature detection and down-compat
<fantasai> ... and also set aside question of Option 3 vs other SASS-like syntaxes
<fantasai> ... and go directly to, are we on the right path on a SASS-like syntax
<fantasai> ... or should we consider some other type of syntax?
<fantasai> astearns: What I would like to do is look at the argument being made in this issue around the drawbacks of a SASS-like syntax
<fantasai> ... and make the strongest case that we can, without arguing against anything yet, without coming up with counter-arguments, just going with the strongest argument we can for "we're on the wrong path" and see if anyone on the call agrees we should be looking at alternatives at this point
<fantasai> ... plinss, does this sound like a fair way forward?
<fantasai> plinss: yes, but we have to discuss the drawbacks of existing SASS
<fantasai> astearns: I'm suggesting we identify the drawbacks, but don't argue them
<fantasai> ... come up with the strongest presentation of the drawbacks, before we tear them apart
<fantasai> ... no counter-arguments yet
<fantasai> astearns: Issue is about SASS-like syntax vs something else (such as an at-rule)
<fantasai> ... plinss has identified 4 drawbacks for a SASS-like syntax
<fantasai> astearns: Issues with error-recovery
<fantasai> astearns: Restrictions on future-syntax
<fantasai> astearns: Doing something SASS-like, but not actually SASSy (concatenation)
<fantasai> astearns: Ideal syntax requires changes to parser model
<fantasai> astearns: With that summary, is there anything someone wants to add that would strengthen this list of concerns?
<fantasai> plinss: Want to clarify my position
<fantasai> ... I didn't like the proposed direction
<fantasai> ... but because I have concerns
<fantasai> ... I like how we're going, but I also think the at-rule approach has merits because it doesn't have the drawbacks
<fantasai> ... My other concern is that, I think we look at both issues, and we've said "it's not great but it's not that, but if it's an at-rule or SASS-like, we chose SASS-like"
<fantasai> ... I'm not sure we've actually looked at all the issues together, and asked, in aggregate, are we still on the right path
<fantasai> ... Rather than looking at each decision in isolation
<fantasai> ... I think the issues are manageable, any one of them is not a reason to reject; but together might be enough
<fantasai> ... I'm leaning towards at-rule because it's safer
<fantasai> ... once we go down path of SASS-like syntax, we're stuck with its drawbacks forever
<fantasai> astearns: Of everyone on this call, is there anyone who has the same concerns?
<plinss> https://github.com/w3c/csswg-drafts/issues/8249#issuecomment-1496776853
<fantasai> SebastianZ: Thread is long, can we point to the comment summarizing the concerns?
<fantasai> astearns: We can get into that, but we risk running into an isolated argument
<fantasai> SebastianZ: I can see the point that plinss is outlining
<fantasai> ... these are problems
<fantasai> ... and error-recovery is the most important one
<TabAtkins> q+ about error recovery
<fantasai> ... which would be no problem if we went the at-rule route
<TabAtkins> Zakim, shut up
<fantasai> plinss: Just want to confirm that everyone understands the issues, e.g. what do I mean by error-recovery problem
<fantasai> astearns: OK, let's going to details of error-recovery
<fantasai> ... but keep going with approach of making the strongest possible case
<fantasai> astearns: In my opinion, the worst bit of error-recovery problem is, the current proposal allows for rules to be dropped in error-recovery in a mixed selector-and-property syntax
<fantasai> ... that would not otherwise be dropped
<fantasai> ... so the order in which you declare your properties and nested selectors
<fantasai> ... can cause parts of the stylesheet to be ignored
<fantasai> plinss: If you have a rule that's dropped, it can eat a following declaration
<fantasai> ... normally within a declaration block, we're in declaration recovery
<TabAtkins> q+
<fantasai> ... if we mix with rules, they have different error-recovery behavior
<astearns> ack TabAtkins
<fantasai> ... if you're recovering from a declaration, but you're parsing a rule, you're going to eat the next declaration
<fantasai> TabAtkins: went through details of error-recovery
<fantasai> ... if talking about eating of things unexpectedly, there are two potential ways
<fantasai> ... One is an old browser seeing nesting code
<fantasai> ... firstly, this will wreck your stylesheet anyway
<fantasai> ... You can construct an artificial scenario where optional things are nested, but it's very narrow and strange
<fantasai> ... usually necessary rules are written there; if you lose them, your page is broken
<fantasai> ... The fact that the following declaration might get eaten in addition to rules being dropped, you have a problem anyway
<fantasai> TabAtkins: the other concern is in a browser that does understand nesting, what can happen if you have invalid rule followed by a declaration
<fantasai> ... or invalid declaration followed by a rule
<fantasai> ... details in the post
<fantasai> ... but with parsing change that's already in the Syntax spec
<fantasai> ... where a nested rule -- if we see a semicolon in the prelude (before {}), we immediately stop and throw it out
<fantasai> ... with that change, it's a stable, predictable amount of things being thrown out
<fantasai> ... if we parse an invalid rule, we read until we get to the semicolon after the rule
<fantasai> ... restart, try to parse as a rule, and then [missed]
<fantasai> ... The other way around, invalid declaration followed by invalid rule
<fantasai> ... we first parse until invalid declaration ends (;) , then restart as a rule but abort on the semicolon again anyway
<fantasai> ... and then parse the rule afterwards
<fantasai> ... There's a small corner-case around custom properties, because you can put anything in them and might have something that looks like a rule inside, or [missed2]
<fantasai> ... but even an at-rule based syntax will interfere with custom properties in that way, so either way we have a problem
<fantasai> ... if you do something weird enough
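As an illustration of the custom-property corner case being described (hypothetical property and selector names), a custom property value may legally contain braces, which can superficially resemble a nested rule:

```css
.widget {
  --shadow-config: { "blur": 4, "spread": 2 };  /* braces inside a custom property value */
  & .label { color: gray; }  /* a parser must not confuse the value above with a rule like this */
}
```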
<fantasai> TabAtkins: so in a nesting-capable browser, there's very little possibility of eating a following declaration by accident
<fantasai> ... and in nesting-incapable browser, you have major problems anyway; and you should be putting your declarations at the top anyway, which avoids all such problems
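A minimal sketch of the downlevel scenario under debate here (hypothetical selectors): in a browser without nesting support, the nested rule fails to parse as a declaration, and declaration error recovery skips to the next top-level semicolon, which lies past the following declaration:

```css
.card {
  color: red;
  & .title { font-weight: bold; }  /* dropped by a non-nesting browser... */
  padding: 1em;                    /* ...which also eats this declaration during recovery */
}
```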
<fantasai> astearns: wanted to avoid counter-arguments ...
<fantasai> ... SebastianZ, enough explanation?
<fantasai> SebastianZ: yes
<fantasai> plinss: I'd like to counter
<fantasai> ... Nothing Tab said that's incorrect, except we differ on relative importance
<fantasai> ... Obviously if someone buys into nesting and they go all-in, yes the entire stylesheet is b0rked and need not go into details
<fantasai> ... but CSS is designed for progressive enhancement
<fantasai> ... a lot of websites, it isn't a single person writing a stylesheet; all aggregated together
<fantasai> ... you might have one person sprinkling a little nesting here and there
<dbaron> Scribe+ dbaron
<fantasai> ... and they are very likely to not read the spec and know that nested rule should go last
<fantasai> ... and might end up dropping a significant declaration
<jensimmons> q+
<fantasai> ... and it's also entirely possible that they won't see that, because they're on a modern machine, and not on old low-powered device
<TabAtkins> Again the nested rule is *also* getting dropped. Page is already broken.
<fantasai> ... so I think the likelihood is higher than Tab thinks
<fantasai> plinss: Wrt browsers that do support nesting, I agree that the current approach is robust except wrt custom properties
<fantasai> ... my concern is what happens in the future? If we want to add some new capability, new combinator or other strange syntax
<fantasai> ... might restrict ourselves from what we'd do otherwise, or have a risk of some problems
<astearns> ack fantasai
<fantasai> astearns: that's example of how things are tangled up
<dbaron> fantasai: I wanted to address question of downlevel clients and progressive enhancement
<dbaron> ... I think authors will adapt practice of putting nested rules after declarations, generally what they do already. Not too concerned about that case.
<dbaron> ... Usually not going to be doing this in progressive enhancement way, and if so follow best practice.
<dbaron> ... more likely to see nested rules inside of @-rules.
<dbaron> ... unlikely to see progressive enhancement of bare nested rules followed by a declaration. Not too worried about that case.
<matthieudubet> q+
<fantasai> astearns: when I look at people's PostCSS examples, I immediately found declarations coming after rules; it's common
<fantasai> plinss: I agree the risk is small, percentage is low, and web is vast
<fantasai> ... can still affect millions of people
<astearns> ack jensimmons
<fantasai> jensimmons: I have a question, because even though this issue is very long, there's not a simple, clear explanation that I could find of what gets eaten.
<fantasai> ... so here's the question
<fantasai> ... could someone very simply explain what gets eaten?
<TabAtkins> here's the maximum dangerous situation: `<new-wacky-property-syntax>: {...} more-stuff;`. If they write this invalidly, or use in a Nesting-capable browser that doesn't understand the new syntax, what's the maximum damage?
<fantasai> ... perhaps focused on browsers that do support nesting
<fantasai> plinss: in a browser that does support nesting, very little chance of things getting mis-eaten, except custom properties
<fantasai> ... bigger concern is down-level clients
<fantasai> ... because what gets eaten is a rule and the following declaration
<fantasai> ... if I start a rule, followed by a declaration, and the browser doesn't understand that the rule is a rule and parse it as such
<fantasai> ... it will go into declaration error-recovery, and eat the rule *and* the next declaration (up to the semicolon)
<fantasai> jensimmons: And in browsers that do support nesting, is there something that gets eaten?
<matthieudubet> q-
<fantasai> astearns: only in custom properties where the custom property value is using a brace
<fantasai> jensimmons: So if using nesting, and the thing they nest is something that doesn't make sense, e.g. misspell the selector
<matthieudubet> q+
<TabAtkins> Damage is: since decl parsing failed, we restart as rule. We parse until the {}, and stop. Then we'll start parsing the stuff *after* the {} fresh. It's *theoretically possible* for this to be mistaken as a valid declaration or rule, rather than the suffix of an invalid declaration. But as long as we (the CSSWG) only design top-level {}-in-declarations to be the whole value, this isn't a problem.
<fantasai> ... what happens to the rules after that nesting thing?
<fantasai> astearns: I believe we're ok. Malformed thing gets dropped, and in browser that supports nesting everything after the dropped rule is retained
<fantasai> ... right?
<fantasai> plinss: I don't think there's a situation where we drop a rule [...]
<fantasai> ... what gets dropped is a declaration that doesn't need to be dropped
<fantasai> jensimmons: in browsers that don't support nesting
<SebastianZ> q+
<fantasai> plinss: right. Also in browsers that do, but in that case much more of a corner case. But in downlevel browsers much more common
<fantasai> jensimmons: so in Nesting-supported browsers, would only have a problem in malformed case or a particular strange custom property value
<fantasai> TabAtkins: After nesting is supported, it's not possible to eat a following declaration after anything invalid
<fantasai> ... but if something is invalid, we might parse as a rule and the leftover stuff (that we didn't get to) could maybe be interpreted as a declaration
<fantasai> ... but we shouldn't need to introduce such constructs
<fantasai> plinss: Something that would have been interpreted as a preceding rule, you wind up with the tail end of that interpreted as another declaration or rule.
<fantasai> ... agree that's a corner case
<fantasai> plinss: In the future, we might end up introducing problems
<fantasai> TabAtkins: we need two restrictions, one which is already in the spec, to avoid that
<astearns> ack matthieudubet
<fantasai> matthieudubet: If we mandate in the nesting syntax to have declarations first, then style rules, so you can't mix, don't we get rid of the issues?
<fantasai> matthieudubet: since we know we can't mix, we don't have the issue
<fantasai> matthieudubet: this is similar to Option 4, two blocks, but you don't write the braces
<fantasai> matthieudubet: declarations first, and style rules after
<fantasai> plinss: There's no rule we can write that will change existing browsers' behavior
<fantasai> ... so even if we say it's invalid to put a declaration later, an older browser will be in declaration mode
<fantasai> matthieudubet: but there's less risk of this happening, because nobody would want to write such a thing (since not supported in new browsers either)
<fantasai> ... it's the same error-recovery, but you mitigate the risk of having this in actual style sheets
<jensimmons> q?
<fantasai> plinss: I think that's a valid approach... not sure that restriction on the SASS-like syntax is worth the benefit
<fantasai> ... but it is a viable path
<emilio> q+
<fantasai> SebastianZ: So if I understand correctly, authors could put a semicolon after the rule, and the next declaration wouldn't get eaten
<fantasai> TabAtkins: correct
<astearns> ack SebastianZ
<fantasai> SebastianZ: So just for clarification, one could use that to support all the browsers that don't support nesting
<jensimmons> q+
<fantasai> ... of course they're still skipping the nested rule, but at least we don't have the next declaration getting eaten
<astearns> ack emilio
<fantasai> plinss: We could mandate a semicolon after nested rules, to force doing that everywhere
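The mitigation being discussed, sketched with hypothetical selectors — the extra semicolon after the nested rule terminates a downlevel browser's error recovery before it reaches the next declaration:

```css
.foo {
  color: red;
  & .bar { color: blue; };  /* trailing semicolon: old browsers stop skipping here */
  padding: 1em;             /* so this declaration survives even without nesting support */
}
```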
<fantasai> emilio: I think I like the declarations to be forced before
<fantasai> ... especially since the behavior of interleaving is the same as if they were sorted before anyway
<fantasai> ... if that satisfies everyone, then I think it's best
<fantasai> ... mostly because that way we don't have the weird problem of explaining
<fantasai> ... you put this after the nested rule, but the browser created an anonymous rule that put all the declarations first
<fantasai> ... and acts as if you had written it before
<fantasai> ... so I think that might be nice
<TabAtkins> that's the parser switch that all the impls hated before...
<astearns> ack jensimmons
<fantasai> +1 emilio, this clarifies the cascade
<fantasai> jensimmons: I don't like the idea of requiring that declarations go before nested rules
<fantasai> ... like plinss says, people jam CSS into things all over the place
<fantasai> ... it will make things less useful
<emilio> q+
<fantasai> ... and I always prefer to go for the better choice long-term and deal with limitations short-term when redesigning language
<fantasai> jensimmons: wrt transition, my sense is that majority of authors won't use Nesting for 3-4 years
<fantasai> ... those who do use it sooner, what I have been seeing is that they're using a preprocessor to do it
<fantasai> ... so writing their own CSS nested, and then using some automated stack to process it back out into old-school CSS
<fantasai> ... similar to using SASS
<fantasai> ... I think that might really take off, the tools ppl build for that will be popular
<TabAtkins> in particular, "no properties after rules" means at some point we need to decide we're "in rules". This requires some way of detecting "oh i tried to parse this as a prop but failed, looks like a rule" that won't misfire in weird cases.
<fantasai> ... and they will preprocess it for older browsers
<TabAtkins> the end result is actually *at least* as complex as today, possibly more
<fantasai> ... I don't think there will be an epidemic of ppl writing progressive nested rules, they're too lazy to do that
<fantasai> ... they already don't think about progressive enhancement nearly enough
<fantasai> emilio: Reason I think this restriction is nice, not just for parsing, but let's say you don't know how nesting is specced
<astearns> q+
<TabAtkins> q+
<fantasai> ... you put media query inside and then more declarations at the bottom
<fantasai> ... that won't work like you expect
<astearns> ack emilio
<astearns> q--
<fantasai> ... because the declarations are shifted up, and in some cases not a problem but I'm sure it will confuse people
<fantasai> jensimmons: You mean, they don't cascade as expected?
<fantasai> emilio: right
<fantasai> ... the rules inside the @media will be later in cascade order than the declarations outside @media but after it
<fantasai> ... I would find it confusing if I didn't know internally how it works
<fantasai> ... same for dropping bare declarations inside MQ and stuff, which we resolved they would be put into an anonymous rule at the front
<fantasai> ... if you don't know that they get moved, it's super confusing
<jensimmons> q-
<fantasai> ... that they apply as if they're at the top
<jensimmons> q+
<fantasai> ... so I think that would be a good restriction not just for the transition
<matthieudubet> yes it's also more obvious that the anonymous rule is after any declarations
<matthieudubet> than*
<fantasai> ... I think it's more understandable for anyone using nesting without knowing the details of the spec
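A sketch of the cascade behavior emilio is describing (hypothetical selectors): under the behavior discussed in these minutes, declarations written after a nested rule act as if they had been written before it:

```css
.foo {
  color: red;
  @media (min-width: 40em) {
    & { color: green; }
  }
  color: blue;  /* hoisted before the @media rule in the cascade,
                   so green still wins at wide widths, which may surprise authors */
}
```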
<astearns> q- later
<fantasai> TabAtkins: no declarations allowed after rules is exactly a parser-switch of the kind the WG disliked
<fantasai> ... and there's two ways we could possibly do it
<fantasai> ... 1. We parse as today, you'll parse declarations properly but throw them out
<fantasai> ... but this means some garbage comment might be parsed as a rule
<emilio> q+
<fantasai> ... 2. We do a stronger parsing switch, where we have a mode switch that changes how things are parsed
<fantasai> ... this is more dangerous, might be possible to trigger this mode unexpectedly
<fantasai> ... and that's more complicated for us to be careful around in the future
<fantasai> ... so I don't think there's any actual benefit for instituting a parser switch
<astearns> q- later
<astearns> zakim, close queue
<Zakim> ok, astearns, the speaker queue is closed
<fantasai> TabAtkins: [something that went around too many times to capture]
<astearns> ack TabAtkins
<fantasai> plinss: I think we're getting off into the weeds of trying to mitigate these issues, all I'm asking for is a sanity check
<fantasai> ... if we look at all these issues, is it worth continuing down this path?
<fantasai> ... if so, we can continue to hash out the issues
<fantasai> astearns: Has anyone on the call been convinced that we should reconsider the path that we're on?
<astearns> ack jensimmons
<fantasai> ... if plinss is the only one, then perhaps we close this issue and open new issues about each individual small issue within the SASS approach
<fantasai> jensimmons: I think the fact that declarations that come after a rule end up earlier in the cascade
<fantasai> ... should add to the list
<fantasai> ... I'm more concerned about that than the other things
<fantasai> ... I don't believe we should throw out the declarations that come after the rules
<TabAtkins> ("declarations after rules" is an issue with any syntax, it's not specific to any one unless we totally prevent mixing syntactically)
<fantasai> ... I would prefer we investigate making the cascade *not* resorted
<fantasai> ... if that's absolutely impossible, then we have to teach it as "it's going to be confusing in the cascade, so sort to the top, this is the best practice"
<astearns> ack fantasai
<Zakim> fantasai, you wanted to support emilio's point
<jensimmons> q+
<plinss> q+
<astearns> ack emilio
<fantasai> emilio: I think the general approach we're taking is preferable as well
<fantasai> ... regarding Tab's comment
<fantasai> ... we have similar parser switches, e.g. @import rules not allowed after other rules
<fantasai> ... could easily implement as keep parsing as before, but throw out the declarations
<fantasai> ... user-wise, as a CSS author, I would prefer if this worked
<fantasai> ... and declarations were directly in the cascade
<fantasai> ... but if we're not doing that, which is also reasonable if we're concerned about perf or something, I think the restriction is not a huge deal
<TabAtkins> (note that Sass allows decls after rules, with exactly the same behavior as what we're specifying - all glommed together into the parent rule, preceding nested rules)
<fantasai> ... we can discuss in another issue at another point in time
<fantasai> astearns: I wanted to ask again, given the cluster of issues around the SASS-like syntax, are we convinced we're on the wrong path and should investigate alternative syntaxes?
<emilio> TabAtkins ugh, I think that's dumb :/
<fantasai> jensimmons: I think we're still going in the right direction, especially considering what we've heard from authors about what they want
<emeyer> I don’t know if we’re going the wrong direction, but I do think more investigation is needed.
<fantasai> astearns: My proposal is that we close this issue no change
<fantasai> ... and take each of these concerns and move them to other existing issues or new issues
<fantasai> ... so that we can work through solutions to these issues as much as we can in the framework we have
<emeyer> +1 to astearns’ proposal
<fantasai> ... would that be acceptable, plinss ?
<emilio> TabAtkins: but ok, I guess if authors are fine with that and sass doesn't have a lot of reports about that behavior... fine?
<fantasai> plinss: Yes. I just wanted us to look seriously at the big picture. Satisfied to go with the group consensus
<fantasai> ... But if we later find more issues, and suggest to go back and reconsider, we should reconsider
<fantasai> ... not say "despite new information, not going to reconsider"
<fantasai> ... at-rule we could deploy now, without working through these issues
<fantasai> ... but I'm not going argue this point anymore until new information comes up
<fantasai> astearns: proposed to close this monster issue no change
<fantasai> ... and follow through on each individual issue on separate issues
<fantasai> ... any objections?
<fantasai> +1
<fantasai> RESOLVED: Close no change
<fantasai> ACTION: astearns to make sure smaller issues are followed up on
<fantasai> plinss: one other caveat, there might be someone else in the larger group that shares concerns, so maybe poll everyone?
<fantasai> astearns: I'll add a comment to the issue, here is what we resolved and why, and ppl can comment
<fantasai> plinss: I just want to make sure we make the right decision in the right way
<TabAtkins> new room topic?
<fantasai> fantasai: Stepping back to Plinss's question, I think we are on the right path. Considering developer ergonomics vs the downsides of this approach
<fantasai> fantasai: I think the best we can do is continue on this path and try to address the issues as we can
<fantasai> fantasai: I also wanted to support emilio's point about the cascade effects being confusing
<fantasai> fantasai: I think we should either make declarations after rules invalid (as emilio suggested) or sort them in the cascade as specified somehow (as jensimmons suggested)
astearns commented 1 year ago

Summarizing the resolution and discussion:

There are several minor issues with a SASS-style syntax for nesting we have identified so far. They include

We had consensus in the breakout session that even considering all of these issues together, we still plan on pursuing the SASS-style syntax. Working on solutions to these issues (where possible) will happen in separate issues.

But if there is anyone that was not in the breakout session who finds this set of issues a compelling reason to change course and NOT pursue SASS-style syntax, please do speak up here. And when new issues arise, we should take a moment to add them to what remains in this list as a checkpoint to reconsider whether the mass of minor issues has become too large.

Loirooriol commented 1 year ago

I wasn't in the session. I think these issues are concerning, and I prefer the so-called options 1 or 4 which don't seem to have these problems. But I won't object to the current thing.

BTW, about "Using SASS-like syntax, but not matching some SASS behavior", have the SASS developers said anything about this? They have requested CSS changes in the past when there was a clash, like renaming @if to @when.

SebastianZ commented 1 year ago

BTW, about "Using SASS-like syntax, but not matching some SASS behavior", have the SASS developers said anything about this? They have requested CSS changes in the past when there was a clash, like renaming @if to @when.

@mirisuzanne was on the call and didn't raise concerns regarding this, as far as I can tell. Though I may be missing earlier comments in which she expressed any concerns.

Sebastian

plinss commented 1 year ago

Two clarifications: 1) The cascading behavior is an issue for either an at-rule or the SASS-style approach, so it shouldn't be part of the consideration here (I think). All of the other issues go away entirely with an at-rule approach. 2) The proposal on the table wasn't to "NOT pursue SASS-style syntax" anymore, it was to not ship SASS-style syntax now. We have the option to ship an at-rule syntax now, and continue to refine the SASS-style approach over time, and then ship that when we feel it's ready. That effectively just makes the @nest prefix optional later (authors who prefer the explicit at-rule could still use it).

(And for those not on the call, I don't think any of the issues above individually are show-stoppers, it's just when you consider the complete list of the issues vs the advantages is when I start to have concerns. As with @Loirooriol above, it's not enough for me to object, but it was enough to go hmmm and ask the question.)
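For readers skimming the thread, the at-rule alternative referred to above is the @nest prefix that earlier css-nesting drafts specified; a sketch of what that syntax looks like:

```css
div {
  color: red;
  @nest & > .child {  /* explicit at-rule marker distinguishes the nested rule from declarations */
    color: blue;
  }
}
```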

LeaVerou commented 1 year ago

I wasn't in the call (got stuck in traffic) but I personally still think we are on the right track, and I'm not concerned about most of these issues.

However, I would also be fine with a syntax that mandates that declarations have to precede rules, as I think that's good authoring practice in general (and would make the backtracking needed even more efficient, as it only needs to be done at most once per rule). We decided against an author-facing parser switch (i.e. syntax that authors would have to use to invoke the other mode). An automatic parser switch managed by the browser as it encounters syntax would be fine by me, and seems very easy to teach. It is also very compatible with allowing them to be intermixed in the future. Reading the minutes, @jensimmons brought up a concern that this would restrict use cases, but did not elaborate further (or wasn't minuted doing so):

jensimmons: I don't like the idea of requiring that declarations go before nested rules ... like plinss says, people jam CSS into things all over the place ... it will make things less useful

I was wondering what cases you had in mind?

Also, could someone explain this further?

Unintuitive cascading behavior when declarations come after rules

Is this that in cases like these:

.foo {
    color: red;

    & {
        color: green;
    }

    color: blue;
}

You'd expect the color to be blue, but it's actually green?

jensimmons commented 1 year ago

@LeaVerou A discussion about "a syntax that mandates that declarations have to precede rules" needs to go in a new issue. This issue is only about whether/not we should stop moving forward with the current direction, and change course. Which we decided today on the call, no.

astearns commented 1 year ago

OK, here’s where I think we should move subtopics from this issue:

Issues with error-recovery -> https://github.com/w3c/csswg-drafts/issues/8349

Restrictions on future-syntax -> https://github.com/w3c/csswg-drafts/issues/8251

Using SASS-like syntax, but not matching some SASS behavior This has been discussed in a few issues like 2937 and 3748, where we have decided for the most part not to worry about this. If anyone is still concerned with the differences, please open a new issue.

Parsing changes are risky -> https://github.com/w3c/csswg-drafts/issues/7961

Unintuitive cascading behavior when declarations come after rules We have gone back and forth a few times on whether we should allow declarations after rules, and if we do how do we handle them (in the cascade and in CSSOM). I have not found a single issue that seems like an apt place to continue this discussion, so I think we should have a new issue on this.

tabatkins commented 1 year ago

Closing this issue, as it's been split into subtopics and generally resolved at this point.

(Hm, we don't have a great label for this, so I'm gonna mark it as invalid, as I think that's closest to "no longer contains a relevant issue needing to be addressed".)