srawlins opened this issue 5 years ago
CC @leafpetersen @munificent
I don't believe there is any good solution to this that doesn't require the code generators to generate constraints. The reason is that the NNBD release version will be a backwards incompatible version, so you have to opt out of it. All existing packages will have an SDK version below the NNBD version (let's assume 2.5, although it'll probably be later), so they are opted out by default. When they then try to opt in, changing their default language level, their generated code is no longer correct, and it doesn't have the language-level downgrade markers. That is an issue.
If the generated code is part files, then the main library can specify a lower language level that the part will inherit. If the generated code is library files, then the only real option is to generate them into a separate package with a lower SDK version.
This is all very fragile. The stable solution would be the code generators adding language-level markers. That also means that the generators are locking themselves to a specific language level (unless it's an option, and they can generate multiple versions, which is definitely a possibility).
Generators that generate part files should be able to read the language level from the main file to decide what to generate in the part; this seems reasonable. They'll need to keep the facility to generate both around for a while.
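A rough sketch of that constraint, using hypothetical file names and a hypothetical pre-NNBD language level of 2.7: a part file cannot carry its own `// @dart` marker, so the generator has to read the level from the main library and emit code that is valid at that level.

```dart
// example.dart (hand-written; names and version are hypothetical).
// @dart = 2.7
part 'example.g.dart';

class Config {
  String name; // non-null-safe declaration, valid at language level 2.7
  Config(this.name);
}
```

```dart
// example.g.dart (generated). A part file cannot have its own `// @dart`
// marker, so whatever the generator emits here must be valid at the
// library's level (2.7 in this sketch).
part of 'example.dart';

Config configFromMap(Map<String, dynamic> map) => Config(map['name'] as String);
```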
> I don't believe there is any good solution to this that doesn't require the code generators to generate constraints.
What is the good solution that does require code generators to generate code constraints? The problem is that published versions of code-generating packages are already in the wild, claiming to support Dart SDKs up to 3.0.0.
Good point. We might, as you say, want to retroactively change the dependency ranges for codegen packages from `<3.0.0` to `<2.5.0`.
Do we have an existing mechanism for doing that?
> Do we have an existing mechanism for doing that?
No, and we probably shouldn't ever have one. Any change to an existing package on pub is extremely dangerous.
> No, and we probably shouldn't ever have one. Any change to an existing package on pub is extremely dangerous.
What about deleting a version? For example, built_value_generator's latest published version, 6.7.0, supports Dart up to 3.0.0. What if David made one change to that code, reducing the SDK max to `<2.5.0`, then published that as built_value_generator 6.7.1, then told pub to delete published version 6.7.0? We'd have to remove other 6.x.x releases as well, and maybe backfill with more fixes on some releases. *cringe*
I'm not seriously suggesting this idea; it seems very unwieldy, coarse, and dangerous. But it would work, no?
We have deleted versions in the past, but the process is intentionally difficult and manual because of how risky it is. We essentially only do it when a package is accidentally published and contains some code that was intended to be private, or things along those lines.
@munificent was discussing the possibility of changing pub to encourage upgrading packages when the SDK is updated, which might help with this.
It sounds like what we actually need is more specific ... a pub command for:

> restrict the SDK upper bound for all releases before this one, to at most the SDK upper bound for this one

-- how else could people deal with minor SDK version bumps that turn out to be breaking for their package?
> restrict the SDK upper bound for all releases before this one, to at most the SDK upper bound for this one
How is that any different than just doing a major version bump for the Dart SDK?
😁 I think that putting language version boundaries on major releases solves so many problems, and causes... none? marketing problems?
:D yes. The problem we have is that what is, for almost everyone, a minor version bump acts like a major version bump for code generators.
So unless we want to release Dart 3 real soon, we need a special solution for the code generators.
> restrict the SDK upper bound for all releases before this one, to at most the SDK upper bound for this one

> How is that any different than just doing a major version bump for the Dart SDK?
If I understand what @davidmorgan is saying, the difference is that a major version bump of the Dart SDK means that no existing package works with new SDKs, whereas this would just say that this specific package doesn't work with new SDKs.
Another way of saying this is that we could have a different kind of SDK upper bound, which corresponds to "language version" instead of SDK version. And so you have the option of saying that the upper bound of my constraint is not < Dart 3.0, but < the next breaking language change.
I'm not advocating this, by the way - we'd like to avoid this kind of complexity if possible. But I think that's the gist?
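To make the idea concrete, here is a sketch with entirely invented pubspec syntax (no such `language:` key exists; this is only an illustration of a "language version" upper bound as distinct from an SDK upper bound):

```yaml
environment:
  sdk: '>=2.4.0'
  # Invented, illustrative syntax: an upper bound meaning "anything up to,
  # but not including, the next breaking language change" instead of <3.0.0.
  language: '<next-breaking-change'
```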
> 😁 I think that putting language version boundaries on major releases solves so many problems, and causes... none? marketing problems?
Well, releasing a major SDK version bump means that off the bat, no packages whatsoever work. So if you want to migrate, you have to migrate all of your transitive dependencies first.
This mechanism was our attempt to find a way to allow an incremental phase in of the breaking change. Instead of "hey switch to NNBD, and by the way you have to migrate everything you depend on first, and everything you depend on won't be able to use your new versions until they migrate", we can say "hey switch to NNBD, and it will just work, and it will just get better as things you depend on migrate, and your customers (whether migrated or not) can continue to use your new versions".
Now maybe we won't be able to make this work out long term - maybe there are too many ugly corners like code generation, and we'll have to use language versioning only for non-breaking changes. If so, well, we'll just have to make it work. But my sense is that even if there are some sharp edges here, this may still provide enough value to justify it?
> Another way of saying this is that we could have a different kind of SDK upper bound, which corresponds to "language version" instead of SDK version. And so you have the option of saying that the upper bound of my constraint is not < Dart 3.0, but < the next breaking language change.
To me this is effectively the same as the following policy: If you are authoring a package, and are currently using Dart x.y.z to author and test it:

- [...] `<x+1.0.0`. [...] `<x+1.0.0`.
- [...] `x.y+1.0`. [...] `<x.y+1.0`.
- When Dart `x.y+1.0` is released, which is not associated with a new language version, go ahead and release a new version of your package which supports `<x.y+2.0`.
- When Dart `x.y+3.0` is released, which does include a new language version, you must evaluate it. Maybe it changes the precedence of `async` or `..`. If your code generator never generates code which would now be interpreted differently (or not parse correctly), go ahead and publish a release which supports this Dart (and Dart language) version. Otherwise, you have work ahead. No one can use your package to generate part files for Dart `x.y+3.0` until you write the support and publish a new release.

Would this be a viable policy for code generators? It's sort of broken for the next few months while everyone claims to support `<3.0.0`, but after one round of fixes, it would be a stable policy post-NNBD.
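A sketch of what that policy might look like in a generator's pubspec, assuming the author is currently on a hypothetical Dart 2.4.x and the next language version lands in 2.5 (the package name and numbers are placeholders):

```yaml
# pubspec.yaml of a hypothetical code generator following the policy above.
name: my_codegen
version: 1.2.0
environment:
  # Claim support only through the next minor SDK release rather than <3.0.0,
  # then widen the upper bound release-by-release after evaluating each SDK.
  sdk: '>=2.4.0 <2.5.0'
```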
> Well, releasing a major SDK version bump means that off the bat, no packages whatsoever work. So if you want to migrate, you have to migrate all of your transitive dependencies first.
That's true, and not fun. With an outsider's perspective, I think Node has been fairly successful with this. They release a new major Node which is mostly not a breaking change. It has breaking changes, but they don't apply to most packages, so most packages can publish a new release with just a new upper bound on the Node version. I'm not sure how painful/painless this is in practice, because the community needs to be pretty active. If NNBD were Dart 3.0, the majority of packages could ship a little release saying they support `>=2.0.0 <4.0.0`, but code builders could not.
> Now maybe we won't be able to make this work out long term [...], and we'll have to use language versioning only for non-breaking changes.
The spec is mostly written in terms of breaking changes. Why would you need a new language version for a non-breaking change?
> Would this be a viable policy for code generators?
Yes, I think this is one approach we could take.
> The spec is mostly written in terms of breaking changes. Why would you need a new language version for a non-breaking change?
To solve the problem of people publishing packages that use a non-breaking feature added in version x.y.z without updating their min SDK constraint to >= x.y.z. The language versioning proposal makes it an error to use non-breaking features released in x.y.z unless your min SDK constraint is >= x.y.z.
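For illustration (not from this thread): spread collections shipped in Dart 2.3, so under the proposal a package whose pubspec still declares `sdk: '>=2.2.0 <3.0.0'` would get an error on code like the following until it raises its lower bound to `>=2.3.0`:

```dart
// Assumes the surrounding package's pubspec declares sdk: '>=2.2.0 <3.0.0'.
// Spread collections were introduced in Dart 2.3, so at language level 2.2
// the language versioning proposal treats this as an error.
List<int> merged(List<int> a, List<int> b) => [...a, ...b];
```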
> Would this be a viable policy for code generators? It's sort of broken for the next few months while everyone claims to support `<3.0.0`, but after one round of fixes, it would be a stable policy post-NNBD.
The problem is all the existing generators on pub which will still have `<3.0.0`. I misunderstood the comment from @davidmorgan earlier - I think he meant having a pub command to retroactively change those constraints. I really don't think we should do that though. Published packages on pub should not be altered after the fact, and it's a huge red flag that we would even be considering this.
@grouma and I just talked through briefly on the whiteboard how we might capture the meta-constraints for code generators in pubspecs.
Suppose we added an optional field to pubspec.yaml of the form `codegen: '>=2.5.3'`, indicating that the package generates code that is marked with `@dart=2.5.3` and hence is compatible with any SDK version >= 2.5.3.
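A sketch of the proposed field (the `codegen:` key is the hypothetical addition being described here, not an existing pubspec field):

```yaml
# pubspec.yaml of a hypothetical code-generating package.
name: my_generator
version: 3.0.0
environment:
  sdk: '>=2.5.3 <3.0.0'
# Proposed field: this package emits code marked with `// @dart=2.5.3`,
# so any package containing that generated code needs a min SDK >= 2.5.3.
codegen: '>=2.5.3'
```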
We could then make pub version solving reject any solution which includes a package with a min SDK constraint before 2.5.3 (i.e., packages that are opted out). This works but is highly pessimistic - lots of packages that aren't the targets of codegen will still be required to have a min SDK constraint >= 2.5.3 (and if they don't have such a version, version solving will fail).
If we could know which packages are the targets of codegen, then we could prune this down to only the packages actually affected, but we don't.
It seems possible that we could do this check more lazily as part of codegen, or in the build runner, where we know the targets, but this still seems less than ideal.
I re-opened the discussion of allowing libraries to be opted forward. See this comment for a summary of why this might help with this issue.
I may have missed some of the nuances here, but I believe we have two major cases as listed here, and with some suggested resolutions to those:

App without null safety, generator with:

- App hasn't opted-in to null safety, but (by accident) takes a dep. on a code generator that has opted-in to null safety
- App might be run with an older SDK not supporting null safety
- Solution: Allow generators to opt forward to signal their use of null safety (#287 (comment)), and have pub ensure that you cannot publish a package that uses such generated code without itself using an updated SDK (dart-lang/pub#2316)

App with null safety, generator without:

- App opts-in to null safety, but has a dep. on a code generator that doesn't support it
- Generator generated code that isn't null safe; developer sees lots of issues
- Solution: Developers should upgrade to a generator that supports null safety. They can determine which version that might be, using pub.dev (dart-lang/pub-dev#3354) and `pub outdated` for locating versions that support null safety (https://github.com/dart-lang/pub/pull/2315)

Did I get that right?
> I may have missed some of the nuances here, but I believe we have two major cases as listed here, and with some suggested resolutions to those:
>
> App without null safety, generator with:
>
> - App hasn't opted-in to null safety, but (by accident) takes a dep. on a code generator that has opted-in to null safety
Small clarification but this doesn't need to be by accident. In general a user who has not opted in should not care whether packages they depend on have opted in or not - and the goal here is to essentially allow code generators to fall into that same bucket even if they generate code in the root package.
> - App might be run with an older SDK not supporting null safety
The code generator must still have a min SDK which is high enough to opt into null safety. So a user on an older SDK will not pick up that version of the code generator.
> - Solution: Allow generators to opt forward to signal their use of null safety (#287 (comment)), and have pub ensure that you cannot publish a package that uses such generated code without itself using an updated SDK (dart-lang/pub#2316)
Correct
> App with null safety, generator without:
>
> - App opts-in to null safety, but has a dep. on a code generator that doesn't support it
> - Generator generated code that isn't null safe; developer sees lots of issues
> - Solution: Developers should upgrade to a generator that supports null safety. They can determine which version that might be, using pub.dev (dart-lang/pub-dev#3354) and `pub outdated` for locating versions that support null safety (dart-lang/pub#2315)
Or the generator downgrades their language version explicitly. This is safe and the code is allowed to be published on pub.
Most code generators that generate code to source will likely do this for a while, I think, to avoid issues when users try to publish their code but haven't migrated to null safety yet, as this is forwards and backwards compatible.
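A sketch of such an explicit downgrade, assuming (as in the concrete example below) that null safety arrives in 2.10 and 2.9 is the last pre-null-safety level; the file name and contents are illustrative:

```dart
// foo.g.dart -- GENERATED CODE (file name illustrative).
//
// The generator opts this generated library *down* to a pre-null-safety
// language level, so it stays valid whether or not the surrounding
// package has migrated.
// @dart = 2.9

String message; // implicitly nullable: legal at 2.9, an error under null safety
```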
Let's take a concrete example.
I have `myapp`, which requires `sdk: ^2.6.0`. It dev-depends on `mygen: ^1.14.0`.
Now someone publishes `mygen` version 1.15.0, which depends on `sdk: ^2.10.0` and which generates null-safe code with a `// @dart=2.10` marker.
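Spelled out as a sketch (all names and version numbers are the hypothetical ones from this example):

```yaml
# myapp/pubspec.yaml
name: myapp
environment:
  sdk: ^2.6.0      # myapp itself has not opted in to null safety
dev_dependencies:
  mygen: ^1.14.0   # resolves to 1.15.0 when run on a 2.10.0 SDK
```

```dart
// lib/generated.dart -- emitted by mygen 1.15.0 (file name illustrative).
// @dart = 2.10

int? cachedCount; // null-safe code, valid only at language level 2.10+
```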
If I run `myapp` on a 2.10.0 SDK, starting with `pub get`, I can get `mygen` version 1.15.0 and have language-v2.10 code generated for me. This runs fine because it happens on a 2.10 SDK.
I cannot, and indeed must not, publish `myapp` as a pub package containing that generated code. The code is not supported by my minimum SDK dependency, and if `mygen` is only a dev dependency, then someone might run `myapp` on a 2.7.0 SDK and fail.
That's a pub issue, not a language issue: the 2.10.0 SDK happily runs the code, and, because of `mygen`'s SDK requirement, it would not have been using `mygen` 1.15.0 if it couldn't run that code.
So, this allows local code-gen to work, restricted only by the current local SDK version. It allows the generator to restrict itself to an SDK which can run the code it generates, and it allows that code to be generated into a package which requires less. As long as the code is never published so that it might be run by a different SDK, all is well.
That's why I think there should be an at-publish check that you are not using features newer than the SDK you require. Until you publish, it is not a problem.
Also, even non-null safe generators should still write language-version markers into all their generated files.
The real trick here is generating a part file for a hand-written library file. In that situation, you really do need to tell the generator which language version to generate. For such generators, it may make sense to have different major versions of the generator package corresponding to the Dart language version they generate, so you can dev-depend on the one you want. Well, at least if you don't want to generate different language versions in the same package. :)
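For instance (a hypothetical versioning scheme, reusing the `mygen` name from above): if `mygen` 1.x emitted pre-null-safety parts and `mygen` 2.x emitted null-safe parts, a package could choose which language version gets generated by pinning the major version it dev-depends on:

```yaml
# pubspec.yaml of a package that wants pre-null-safety generated parts.
dev_dependencies:
  mygen: ^1.14.0   # 1.x emits legacy parts; switching to ^2.0.0 would emit null-safe parts
```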
> That's why I think there should be an at-publish check that you are not using features newer than the SDK you require. Until you publish, it is not a problem.
The pub team has agreed to that; see https://github.com/dart-lang/pub/issues/2316
I was trying to figure out whether we had other issues unaccounted for than the ones I listed in https://github.com/dart-lang/language/issues/354#issuecomment-589092350.
The part file issue that @lrhn brings up is really the last one, and there isn't any good solution to it that I am aware of. Generators that produce part files are also a large portion of the code generators today.
Even relying on selecting a certain version of a generator doesn't solve it because different libraries in the same app might be on different language versions, but there is only one version of the generator allowed.
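Concretely (hypothetical file and class names), a single package can contain both of the libraries below, and whichever single generator version is selected has to produce a legacy part for the first and a null-safe part for the second:

```dart
// lib/legacy_model.dart -- opted down via a library-level override.
// @dart = 2.9
part 'legacy_model.g.dart'; // the generated part must be legacy (non-null-safe) code

class LegacyModel {
  String name; // implicitly nullable, pre-null-safety style
  LegacyModel(this.name);
}
```

```dart
// lib/new_model.dart -- uses the package's default (null-safe) language level.
part 'new_model.g.dart'; // the generated part must be null-safe code

class NewModel {
  final String name;
  String? nickname; // null-safe syntax
  NewModel(this.name, {this.nickname});
}
```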
@kevmoo maybe we should look at this from a `json_serializable` perspective and see how we might solve this problem for that package, and see if we can come up with something generalizable?
According to the Dart Language Versioning spec, individual libraries can override the language version specified by the `sdk` minimum constraint in a `pubspec.yaml` file by writing a comment at the top of a file (of the form `// @dart = 2.1`), and, if I am reading it right, under "Non-`package:` Libraries": "Any tool [...] should treat a file in the same pub package as having the same default language level as the actual package files [...]". So this would include generated files, such as files generated by the build_runner package. This may pose a problem when a version of Dart is released which makes available an experimental flag which interprets a library in a different, breaking way than the previous version.

OK, let's get concrete. Let's say that the NNBD feature first "ships" (meaning the experimental flag is true by default, I think, which enables the SDK constraint and library overrides to dictate NNBD-ness) in Dart 2.5, and a developer is working on a package, foo, which depends on the intl_translation package. All of the latest versions of intl_translation claim to work with the Dart SDK `>=2.0.0-dev.33.0 <3.0.0`, a perfectly sane and advised version range.

- At some point the developer ran `pub get`, and version 0.17.4 was locked into their `pubspec.lock` file.
- The developer runs `pub get`, and pub does not automatically try to upgrade intl_translation (the current version satisfies all constraints).
- The developer runs `pub run test` before trying NNBD; everything looks good.
- The developer bumps their SDK constraint to `>=2.5.0 <3.0.0`, which opts in to NNBD across all files in their package.
- The developer runs `pub run intl_translation:generate_from_arb ...`, which generates `lib/foo_all_messages.dart`. This file has no Language Version Override.
- The developer now gets compile-time errors in `lib/foo_all_messages.dart`, which may feature code like `String a = null;`.

The built_value code generator presents a slightly different example: it generates parts of non-generated libraries. The developer of a package which uses built_value writes a library like:
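A minimal sketch of such a hand-written built_value library; only `Person` and `person.g.dart` come from this issue, and the `name` getter and the exact shape of the class are illustrative:

```dart
// person.dart (hand-written)
import 'package:built_value/built_value.dart';

part 'person.g.dart';

abstract class Person implements Built<Person, PersonBuilder> {
  // A representative field; any field demonstrates the problem.
  String get name;

  Person._();
  factory Person([void Function(PersonBuilder) updates]) = _$Person;
}
```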
and the built_value package generates the `person.g.dart` file via the build_runner package. According to the spec (per the passage quoted above), the generated part file gets the package's default language level, which will lead to the same compile-time errors while trying to compile `person.g.dart`.
(Even if https://github.com/dart-lang/language/issues/318 is accepted, the part files will be opted-in by default, and when the developer of foo tries to override the Language Version in `person.dart`, the `person.g.dart` file will not have a `@dart` comment, invalidating the library even earlier.)

**Some Solutions**
- Code-generating packages could publish new versions whose generated code carries a Language Version override like `// @dart 2.1`; then the developer can possibly run `pub upgrade` to fix their problem (possibly not, if constraint solving, for some other reason, does not allow the new versions with Language Version overrides). Otherwise, they thought they could experiment with NNBD before their "dependencies" opted-in to NNBD, but they actually cannot, until new packages are published.
- Features "like NNBD" could ship in major version numbers like 3.0.0. I think that by "like NNBD," I mean features that interpret previously-valid code differently from previous versions of Dart, e.g. if we scrap existing enums for a new enum syntax, or remove `dynamic`, etc. Actually, only the first feature like NNBD would have to be shipped in a major version like 3.0.0, and after that, all code-generating packages will be shipping code actively opting in or out of versions. The problem exists because packages currently published are missing Library Version overrides.
- Already-published code-generating packages could have their SDK constraints restricted to `<2.5.0` instead of `<3.0.0`.