dart-lang / sdk

The Dart SDK, including the VM, JS and Wasm compilers, analysis, core libraries, and more.
https://dart.dev

[Breaking Change] Make the context for await expressions consistent #55418

Closed · stereotype441 closed this issue 4 months ago

stereotype441 commented 5 months ago

The context used by the compiler front end to perform type inference on the operand of await expressions will be changed to match the behavior of the analyzer. The change is as follows: when the context for the entire await expression is dynamic, the context for the operand of the await expression will be FutureOr<_>.

Although this is technically a breaking change, it's not expected to have any effect on real-world code.

Background

When the compiler needs to perform type inference on an expression, it does so using a type schema known as the "context". A type schema is a generalization of the normal Dart type syntax, in which _ (called the "unknown type") can appear where a type is expected.
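
For illustration, here is a minimal sketch (not from the original issue; the `wrap` helper is made up) of how a context steers inference:

```dart
// The same empty list literal gets a different static type depending on
// the type schema (context) it is inferred against.
List<T> wrap<T>(List<T> xs) => xs; // hypothetical helper

void main() {
  var a = [];             // context `_`: inferred as List<dynamic>
  List<int> b = [];       // context List<int>: inferred as List<int>
  List<int> c = wrap([]); // the schema List<int> flows into the argument,
                          // so the inner [] is also inferred as List<int>
  print([a, b, c]);
}
```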

In the analyzer, any time an expression would be analyzed with a context of dynamic, the context is coerced to _ before performing the analysis; this causes contexts of dynamic and _ to behave identically[^1]. In the compiler front end, dynamic is coerced to _ when analyzing a generic invocation, but dynamic and _ behave differently for a few expression types. This breaking change addresses one of those differences, which is in the behavior of await expressions.

[^1]: This coercion doesn't happen when dynamic appears more deeply inside the context; for example, a context of List<dynamic> is not changed to List<_>.
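
As a rough sketch (an assumed example, not from the issue), here is where a top-level context of dynamic arises and how the coercion plays out for a generic invocation:

```dart
// A `dynamic` context arises, for example, when assigning to a variable
// declared `dynamic`. For a generic invocation, both tools coerce that
// context to `_`, so T is inferred from the argument alone.
T firstOf<T>(List<T> xs) => xs.first; // hypothetical helper

void main() {
  dynamic x = firstOf([1, 2, 3]); // context dynamic -> `_`; T inferred as int
  // The coercion only applies at the top level of the schema: a context of
  // List<dynamic> stays List<dynamic> and is not rewritten to List<_>.
  List<dynamic> ys = [1, 2, 3];
  print('$x $ys');
}
```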

The current behavior for await expressions is as follows. If the context for the await expression is K, then the operand of the await expression will be inferred using a context of L, where L is computed as follows:

  1. If K is FutureOr<P> or FutureOr<P>? for some P, then L is K.
  2. Otherwise, if K is dynamic, then in the compiler front end, L is dynamic; in the analyzer, L is FutureOr<_>.[^2]
  3. Otherwise, L is FutureOr<K>.

[^2]: This is a consequence of the fact that the analyzer coerces dynamic to _, so rule 3 applies.
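
For concreteness, here is a hedged sketch (using a hypothetical `fetch` helper) that annotates the operand context L these rules select for a few choices of K:

```dart
import 'dart:async';

Future<int> fetch() async => 42; // hypothetical helper

Future<void> demo() async {
  int a = await fetch();           // K = int:           rule 3 gives L = FutureOr<int>
  FutureOr<int> b = await fetch(); // K = FutureOr<int>: rule 1 gives L = K
  dynamic c = await fetch();       // K = dynamic:       rule 2; today L is dynamic in the
                                   // compiler front end and FutureOr<_> in the analyzer
  print([a, b, c]);
}

Future<void> main() => demo();
```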

Intended Change

The above rules will be changed so that in the compiler front end, if K is dynamic, then L will be FutureOr<_>, as it is in the analyzer.

Justification

Any difference in type inference behavior between the analyzer and the compiler front end is a bug. In this case the bug has a low user impact, because the type schemas dynamic and FutureOr<_> behave very similarly in type inference (see https://github.com/dart-lang/language/issues/3649 for further discussion about this). To reduce the impact of the bug fix, it makes sense to standardize on either the analyzer's behavior or the compiler front end's behavior. In this case, standardizing on the analyzer behavior is better, since the analyzer is more self-consistent (it always treats contexts of dynamic and _ the same). Once this change is made, there will be only one remaining scenario in which the compiler front end treats dynamic and _ contexts differently, which I plan to address in a future breaking change (see https://github.com/dart-lang/language/issues/3650).

Expected Impact

A prototype of this change caused zero test failures in Google's internal codebase, so the impact is expected to be extremely low for real-world code.

But it is theoretically possible for a program to behave differently with the change. Here is a contrived example:

```dart
Future<T> g<T>(T t) => Future.value(t);
T h<T>(T t) => t;

extension<T> on Future<T> {
  void foo() {
    print(T);
  }
}

test(Future<num>? f) async {
  dynamic x = await h(f ?? (g(0)..foo()));
}

main() {
  test(null);
}
```

Today, this program prints `num`; with the change, it will print `int`.

Today, the compiler front end performs type inference for this program as follows:

- `await h(f ?? (g(0)..foo()))` is inferred using a context of `dynamic`.
- Therefore, `h(f ?? (g(0)..foo()))` is inferred using a context of `dynamic`.
- Since `h` is a generic invocation, the context `dynamic` is changed to `_`, so `f ?? (g(0)..foo())` is inferred using a context of `_`.
- When an if-null (`??`) expression is inferred using a context of `_`, the static type of the left hand side (`f`) is used as the context for inferring the right hand side (`g(0)..foo()`).
- Therefore, `g(0)..foo()` is inferred using a context of `Future<num>?` (the static type of `f`).
- Therefore, `g(0)` is inferred using a context of `Future<num>?`.
- Since the return type of `g` is `Future<T>`, and that satisfies `FutureOr<num>?` only if `T <: num`, the type of `T` is set to `num` during downwards inference.
- Therefore, `g(0)` has static type `Future<num>`.
- Therefore, the extension method `..foo()` is invoked with the type parameter `T` bound to `num`.

With the change, here's what will happen instead:

- `await h(f ?? (g(0)..foo()))` will be inferred using a context of `dynamic`.
- Therefore, `h(f ?? (g(0)..foo()))` will be inferred using a context of `FutureOr<_>`.
- Therefore, `f ?? (g(0)..foo())` will be inferred using a context of `FutureOr<_>`.
- When an if-null (`??`) expression is inferred using a context *other* than `_`, that context is propagated to the right hand side (`g(0)..foo()`).
- Therefore, `g(0)..foo()` will be inferred using a context of `FutureOr<_>`.
- Therefore, `g(0)` will be inferred using a context of `FutureOr<_>`.
- Since the return type of `g` is `Future<T>`, and that satisfies `FutureOr<_>` for all `T`, downwards inference of `g(0)` won't constrain the type of `T`. So the type of `T` will be set to `int` during upwards inference.
- Therefore, `g(0)` will have static type `Future<int>`.
- Therefore, the extension method `..foo()` will be invoked with the type parameter `T` bound to `int`.

Mitigation

In the unlikely event that some real-world customer code is affected, the effect will be limited to type inference, so the old behavior can be restored by supplying explicit types. For example, the above example could be changed to:

```dart
Future<T> g<T>(T t) => Future.value(t);
T h<T>(T t) => t;

extension<T> on Future<T> {
  void foo() {
    print(T);
  }
}

test(Future<num>? f) async {
  dynamic x = await h(f ?? (g<num>(0)..foo()));
}

main() {
  test(null);
}
```

(Note that `g(0)` has been changed to `g<num>(0)`.)
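
Another option, sketched here under the assumption that the rest of the program stays as in the example above (this variant is not part of the original request), is to supply the explicit type through a typed local for the if-null expression instead of a type argument; this likewise pins `T` to `num` and keeps the right hand side lazily evaluated:

```dart
test(Future<num>? f) async {
  // The typed local gives `f ?? (g(0)..foo())` a context of Future<num>,
  // so downwards inference again fixes T to num; g(0) still only runs
  // when f is null.
  final Future<num> y = f ?? (g(0)..foo());
  dynamic x = await h(y);
}
```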

itsjustkevin commented 5 months ago

@vsmenon @Hixie @grouma can you take a look at this breaking change request?

Hixie commented 5 months ago

Sounds good to me; the change seems like an improvement.

stereotype441 commented 4 months ago

@vsmenon @grouma ping

grouma commented 4 months ago

cc @leonsenft who will handle breaking change requests for ACX going forward.

leonsenft commented 4 months ago

LGTM 👍

vsmenon commented 4 months ago

lgtm

itsjustkevin commented 4 months ago

@stereotype441 your breaking change request is approved!