Open guitarino opened 2 years ago
This could be solved in several ways:

- `const person :: Person`
- `"use type"`
Is it wise to take TypeScript (or similar) as a basis for syntax reservation?
This proposal just reserves some syntax space and places no constraints on semantics, and I think such syntax space is enough for any possible type system. Choosing TS-compatible syntax is just because TS is the de facto standard for typing in the JS ecosystem.
A new pragma, despite a few delegates mentioning it, is a nonstarter - there will be no new modes, and introducing one would fork the language far worse than the current situation with typed dialects.
Interesting point. Not tackling your argument as a whole, but I have a few nitpicks.
> Why should JavaScript take TypeScript's `number` as the built-in way of representing numbers rather than something along the lines of Rust's (and AssemblyScript's) `i32`, `u32`, `f32`, `i64`, `u64`, `f64`

Per my understanding of the proposal, `number` is just used as an example, and `let ident: i32` would be just as valid with this proposal.
> why should there only be syntax for dynamic-sized array like `Array<string>`/`string[]` when there's an opportunity for fixed-sized array syntax

I think that it makes sense for `Array<T>` to only represent a dynamically-sized array, since the array is dynamically sized at runtime, and should probably stay that way.
I don't think it makes sense for `Array` to be a type for a fixed-length array. If, in the future, `Array` could be fixed-length, then there's the confusing behavior that it sometimes fails on a `.pop` or `.push` and sometimes succeeds, depending on whether or not it's fixed. Since you mentioned Rust, I'll use it as an example: Rust has `[T; N]` for a fixed-length array (`N` is the length) and the slice type `[T]` for a length that isn't known at compile time. The languages I know that allow for fixed-size arrays/lists don't have an array type that can be either fixed or dynamic depending on the circumstances (at least in the standard library).

tl;dr: If it's fixed-length, it should be something that doesn't have `.push`, `.pop`, etc., not an `Array`, and I don't think this proposal needs to consider the possibility that `Array` is fixed-length in the future.
Side-note: I feel like I should acknowledge that classes like `Uint8Array`, `Uint16Array`, etc. are basically fixed-length arrays AFAIK, although their use case is very specific.
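For instance, the fixed length of typed arrays is directly observable (a small sketch to illustrate the side-note):

```typescript
// Typed arrays get their length at construction time and cannot grow:
const bytes = new Uint8Array(4);
console.log(bytes.length);               // 4

// They also expose no push/pop, unlike a regular Array:
console.log(typeof (bytes as any).push); // "undefined"
```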
But even if everything I said above about fixed-length arrays is false, if you wanted to represent a fixed-length array then I think that a syntax like `let arr: Array<string, 5>` or `let arr: string[5]` would still be valid per this proposal (they just wouldn't be valid TypeScript). Additionally there's the "tuple" type `[T1, T2, ..., TN]`.
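As an aside, TypeScript's tuple syntax already expresses a fixed length at the type level while the runtime value stays an ordinary `Array` (a sketch, assuming plain TypeScript/`tsc` semantics):

```typescript
// A 5-element tuple type: the length is fixed only at the type level.
type FiveStrings = [string, string, string, string, string];

const arr: FiveStrings = ["a", "b", "c", "d", "e"];
console.log(arr.length);         // 5
console.log(Array.isArray(arr)); // true - at runtime it's a plain, resizable Array
```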
As far as I understand this proposal is not forcing TypeScript's types into JS, just trying to make TypeScript and other type systems valid JS (with some exceptions).
@spenserblack

> Per my understanding of the proposal, `number` is just used as an example, and `let ident: i32` would be just as valid with this proposal.
The proposal is to ignore types, so yes, under the proposal that would also be valid. However, as I said, this proposal will preclude a type system from being built in the future. The proposal's alternative (if a type system is to be eventually built) is to reserve the proposed syntax. My comment is about this alternative, in which it will matter whether it is `number` or `f32`.
> I think that it makes sense for `Array` to only represent a dynamically-sized array since the array is dynamically sized at runtime, and should probably stay that way.
This type of discussion is exactly the type of discussion we should be having, rather than a discussion about reserving a specific type syntax. This discussion will necessarily involve different parties: we should bring different experts (e.g. V8 experts), and think of all the things that a type system could allow for in terms of optimization and improving the language (rather than keeping it the same).
> As far as I understand this proposal is not forcing TypeScript's types into JS, just trying to make TypeScript and other type systems valid JS (with some exceptions).
That's a slightly misleading bit of the proposal. The proposal will preclude a type system from being built altogether, unless specific syntax reservations are made, in which case, if the desire is still to stick with TypeScript as the basis of the type system, it will "force TypeScript's types into JS".
Hey this is a duplicate and the rationale is in https://github.com/tc39/proposal-type-annotations#how-could-runtime-checked-types-be-added-in-future
The gist is that type checking can be added in the future (with another syntax, for example), but that's extremely unlikely given the cost penalty of performing runtime type checking. This is why most other languages don't have any runtime type checking (like Python) but only introspection (which can be built on top of this proposal anyway).
I spoke with several language committees and designers about this. They were either against this sort of runtime checking of static types (because of cost), as in Python's case, or mostly regretful about it, as in PHP's case (the only language with this sort of type checking), since in practice people use tools like Psalm, meaning they use two syntaxes (comments and types), and most interesting types don't get checked at runtime at all but by a third-party tool.
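To make the cost argument concrete, here's a hedged sketch of what runtime checking of a `Person` annotation would have to expand to on every annotated assignment (the `checkPerson` helper is hypothetical, invented for illustration; erased annotations cost nothing by comparison):

```typescript
interface Person {
  firstName: string;
  lastName: string;
}

// Hypothetical expansion of a checked `const person :: Person = ...`:
// every property is validated at runtime, on every assignment.
function checkPerson(value: unknown): Person {
  const v = value as Record<string, unknown> | null;
  if (typeof v?.firstName !== "string" || typeof v?.lastName !== "string") {
    throw new TypeError("value does not satisfy Person");
  }
  return value as Person;
}

const person = checkPerson({ firstName: "Ada", lastName: "Lovelace" }); // passes
// checkPerson({ oops: "This does not follow a type" }); // throws TypeError
```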
I'd like to copy my comment at https://github.com/tc39/proposal-type-annotations/issues/41#issuecomment-1082618637
It is impossible to add runtime type checking into JavaScript because there is an impossible triangle in this problem:
If the type system needs to be useful in the long term, it should reject invalid programs (in the type checking semantics). To achieve this, you need one of the...
@benjamingr
Could you elaborate on how runtime checking could be added after this example becomes a valid JS that runs without error?
```ts
interface Person {
  firstName: string,
  lastName: string
}

const person: Person = {
  oops: "This does not follow a type",
}
```
@Jack-Works
I'm trying to understand why a future type-checker would necessarily break web compatibility. If type annotations are a separate new syntax (which gets introduced often in JS), then in order to use the type checker, the new syntax would have to be used, and since old programs don't use the new syntax, they won't break.
@guitarino once runtime types are checked, no changes can be made to any part of the language that would violate existing types - it would likely prevent additions to anything.
> Could you elaborate on how runtime checking could be added after this example becomes a valid JS that runs without error?

Sure, though that would never happen for the aforementioned reasons (the performance numbers don't make sense, which is why no other language does this fully, and the one that does it partially, PHP, isn't happy about it):

```ts
interface Person {
  firstName: string,
  lastName: string
}

// example possible syntax to check
const person:: Person = { // throws a type error because of the ::
  oops: "This does not follow a type",
}
```
@ljharb user-defined types will be able to change, right? I don't see any problem with built-in types being fixed.
@benjamingr would you also have to write

```ts
::interface Person {
  firstName: string,
  lastName: string
}
```

such that it doesn't interfere with a syntax like

```ts
interface Person {
  this will be ignored
}
```
At which point, you have 2 completely independent syntaxes: one for ignoring types, another for type-checking. They could vary and be completely incompatible:

```ts
::interface Person {
  age: i32 // if we decide to have more optimized types
}

interface Person {
  age: number
}
```
There are 2 main points that I believe have merit and are not a duplicate:
> @guitarino once runtime types are checked, no changes can be made to any part of the language that would violate existing types - it would likely prevent additions to anything.
There are a large number of statically typed programming languages, and they are constantly introducing new things. In addition, since JS has no ABI compatibility issues, it will be much easier.
@ljharb another way of adding new built-in types as the language evolves could come from this proposal.
If adding new built-in types as the language evolves isn't what you meant, I'd love to know more about what you meant. If you meant something like ABI issues, I think @CarterLi makes a great point.
I think the key thing to note here is that JS isn't like other statically typed programming languages, because of the structure of TC39, which requires (approximately) 100% consensus to change the language.
The subject line of this issue says that type syntax "cannot" be separated from future type-check implementations, but later in the initial post, @guitarino clarifies that it just "seems backwards" to do it that way. If you assume that TC39 will inevitably approve a runtime-checked type system, then, sure, it's silly not to start standardizing that now. But that assumption is unfounded. I think there's every reason to think that TC39 will never approve of runtime-checked types, for performance reasons alone, as the FAQ explains:
> It seems unlikely that JavaScript would adopt a pervasive runtime-checked type system due to the runtime performance overhead that it would bring. While it's possible that there may have been room for improvement, past efforts such as TS* (Swamy et al) have shown that runtime type-checking based on annotations adds a non-negligible slowdown.
So, uh, yeah, we could reserve a `::` syntax for the day when TC39 agrees to standardize on runtime-checked types, and, sure, I suppose we'd have to do `::interface`, too, and that sounds kinda ugly, but I wouldn't stress about it, because that day will never come.
We should reserve an "escape hatch," just in case everyone at TC39 completely changes their minds about the value of program correctness over web performance, but it's an escape hatch that will literally never be used.
> There are a large number of statically typed programming languages, and they are constantly introducing new things. In addition, since JS has no ABI compatibility issues, it will be much easier.
Other statically typed programming languages have versions, and newer "major" versions are allowed to declare that code from older versions is invalid.
But there are TC39 committee members who are absolutely committed to not "versioning" JS. As @ljharb wrote, the pragma idea "use ES2025" is a "nonstarter" in this sense: someone on TC39 will object to it, and so, it won't happen. This conversation has come up over and over within TC39, and if the first step of your plan is, "we'll convince every objecting TC39 member that versioning is actually OK," then you need a new plan, because it ain't gonna happen.
Now, you might be saying, "well, that's fine; we'll just pick the right type semantics and never change it." Thinking about typical TypeScript types today, where developers usually just define `Person` interfaces with mandatory attributes and assign non-generic types to variables and parameters, maybe that could be standardized and never changed.
But look at how TypeScript and Flow have developed over the years. As major versions have launched, TS and Flow have added not just new syntax, but new checks that detect bugs in existing programs. You might say "well, who cares, let's declare TypeScript DONE and then standardize that," but TypeScript isn't nearly "done." There's gonna be a TS 5, and a TS 6 after that, and they're going to introduce breaking changes each time. If you'd said that TS 3 was "done," then you would have prevented these breaking changes in TS 4, to say nothing of whatever breaking changes are coming in TS 5.
And wait, should we standardize on TS or Flow? Or wouldn't it be better to standardize on Hegel, which guarantees that there will be no runtime type errors (unlike TypeScript)? If we standardize on semantics like TS 4's, we'll be prevented from ever standardizing on the stronger type system that Hegel provides.
The FAQ argues that "trying to add a full type system to JavaScript would be an enormous multi-year effort that would likely never reach consensus," which is why the goals of this proposal are more modest: to allow developers to implement type annotations without "forking" JS. (TS, Flow, and Hegel are all "forks" of JS in this sense.)
@dfabulich
Thanks for chiming in.
I do believe that versioning the type-checker could be done without forking the language or a "use es####" pragma. There's a separate discussion going on here: https://github.com/tc39/proposal-type-annotations/pull/135 where I provide some possibilities, and I don't see why things like inference couldn't be versioned similarly if you introduce an `Auto` type or the like. As long as every feature of the type-checker is opt-in and you specify which version of that feature you'd like to use in a particular file/module, it's quite straightforward - but there needs to be collaboration and thinking about how that could be done, rather than resistance to the idea.
As a community, I think we should do our best to avoid a nocebo effect and, instead, put all possible effort into creating a type system, since we have a group of passionate people who care about JS and types, and there's never been a better opportunity for that.
Here are some things/features that a good type system may enable:

There are probably a lot more benefits, and we should ask ourselves: are these benefits worth it? Is it worth making a genuine and inspired attempt to overcome difficulties in order to achieve these benefits for everybody? Is it worth doing to continue developing JS as a modern language, for current and future generations of developers? Or should we give up and make these benefits harder to achieve instead?
I like your optimism, but your optimism is misplaced here, especially around using types to optimize performance.
Google's V8 team (the team that owns Google Chrome's JS engine implementation) floated a proposal back in 2015 to implement a new JS mode that they called "Strong Mode."
The idea was that you'd opt in with a pragma like "use strict", but this time you'd "use strong". Strong mode would include a bunch of restrictions on how you'd write your JS. For example, you couldn't `delete` properties from a class.

Here's Google's 2016 announcement canceling the "strong mode" experiment: https://groups.google.com/g/strengthen-js/c/ojj3TDxbHpQ
> The main problem here was the interoperability with weak classes: we need to allow inheriting strong from weak classes and vice versa (or even alternating inheritance chains), because ruling that out would severely limit the utility of strong mode. But we could not find a semantics that would work without breaking either weak parent classes or weak child classes. In the end we had to give up on this one.
And so, in practice, for the next five to ten years, we're either getting something pretty much like this "types as comments" annotations proposal, or nothing at all.
The question to ask ourselves isn't "are these benefits worth it?" The question to ask is, "Which of these benefits are achievable at all?" And the answer is "very, very little."
As a result, it's up to each of us to decide whether to accept small improvements (accepting that major changes are impractical) or give up on small improvements in favor of large improvements that literally never materialize, or to embrace nihilism and give up on improvements altogether.
Also, I've replied to #135. It proposes a way for developers to opt-in to breaking changes; but that's exactly what anti-pragma TC39 members are opposed to. It's a total nonstarter.
> As long as every feature of the type-checker is opt-in and you specify which version of that feature you'd like to use in a particular file/module, it's quite straightforward
Versioning has already been shot down, but I feel it's worth noting that opting in on a per-feature basis isn't a viable substitute. In order for that to work, any future fixes to bugs or spec deficiencies in type checking have to be opt-in as well. Otherwise, fixing those bugs breaks backward compatibility for the web. The web probably doesn't benefit from a situation where developers have to manually opt into valid and sensible behavior with a pile of flags and declarations that grows increasingly large as time passes.
Essentially, your idea (in combination with the committee's self-imposed constraints e.g. avoiding language versioning) hinges on the committee never making mistakes in the future. With all due respect to the committee, I think there's already evidence enough -- including, in my opinion, the existence of this whole proposal in the first place -- that that's not to be relied upon.
As some others have mentioned, if the proposal is implemented in one form or another, it will preclude the possibility of a future type system. The argument is pretty simple. Take this code:

```ts
interface Person {
  firstName: string,
  lastName: string
}

const person: Person = {
  oops: "This does not follow a type",
}
```

Suppose the proposal is implemented. This means that `interface { ... }` and `: Person` are ignored as comments. This proposal will make this code valid JavaScript. Since JavaScript is intended to be backwards compatible, the code will need to continue to be valid JavaScript that runs without errors. But you may've noticed this code has a type error (`person` is not really initialized as a `Person`) - which means that code with type errors will need to remain valid JavaScript that runs without errors - hence, preventing the possibility of future type checking.

Is it wise to take TypeScript (or similar) as a basis for syntax reservation?
In the proposal, it is mentioned:
It sounds backwards to me to try to reserve syntax before knowing details about the future runtime type checking, a feature that could be immensely useful and has the potential to move the language significantly forward. For example, in order to make the right syntax reservations for type-checking, we will need to know about future built-in types, in order to know which types should be reserved as built-in.
Here are examples of specific questions I'm concerned about when it comes to syntax reservation. Why should JavaScript take TypeScript's `number` as the built-in way of representing numbers rather than something along the lines of Rust's (and AssemblyScript's) `i32`, `u32`, `f32`, `i64`, `u64`, `f64`, etc., which could have an additional benefit of optimization? Similarly, why should there only be syntax for a dynamic-sized array like `Array<string>`/`string[]` when there's an opportunity for fixed-sized array syntax (which is probably reasonable to unify with dynamic-sized syntax)? Why should a future JavaScript type system inherit design choices made by TypeScript (and Flow), such as the existence of both `interface` and `type` ways of defining types?

In my view (and I'm willing to be convinced otherwise), TypeScript was designed with a desire to describe existing JavaScript rather than trying to move the language forward. If TypeScript-like type syntax is reserved within JavaScript, it will prevent the language from moving forward in terms of type-based optimizations, and has the potential to carry over TypeScript's questionable design choices that lead to unnecessary confusion when learning the language. It is also unclear to me why syntax reservations should be made before a proposal for the detailed type-checking system is submitted - this should be done the other way around.