getify opened 3 years ago
As a sort of illustrative aside, I have adopted a number of programming patterns (built on libraries I've authored) in recent years that are heavily focused on FP techniques, especially monads of late. I'm well aware that despite any best efforts to optimize the implementations of these libraries, usage of them comes at a performance cost. The low-level non-FP non-monadic way of doing a task is almost always going to beat, and in some cases, significantly beat, these more recently adopted approaches.
The fact that in JS you have the option to take a variety of paths on the style/performance spectrum, according to the needs of any given part of a program, is, IMO, one of JS's greatest strengths. But not all code paths are critical code paths. In fact, most aren't. So I choose to continue using coding styles that I know are less efficient, because raw performance, in most of my code, is not the primary concern.
Following coding conventions and standards that are deemed to be more "proper" (for varying definitions of that word), hoping that it avoids bugs and creates a more expressive and readable and maintainable program -- these are overriding concerns compared to a few dozen microseconds longer execution time. Only if I see a truly hot code path being significantly harmed would I consider optimizing away these patterns I've come to trust and rely on.
My point? Developer intuition and expected patterns (even as inspired by experience from other languages) should be favored, or at least given more weight than it seems to be getting, for the majority of lines in a program. Just because Type II subclassing may make code a bit slower doesn't mean it's in any way a bad choice for a majority of code. It might very well need to be avoided in some limited hot paths, but that should be up to the author, not the language steering committee.
Instead of removing them as a whole, why not revisit all built-in classes case by case? For example, extending `Array` is very common, but I never see anyone extending the `RegExp` class in the wild.
Extending `Promise` also seems somewhat common -- given the nature of not being able to access some of the private slots/state of the parent `Promise` implementation, it's even more critical to be able to have inherited methods vend the subclass rather than the parent to accomplish some tasks. For example, rumblings about something like `CancelablePromise` occasionally bubble up, and in such a case, it would be relevant whether a `then(..)` method on such a child class were going to vend a `Promise` or a `CancelablePromise`.

To be clear, I don't care about `@@species` per se, but having the ability to control Type II style extension somehow is important from my perspective.
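To make the `Promise` case concrete, here's a minimal sketch of today's (Type II) behavior; the `cancel()` method is hypothetical and the cancellation logic is elided:

```javascript
// Today, inherited Promise methods consult this.constructor[Symbol.species]
// (Type II), so then() called on a subclass instance vends the subclass.
class CancelablePromise extends Promise {
  cancel() {
    // hypothetical cancellation logic would go here
  }
}

const p = CancelablePromise.resolve(42).then(x => x + 1);
console.log(p instanceof CancelablePromise); // true -- then() vended the subclass
console.log(typeof p.cancel);                // "function"
```

Under the proposed removal of Type II, `then(..)` would presumably vend a plain `Promise` here instead, and `p.cancel` would not exist.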
Thanks for the write-up here; I'll think on it some before responding to the points about the utility of Type II outweighing the implementation-complexity cons.
I'd like to clarify, however, that the main motivation here is not avoiding "a few dozen microseconds longer execution time", but rather security and maintainability of correct implementations. As a browser vendor, I tend to care about serving users over developers. Given the history of security bugs here, that has made me think that increasing the likelihood of secure and correct implementations should in fact be weighed higher than enabling the subclassing intuition.
Could a simplified version of Type II be supported? E.g., the current Type II involves looking up `this.constructor` on every call. I don't know about engine internals, but would it be easier for an implementation to support a simplified version that captures `new.target` only once and remains consistent per instance?
For example, at present this works:

```js
class Baz extends Array {}
class Bar extends Array {}
class Foo extends Array {}

const foo = new Foo();
foo.map(a => a ** 2) instanceof Foo; // true

Foo.prototype.constructor = Bar;
foo.map(a => a ** 2) instanceof Bar; // true

foo.constructor = Baz;
foo.map(a => a ** 2) instanceof Baz; // true
```
With my proposed change (capture `new.target` once), the second two cases would be `false`, and instead `foo.map(...)` would always vend a `Foo` once it's been constructed.
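A rough userland approximation of this capture-once idea (hypothetical; an engine would do this internally for every built-in method, not just an overridden `map`) might look like:

```javascript
// Hypothetical sketch: capture new.target once at construction time and use
// it whenever an inherited method needs to vend a new instance, ignoring
// any later mutations of .constructor.
class CapturedArray extends Array {
  #species;
  constructor(...args) {
    super(...args);
    this.#species = new.target; // captured once, per instance
  }
  map(fn) {
    const out = new this.#species();
    for (let i = 0; i < this.length; i++) {
      out.push(fn(this[i], i, this));
    }
    return out;
  }
}

class Foo extends CapturedArray {}

const foo = Foo.of(1, 2, 3);
console.log(foo.map(a => a ** 2) instanceof Foo); // true

foo.constructor = Array; // later mutation is ignored
console.log(foo.map(a => a ** 2) instanceof Foo); // still true
```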
An alternative, which would also support Type III by consequence, would be to pass a callback to `super()` that asks it to construct a new instance. This might be simpler than delegating to both `this.constructor`/`this.constructor[Symbol.species]`, but I'm not sure. E.g.:
```js
class Foo extends Array {
  #someData;
  constructor(someData) {
    // hypothetical API: super() receives a thunk it can later call
    // to construct a fresh, fully initialized instance of the subclass
    super(() => new Foo(this.#someData));
    this.#someData = someData;
  }
  get someData() { return this.#someData; }
}
```
The latter can't work unless `Array` also took a thunk, which it doesn't seem like it would make sense for it to do.
> The latter can't work unless `Array` also took a thunk, which it doesn't seem like it would make sense for it to do.
If there wasn't already legacy behaviour, then yes, the array constructor taking a thunk would be preferable for symmetry. But `new Array(() => {})` is already a thing, and I doubt this would be compatible to change.
It could still be possible, though, to detect it when `new.target !== Array`. I doubt there's any real usage of `super(someFunc)`, but that would need to be checked.
E.g., if implemented in JS, it could check `new.target`, and if it isn't `Array`, treat the first argument as a thunk (if it's a function).
```js
class Array {
  #createInstance = () => new Array();
  constructor(value) {
    const initialSize = typeof value === 'number' ? value : 0;
    // ... existing length-initialization behavior using initialSize ...
    if (new.target !== Array && typeof value === 'function') {
      this.#createInstance = value;
    }
  }
}
```
Mind you, I'm still not super convinced of the need for Type III; it was mostly an idea for Type II to wire data back into the new instance (e.g. setting up private fields on the transformed instance without needing to override every single method).
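That data-wiring idea can be simulated in userland. Here's a sketch using a hypothetical `Base` container (standing in for `Array`, which today can't take a thunk) to show how a thunk passed to `super()` lets an inherited method vend a subclass instance with its private fields already wired up:

```javascript
// Hypothetical simulation: subclasses hand the base class a construction
// thunk via super(); inherited methods use that thunk to vend new
// instances, with no this.constructor / @@species lookup involved.
class Base {
  #createInstance = () => new Base();
  #items = [];
  constructor(create) {
    if (new.target !== Base && typeof create === 'function') {
      this.#createInstance = create;
    }
  }
  push(...vals) { this.#items.push(...vals); return this; }
  map(fn) {
    const out = this.#createInstance();
    out.#items = this.#items.map(fn);
    return out;
  }
}

class Tagged extends Base {
  #tag;
  constructor(tag) {
    super(() => new Tagged(this.#tag)); // thunk wires private data back in
    this.#tag = tag;
  }
  get tag() { return this.#tag; }
}

const t = new Tagged("demo").push(1, 2, 3);
const mapped = t.map(x => x * 2);
console.log(mapped instanceof Tagged); // true
console.log(mapped.tag);               // "demo" -- private field carried over
```

The key point is that `Tagged` never has to override `map` to keep `#tag` intact; the thunk captured at construction time does the wiring.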
I can see and support the arguments for removing Type III and Type IV subclassing. But removal of Type II subclassing, IMO, significantly neuters the idea of subclassing altogether. I would expect to abandon any efforts to use subclassing myself if Type II were removed.
The fact that the decision criteria are largely based around security and web-compat is relevant, but also (IMO) short-sighted. Not enough weight is given to the coherence of a feature and to typical developer intuition -- which only gets a passing mention in this proposal. The "Exit" section at the end concludes that removal of Type II would only be cancelled if it proved web-incompatible to remove, leaving no apparent room for preserving it on its own merits (despite its cost).
I concur with the proposal that Type II is strongly aligned with developer intuition, and I also (as mentioned in another previous thread) have built code that relies on it. Thus, its merits are at least that it's what a lot of developers expect, and it's an assumption developers rely on when they build code. Whether it's web-compatible to remove or not, it should at least get a vigorous debate and defense rather than starting from the conclusion that it should be removed if possible.
Discarding Type II (taking that away) after 6 years because it tends to make implementations hard to secure and/or optimize is a much more drastic ask than removing the Type III and Type IV variants, which (by comparison, in my estimation) are far less common or intuitively expected.
I think this proposal should perhaps be split into two: one for removal of Type III and IV, which I wouldn't argue against, and a separate one for the removal of Type II, which I think should be more contested.