Open dead-claudia opened 3 years ago
Yeah, this is very much a pain point.
Note the original code #{...rec, a.b[c].d: a.b[c].d + 1}
already has the bug; the right code might be:
#{...rec, a.b[c].d: rec.a.b[c].d + 1}
Such a bug actually shows a defect of the current syntax: it seems the current syntax has no potential to be extended to solve the pain point.
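For reference, without any new syntax the corrected update has to be spelled out with nested spreads, repeating the path at every level. A rough sketch, assuming a.b is a tuple here (so Tuple.prototype.with from the proposal is used for the [c] level):
#{
  ...rec,
  a: #{
    ...rec.a,
    b: rec.a.b.with(c, #{
      ...rec.a.b[c],
      d: rec.a.b[c].d + 1
    })
  }
}
Every level of the path has to be repeated, which is exactly the duplication this issue is about.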
Nor do objects. Why would we want to solve this piecemeal?
@ljharb I don't think anyone said or implied anything about how it'd be solved. The proposal itself is unnecessarily leading in that direction, I agree, but I'm not sure this particular issue is a symptom of that.
@isiahmeadows the proposal's name is "for record", so i think it's reasonable to assume that's all it's trying to solve.
Nor do objects. Why would we want to solve this piecemeal?
As I understand it, the general purpose of the proposal is to solve the ergonomic problem of updating tuples/records. Deep nesting is the problem, so a deep path is introduced; but repeating a long path is also an ergonomic problem, and as this issue shows, confusing the path a.b.c with the expression a.b.c would also be a serious ergonomic problem and tends to cause bugs.
Those problems of course also apply to objects.
Though there is a difference. IMO, object literals in many cases do not need an updating style like o = {...o, x: o.x+1}; instead we just use:
o.x++
foo({...o}) // or deepClone(o) if there are nested levels
PS. This also makes me think maybe the best solution to the ergonomic problem of updating is to just allow rec.x++ directly.
let rec1 = #{a:1, b:2}
const rec2 = rec1
assert(rec1 === rec2)
rec1.a = 2
++rec1.b
assert(rec1 !== rec2)
assert(rec1 === #{a: 2, b: 3})
rec2.a++ // throws, because rec2 is const
Allowing rec.x++ could seem weird at first glance, but it's just how most other mainstream programming languages deal with value types.
The key point is: assignment (or passing to another function) would always use copy semantics for value types.
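A sketch of what that copy semantics could look like (hypothetical: the current proposal does not allow mutating records, and the names here are made up):
function bump(r) {
  r.x++ // mutates only the callee's own copy
  return r
}
let rec = #{x: 1}
const bumped = bump(rec) // rec is copied into the parameter
assert(rec === #{x: 1}) // the caller's value is unchanged
assert(bumped === #{x: 2})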
I found there are two conceptual models which may help in understanding mutating a record/tuple directly.
- Swift way (also the way of languages which use value type semantics by default, e.g. C++)
Don't think of records as immutable objects. They are values held by variables; when you mutate any part of a record, the whole value (and every value along the path) has changed. So the variable needs to be let (or var) to allow the change.
- Kotlin way
Still think of records as immutable objects, and just treat rec.a.b++ as syntax sugar for:
(() => {
  const _temp = rec.a.b
  rec = #{...rec, a: #{...rec.a, b: _temp + 1}}
  return _temp
})();
Note that the real transformation in transpilers like Babel would be more complex, because it needs to determine at which level the record/tuple part of the path starts.
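For example (hypothetical names), if obj is a plain mutable object whose obj.rec property holds a record, only the record part of the path can be rewritten into spreads, and the result has to be assigned back to the nearest object property:
obj.rec.a.b++
// would need to compile to roughly:
obj.rec = #{...obj.rec, a: #{...obj.rec.a, b: obj.rec.a.b + 1}}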
Though there is a difference. IMO, object literals in many cases do not need an updating style like o = {...o, x: o.x+1}; instead we just use:
o.x++
return {...o} // or deepClone(o) if there are nested levels
JS lacks such a hypothetical deepClone API, though. Also, the copy has to happen before the update, not after (your example is wrong), and there could very well be mutable class instances within the list of properties that you probably don't want to clone blindly. (Yes, structured clone could address maps and sets, but it won't address custom class instances.) So there still is a use case for it even despite that, as it's not doing a full clone, only enough of one to update the inner value.
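In plain-object terms, that "only enough of a clone" is a path copy: only the objects along the updated path are recreated, and every other property, including class instances, stays shared by reference. A small sketch with made-up names:
const next = {
  ...state,
  a: {...state.a, b: state.a.b + 1},
}
// next !== state and next.a !== state.a (copied along the path),
// but e.g. next.logger === state.logger: untouched properties are not cloned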
@isiahmeadows I wrote the code example to illustrate a different style; I'm sorry it caused confusion. What I'm trying to describe is: instead of avoiding mutation, another (maybe much more common) style is to just mutate and copy when needed. The return was a bad example, so I changed it to foo(o). The full comparison is:
// immutable object style
let o = immutable({x:1})
foo1(o)
o = immutable({...o, x: o.x+1})
foo2(o)
vs
// mutable style
let o = {x:1}
foo1(copy(o))
o.x++
foo2(copy(o))
What I feel is: currently, record/tuple is more like syntax sugar for the immutable object style,
let o = #{x:1}
foo1(o)
o = #{...o, x: o.x+1}
foo2(o)
But if we consider it as a (mutable) value type, we just get:
let o = #{x:1}
foo1(o) // o is a value type, so always copied
o.x++
foo2(o) // o is a value type, so always copied
JS lacks such a hypothetical deepClone API ... that you probably don't want to clone blindly ...
Yeah, this is why I only talk about object literals. It's trivial to write deepClone for object literals (simple objects), but impossible to write deepClone for complex objects. Actually, it's also impossible to write immutable(o) for complex objects.
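A minimal deepClone sketch for the simple-objects case only (plain objects, arrays, and primitives); it makes no attempt to handle class instances, Maps, Sets, functions, or cycles, which is where a generic version becomes impossible:
function deepClone(value) {
  if (Array.isArray(value)) return value.map(deepClone)
  if (value !== null && typeof value === "object") {
    return Object.fromEntries(
      Object.entries(value).map(([key, inner]) => [key, deepClone(inner)])
    )
  }
  return value
}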
🤔 What about just building immer.js in?
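For comparison, this is roughly how a deep update reads with immer in userland today (state is assumed to be a plain object/array tree, since immer works on objects rather than records):
import {produce} from "immer"
const next = produce(state, draft => {
  draft.a.b[c].d += 1
})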
Allowing rec.x++ could seem weird at first glance but it's just how most other mainstream programming languages deal with value types. The key point is: assignment (or passing to other functions) would always use copy semantics for value types.
let a = 1;
let b = a;
assert(a == b);
a = 2;
assert(a != b); // a == 2, b == 1
let a = #{ c: 1 };
let b = a;
assert(a == b);
a.c = 2;
assert(a != b); // a == #{ c: 2 }, b == #{ c: 1 }
const a = 1;
a = 2; // error
const a = #{ c: 1 };
a.c = 2; // error
It is certainly interesting to consider: assignment to a record or tuple would be sugar for creating a new R/T and assigning it to the same binding.
function foo(a) {
let b = a;
a.c++;
assert(a == b); // what will happen to this if we pass a record in? It seems that we need a new keyword to mark record/tuple in TypeScript.
}
@magic-akari Good question. And I have to admit this is the dark side of allowing mutation of tuples/records directly. You cannot tell whether a == b, because the semantics of both === and a.c++ have already changed (for records). It's hard to say how bad it is. And a.c++ may not be the worst part. The worst part may be paths that mix objects and records, e.g. a.b.c.d++, where the change will "propagate" up to the topmost object property (or local binding). C++ has two operators (-> for member access through a pointer), which makes code look like a->b.c.d, so we clearly know the boundary between value and reference; but we have to stick with . for interchangeability of objects and records.
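A sketch of that propagation (hypothetical mutation semantics, made-up bindings), where a and a.b are plain objects and a.b.c is a record:
const alias = a.b // object reference, shared
const snapshot = a.b.c // record value, copied
a.b.c.d++ // sugar for: a.b.c = #{...a.b.c, d: a.b.c.d + 1}
assert(alias.c.d === a.b.c.d) // the alias sees the new record through a.b
assert(snapshot.d === a.b.c.d - 1) // the copied record value did not change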
Personally I think the weirdness comes from the compound value type (which JS has never had before) and is unavoidable if we also want real ergonomics for updating. TS and IDEs could help mitigate the problem; for example, with type info, an IDE could display the code like a._b.c.d_ (with the b.c.d part underlined) so we could see which part of the path is the record.
It is certainly interesting to consider: assignment to a record or tuple would be sugar for creating a new R/T and assigning it to the same binding.
That is far too implicit for my liking. It doesn't intuitively look like it should change even though it does. And ===
to me represents effective identity, not mere equality.
That is far too implicit for my liking. It doesn't intuitively look like it should change even though it does. And === to me represents effective identity, not mere equality.
Yeah, it's too implicit. And === already means equality, not identity, for tuples & records. Value types do not have identity. So the implicitness and the "equality instead of identity" are consequences of introducing compound value types. If we really dislike that, we'd better change tuple/record to immutable objects rather than value types (though I don't think that really solves the problem) 😂
@hax I used "effective identity" to help clarify I wasn't referring to, say, a machine address or anything like that. I meant it more in the mathematical sense, where identity is typically determined by structure. Hope that helps!
@isiahmeadows It seems your "effective identity" is just my "identity". But I'm not sure what the difference is between your "identity" and "equality"?
Identity in JS is not determined by contents, unless you have a “value type” - where my mental model is, every 3 is the same instance of 3.
As I mentioned in https://github.com/tc39/proposal-record-tuple/issues/372#issuecomment-1494334597, I don't think we should overload the reference assignment operation depending on the type of the value behind the base reference.
However, an explicit new syntax which makes it clear that the base ref is updated when assigning to the reference may make sense.
Not a proposal for a specific syntax, but something along the lines of:
const foo1 = #{ bar: #{ baz: 123 } };
let foo2 = foo1;
foo2->bar.baz = 456;
assert(foo1.bar.baz === 123);
assert(foo1 !== foo2);
Should nested updates like
#{...rec, a.b[c].d: a.b[c].d + 1}
have a special sugar syntax to remove the duplication? While the above doesn't seem like much with single-letter names, longer variable names can make it get very unwieldy very quickly.
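For instance, with longer (hypothetical) names and the corrected right-hand side discussed above, the whole path still has to be written twice:
#{
  ...appState,
  session.preferences[paneIndex].fontSize:
    appState.session.preferences[paneIndex].fontSize + 1
}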