That's because [] !== []. A JavaScript object value doesn't store the object itself, but a reference (pointer) to it, so if the references are different, the arrays are different and unique.
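A minimal sketch of that reference semantics (not from the thread, just for illustration):

// Two separate array literals create two distinct objects,
// so a Set treats them as two unique values.
const x = [];
const y = [];
console.log(x === y); // false: different references
console.log(new Set([x, y]).size); // 2
// The same reference passed twice IS deduplicated.
console.log(new Set([x, x]).size); // 1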
That's clear. But it violates the concept of a Set itself.
How so?
Not saying this is the definition, but if we take this one:
Set objects are collections of values. You can iterate through the elements of a set in insertion order. A value in the Set may only occur once; it is unique in the Set's collection
Then it isn't violating the concept of a Set, since the values are unique: you are passing in JavaScript references to objects by design, and each [] literal creates a new object. Those references are different no matter how you look at them, so the Set should contain both.
What would happen if that weren't true and we wrote this instead?
const a = [];
const b = [];
const c = new Set([a, b]);
a.push(2);
b.push(1);
// c = [ [1] ]? or c = [ [2] ]? Which would you expect? Or do you expect c = [ [] ]?
And what if we move the definition of c lower?
const a = [];
const b = [];
a.push(2);
b.push(1);
const c = new Set([a, b]);
// c = [ [2], [1] ];
Why would moving the point at which the Set is created have an impact on the references saved? The current behavior makes way more sense.
This would be true if new Set kept a reference to the original array and tracked its changes, but it does not. You want to get a set of unique values here and now, not for a future that takes their possible changes into account. Imagine, for example, that you have an array of arrays, each containing the longitude and latitude of some point on a map (two floats): you can't build an array of unique values with Set in this case.
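To make that concrete, a minimal sketch of the failure being described (the coordinate values are made up for illustration):

// Two structurally identical points, stored as [longitude, latitude].
const points = [
  [30.52, 50.45],
  [30.52, 50.45],
];
// Set compares object values by reference, so nothing is deduplicated.
console.log([...new Set(points)]); // [ [ 30.52, 50.45 ], [ 30.52, 50.45 ] ]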
and tracked its changes, but it does not.
Can you explain this? That would seem to contradict your statement.
And second, just because it can't be used to find the unique points on an x/y coordinate plane in YOUR representation has nothing to do with whether or not the functionality is expected.
For instance, if you just change the initial setup, it works fine. Instead of what you said, say this:
Imagine, for example, that you have an array of strings, each of which contains the longitude and latitude of some point on a map.
Then you CAN create a Set which removes duplicates.
warren@palladium:~$ node
> const a = 'x=0;y=0';
undefined
> const b = 'x=0;y=0';
undefined
> new Set([a, b]);
Set { 'x=0;y=0' }
>
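Generalizing that session, here is one way (a sketch, assuming the points can be encoded losslessly as strings) to apply the same trick to the earlier coordinate example:

// Encode each point as a string so Set can compare by value.
const points = [
  [30.52, 50.45],
  [30.52, 50.45],
];
const keys = points.map(([x, y]) => `x=${x};y=${y}`);
console.log([...new Set(keys)]); // [ 'x=30.52;y=50.45' ]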
1) The Set holds references to the elements of the array from which it was created, but it does not track changes to that array itself:
const a = [];
const b = [];
const c = [];
const data = [a, b];
const set = new Set(data);
console.log({ set, data }); // { set: Set(2) { [], [] }, data: [ [], [] ] }
data.push(c);
console.log({ set, data }); // { set: Set(2) { [], [] }, data: [ [], [], [] ] }; the Set did not pick up c
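The first half of that point can be observed too: since the Set stores references to its elements, mutating an element is visible through the Set, even though appending to the source array is not. A minimal sketch:

const a = [];
const data = [a];
const set = new Set(data);
a.push(42);
console.log(set.has(a)); // true: the Set holds a reference to a
console.log([...set]); // [ [ 42 ] ]: the mutation is visible
data.push([1]);
console.log(set.size); // 1: the Set never sees the element added later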
2) You persistently refuse to notice that Set does not work properly with composite data types; at the least, its behavior is unexpected in this case. It was never about solving this problem in principle, which is easy. It's that you cannot solve it this way without resorting to additional transformations of the source data:
const a = [...new Set([ [], [] ])]; // a = [ [], [] ];
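One common transformation of that kind (an assumption about what is meant here, not code from the thread) is to serialize each element, deduplicate the strings, and parse them back:

// Deduplicate composite values by serializing them first.
// Caveat: JSON.stringify is key-order-sensitive for objects and
// cannot handle functions, undefined values, or cyclic structures.
const dedupe = (arr) =>
  [...new Set(arr.map((v) => JSON.stringify(v)))].map((s) => JSON.parse(s));
console.log(dedupe([[], []])); // [ [] ]
console.log(dedupe([[30.52, 50.45], [30.52, 50.45]])); // [ [ 30.52, 50.45 ] ]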