Open ghost opened 1 year ago
Thank you for your finding and the minimal example! What do you mean by "Java implementation is right"? Do you suggest to use the hash algorithm used by Java's LinkedHashMap?
I mean Java's native hashCode implementation for HashMap returns different hash codes for that example. The problem I faced is that I had a grid modeled with a LinkedHashMap. The grid always had the same elements BUT in different positions. The current implementation always returns the same hashCode no matter the positions of the elements. Since I was storing the grids in a Set and all of them had the same hashCode, the performance was really bad...
I made a stab at this, but I'm not sure if it's the right way to "fix it".
@jarlah You can not change hashCode without changing equals. The result of these two methods must agree. The contract for these two methods is defined in java.lang.Object. So, always change them both, and you are good.
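To illustrate the contract being referenced (a toy example, not vavr code): if equals and hashCode are based on different state, hash-based collections silently stop finding "equal" objects.

```java
import java.util.HashSet;
import java.util.Objects;
import java.util.Set;

// A deliberately broken class: equals looks only at x, but hashCode
// also mixes in y, so objects that are equal can hash differently.
// This violates the java.lang.Object contract mentioned above.
final class BrokenPoint {
    final int x, y;
    BrokenPoint(int x, int y) { this.x = x; this.y = y; }

    // equals compares only x ...
    @Override public boolean equals(Object o) {
        return o instanceof BrokenPoint && ((BrokenPoint) o).x == x;
    }

    // ... but hashCode also uses y: equal objects may disagree on hash.
    @Override public int hashCode() { return Objects.hash(x, y); }
}

public class ContractDemo {
    public static void main(String[] args) {
        Set<BrokenPoint> set = new HashSet<>();
        set.add(new BrokenPoint(1, 1));
        // Equal by equals(), but hashed into a different bucket:
        System.out.println(set.contains(new BrokenPoint(1, 2))); // false
    }
}
```

This is why the two methods must always be changed together: `HashSet.contains` first locates a bucket via `hashCode`, and only then consults `equals`.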
I believe there is a contradiction between the API of the interface Traversable and the classes that implement it and their unit tests. (?)
The interface Traversable states that hashCode and equals are different for collections with predictable iteration order and for collections with arbitrary iteration order. Traversable.hashCode Traversable.equals
However, TreeMap and LinkedHashMap implement hashCode and equals for unordered collections: TreeMap.hashCode TreeMap.equals LinkedHashMap.hashCode LinkedHashMap.equals
Maybe I am just confused about the Javadoc in Traversable?
TreeMap.isOrdered() returns true, TreeMap.isSequential() returns false. I believe this makes TreeMap a collection with predictable iteration sequence.
LinkedHashMap.isOrdered() returns false, LinkedHashMap.isSequential() returns true. I believe this makes LinkedHashMap a collection with predictable iteration sequence.
@jarlah You can not change hashCode without changing equals. The result of these two methods must agree. The contract for these two methods is defined in java.lang.Object. So, always change them both, and you are good.
Yes, of course. What I did was just make a PR highlighting how easy it was to change the behaviour. But it is totally not ready.
@jarlah What do you think about the Javadoc in Traversable.hashCode and Traversable.equals? Is my interpretation that there is a contradiction correct? Or am I misinterpreting it?
I think vavr is doing itself a disservice by redefining such concepts. But I guess it's ok, because vavr's collections will only be used and compared with vavr collections.
I see your point @wrandelshofer that the doc talks about predictable iteration order and arbitrary iteration order. But I cannot say what it means, or whether you are right.
What I can say, however, is that to fix my equals in my PR I would need to sort both collections and compare the two sorted collections. I don't think that's very efficient, and I'm worried about the performance implications.
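The sort-and-compare idea mentioned above could look roughly like this (a sketch using java.util types, not the PR's actual code); the cost is the O(n log n) sort on every comparison, which is the performance worry.

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;

// Sketch: test two collections for equality regardless of iteration
// order by copying and sorting both, then comparing elementwise.
public class SortCompare {
    static <T extends Comparable<T>> boolean equalsIgnoringOrder(List<T> a, List<T> b) {
        if (a.size() != b.size()) return false;
        List<T> x = new ArrayList<>(a);
        List<T> y = new ArrayList<>(b);
        Collections.sort(x);   // O(n log n) per equals call
        Collections.sort(y);
        return x.equals(y);    // ordinary order-sensitive compare of sorted copies
    }

    public static void main(String[] args) {
        System.out.println(equalsIgnoringOrder(List.of("b", "a"), List.of("a", "b"))); // true
        System.out.println(equalsIgnoringOrder(List.of("a", "a"), List.of("a", "b"))); // false
    }
}
```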
Yes, I can not tell whether it is correct or incorrect. I would err on the side of the implementation, and assume that hashCode/equals depends on the collection type (Set/Seq/Map).
I - personally - would not go in the direction of your fix. It is convenient to be able to check sets and maps for equality/hashCode regardless of their iteration order. But I am not the designer. So, it is a viable design direction, of course.
I think vavr is doing itself a disservice by redefining such concepts. But I guess it's ok, because vavr's collections will only be used and compared with vavr collections.
For me - ideally - vavr collections could be swapped in and out with java.util collections, the only difference being that the vavr collections are persistent (immutable), and have no API methods that can throw UnsupportedOperationException.
I understand that the hash is computed via the underlying state, but - to me - the confusion arises in how those two hash maps - {"a": 1, "b": 2} and {"a": 2, "b": 1} - have the same underlying state? Wouldn't the keys and the values be associated with each other, regardless of any ordering concerns? In that case, why wouldn't the hash code calculation take the key-value relation into consideration?
Playing with this a little because I'm bored and stalling on my "real" work, I can confirm that the only thing that seems to matter for the hash code is that all the keys and values are included - ordering of the keys doesn't matter. So {"b": 1, "a": 2} would have the same hash code as the other examples. This also isn't an edge case for a 2-pair map; I wasn't expecting that it was, but I did it with 3 pairs with the same results.
It also doesn't appear to be a fluke where those permutations just happen to generate the same hash code - which would be unlikely, but there's no reason it would be impossible. It's acceptable - and even expected - that random states for a given object will have the same hash code, even if they're not equal; the important thing is that they have the same hash code when they are equal. So you could always return 1 as your hash code and you'd meet the contract requirements. You'd just have horrible performance if you used it as a key in a hash map. (And there's probably other places that use the hash code, but that's the obvious one at the moment.)
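The "always return 1" point above can be demonstrated directly: a constant hashCode satisfies the Object contract (equal objects still agree), but every element lands in the same bucket, so hash-based lookups degrade from near-constant time toward a scan of the whole collection.

```java
import java.util.HashSet;
import java.util.Set;

// A class with a legal but degenerate hashCode. The contract only
// requires that equal objects have equal hash codes; it says nothing
// about unequal objects, so returning a constant is allowed.
final class ConstantHash {
    final int value;
    ConstantHash(int value) { this.value = value; }
    @Override public boolean equals(Object o) {
        return o instanceof ConstantHash && ((ConstantHash) o).value == value;
    }
    @Override public int hashCode() { return 1; } // legal, but terrible for hashing
}

public class ConstantHashDemo {
    public static void main(String[] args) {
        Set<ConstantHash> set = new HashSet<>();
        for (int i = 0; i < 10_000; i++) set.add(new ConstantHash(i));
        // Still correct -- just slow: every add/contains probes one giant bucket.
        System.out.println(set.size());                          // 10000
        System.out.println(set.contains(new ConstantHash(42)));  // true
    }
}
```

This is essentially what the reported bug does to the issue author's grids: all distinct, all colliding.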
From @wrandelshofer:
@jarlah You can not change hashCode without changing equals. The result of these two methods must agree. The contract for these two methods is defined in java.lang.Object. So, always change them both, and you are good.
It's a little more subtle than this; in this particular case, where it appears that `hashCode` has an incorrect implementation but `equals` is correct, then by all means update `hashCode` and leave `equals` alone. The important thing is - like you said - they must agree, meaning that they're based on the same internal state. Since `equals` appears to take key-value relations into account, but `hashCode` doesn't, I'm going to say they're not based on the same state.
As an aside, while `equals` takes key-value pairs into consideration, it doesn't seem to take the order of those pairs into consideration, which I would expect from a "linked" collection. So {"a": 1, "b": 2} and {"b": 2, "a": 1} are considered equal, although they shouldn't be. (Given the earlier comments regarding `isSequential` and `isOrdered`, I believe the expected result should be that they're not equal.)
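For reference, `java.util.LinkedHashMap` is also order-insensitive here: it inherits `AbstractMap.equals`, which only checks that both maps hold the same key-value pairs, so insertion order is likewise ignored in the JDK.

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Two java.util.LinkedHashMaps with the same pairs in different
// insertion order: equal by the Map contract, with matching hash codes.
public class JdkLinkedEquals {
    public static void main(String[] args) {
        Map<String, Integer> m1 = new LinkedHashMap<>();
        m1.put("a", 1); m1.put("b", 2);
        Map<String, Integer> m2 = new LinkedHashMap<>();
        m2.put("b", 2); m2.put("a", 1);
        System.out.println(m1.equals(m2));                  // true: order ignored
        System.out.println(m1.hashCode() == m2.hashCode()); // true: contract holds
    }
}
```

So treating insertion order as significant for equality would diverge from the JDK's behavior for linked collections, not match it.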
I have an idea how to approach this, PoC: https://github.com/vavr-io/vavr/pull/2803
I looked at your Pull Request #2803.
Shouldn't this fix include a change in LinkedHashMap.hashCode()?
Because in this method we currently have:
`return Collections.hashUnordered(this);`
But this should be:
`return Collections.hashOrdered(this);`
Here's the explanation: https://github.com/vavr-io/vavr/pull/2803#issuecomment-2308450430
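For readers following along, the difference between the two hashing strategies under discussion can be sketched like this (this is an analogy to `java.util.List.hashCode` vs `java.util.Set.hashCode`, not vavr's actual implementation): an order-dependent hash folds each element into a running value, while an order-independent hash just sums element hashes.

```java
import java.util.List;

// Sketch of the two strategies. hashOrderedSketch distinguishes
// permutations; hashUnorderedSketch does not.
public class HashStrategies {
    static int hashOrderedSketch(List<?> elements) {
        int h = 1;
        for (Object e : elements) h = 31 * h + (e == null ? 0 : e.hashCode());
        return h;
    }

    static int hashUnorderedSketch(List<?> elements) {
        int h = 0;
        for (Object e : elements) h += (e == null ? 0 : e.hashCode());
        return h;
    }

    public static void main(String[] args) {
        List<String> ab = List.of("a", "b");
        List<String> ba = List.of("b", "a");
        System.out.println(hashOrderedSketch(ab) == hashOrderedSketch(ba));     // false
        System.out.println(hashUnorderedSketch(ab) == hashUnorderedSketch(ba)); // true
    }
}
```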
I see your explanation. The explanation is fine.
However, LinkedHashMap extends from Traversable. And Traversable defines the contract of the hashCode() Method.
Maybe, in your pull request, you should change the contract in Traversable. Because LinkedHashMap does not fulfill the contract.
Ok, that's a great point - I did not realize that the `Traversable` contract defined hashing rules. In this case, this is an easy choice - we need to obey the contract. Thanks!
Or, alternatively, move the contract for equals/hashCode out of Traversable into more specific interfaces.
In the Java JDK API, the Iterable and Collection interfaces do not define equals/hashCode. This happens only in more specific interfaces: in the List, Map, Set interfaces.
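The JDK point above can be shown concretely: equals/hashCode semantics are specified per abstraction (List, Map, Set), not on Collection, so any two Lists with the same elements in the same order are equal regardless of implementation class, while Lists remain order-sensitive.

```java
import java.util.ArrayList;
import java.util.LinkedList;
import java.util.List;

// equals is defined by the List contract, not by the concrete class.
public class JdkEqualsDemo {
    public static void main(String[] args) {
        List<String> arrayList = new ArrayList<>(List.of("a", "b"));
        List<String> linkedList = new LinkedList<>(List.of("a", "b"));
        System.out.println(arrayList.equals(linkedList));               // true: same List contract
        System.out.println(List.of("a", "b").equals(List.of("b", "a"))); // false: Lists are order-sensitive
    }
}
```

Moving vavr's equals/hashCode contracts from `Traversable` into Seq/Set/Map would mirror this design.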
We'll probably need to do both:
`Collections.hashOrdered`
Well, it's slightly more complicated, since an ordered hashcode breaks the equals/hashCode contract. Yes, this could be fixed by adjusting the `equals()` implementation, but this is a relatively big library that is used by many people who would not appreciate such a change, even in a major version. I need more data to make an informed decision. Stay tuned
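The contract problem being described can be made concrete with a toy model (not vavr code): if equals stays order-insensitive while hashCode becomes order-sensitive, two "equal" values hash differently, and hash-based collections stop finding them.

```java
import java.util.HashSet;
import java.util.List;
import java.util.Set;

// Toy model of the mismatch: order-blind equals, order-aware hashCode.
final class OrderBlindBag {
    final List<String> items;
    OrderBlindBag(List<String> items) { this.items = items; }

    // Order-insensitive equals: same elements in any order (toy version,
    // ignores duplicates -- just enough to show the contract break).
    @Override public boolean equals(Object o) {
        return o instanceof OrderBlindBag
            && ((OrderBlindBag) o).items.size() == items.size()
            && ((OrderBlindBag) o).items.containsAll(items);
    }

    // Order-SENSITIVE hashCode: with the equals above, equal bags
    // can produce different hash codes, violating the contract.
    @Override public int hashCode() { return items.hashCode(); }
}

public class ContractBreakDemo {
    public static void main(String[] args) {
        Set<OrderBlindBag> set = new HashSet<>();
        set.add(new OrderBlindBag(List.of("a", "b")));
        // Equal by equals(), but hashed into a different bucket:
        System.out.println(set.contains(new OrderBlindBag(List.of("b", "a")))); // false
    }
}
```

This is why changing only `LinkedHashMap.hashCode()` to an ordered hash, without also making `equals` order-sensitive, would trade one contract violation for another.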
I just noticed that LinkedHashSet also uses `Collections.hashUnordered()`.
https://github.com/vavr-io/vavr/blob/b1ce6080776758de8daf4ab2abdc88e4754235b8/src/main/java/io/vavr/collection/LinkedHashSet.java#L966-L968
Maybe the most sensible thing would be to change the Javadoc in `Traversable`.
System.out.println(LinkedHashMap.of("a", 1, "b", 2).hashCode());
System.out.println(LinkedHashMap.of("a", 2, "b", 1).hashCode());
Both maps have the same hashCode, and this is wrong. I found this bug while participating in Advent of Code; an algorithm was extremely slow because of it. Java's implementation is right
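For comparison, the same two maps built with `java.util.LinkedHashMap` do get different hash codes: the JDK hashes each entry as `key.hashCode() ^ value.hashCode()` and sums the results, so swapping which value belongs to which key changes the sum.

```java
import java.util.LinkedHashMap;
import java.util.Map;

// The same minimal example with java.util maps: the key->value binding
// affects the hash, so these two maps hash differently.
public class JdkHashDemo {
    public static void main(String[] args) {
        Map<String, Integer> m1 = new LinkedHashMap<>();
        m1.put("a", 1); m1.put("b", 2);
        Map<String, Integer> m2 = new LinkedHashMap<>();
        m2.put("a", 2); m2.put("b", 1);
        System.out.println(m1.hashCode());
        System.out.println(m2.hashCode());
        System.out.println(m1.hashCode() == m2.hashCode()); // false: different bindings
    }
}
```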