Open hiqsociety opened 2 years ago
any memory usage compared with the rest?
how does it compare with https://github.com/Tessil/hat-trie ?

Hi, this is very much an "it depends" question: it depends on what kind of data we add, how much data we add, whether deletes happen in the middle of the insertion process, and so on.
In general, the memory picture ranges from a 2-3x overhead down to actual compression for some kinds of keys: https://raw.githubusercontent.com/Bazist/HArray/master/Images/functionality2.png
I didn't compare it with hat-trie, but I think hat-trie will compress string keys more effectively.
HArray is designed for binary data, as a balanced structure between insert/delete/update performance and memory usage for large datasets (up to billions of keys in one container). Compared to hat-trie, it doesn't use any hash function, so it supports a set of analytical queries that are useful in database scenarios.
Here is a list of advantages: