efficient / libcuckoo

A high-performance, concurrent hash table

Cannot handle a very large hash table #45

Closed. ShelsonCao closed this issue 8 years ago.

ShelsonCao commented 8 years ago

Hi, my job requires building a very large hash table (more than 1 billion entries), so I changed line 20 of examples/count_freq.cc to "const size_t total_inserts = 1000000000;", and the program reports "std::bad_alloc". Does libcuckoo support a concurrent hash table of this size for the count-frequency task? Looking forward to hearing from you. Regards, Shelson

manugoyal commented 8 years ago

Usually that error occurs when the machine runs out of memory while trying to resize the table. 1 billion items is a lot, so first check that your machine has enough memory to hold that many elements at all. Also remember that the hash table carries some memory overhead of its own: with 1 billion elements at, say, 8 bytes each, the table would need on the order of 8 GB of memory for the elements alone.
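As a rough sketch (not from this thread): pre-sizing the table with reserve() avoids repeated resizes during the insert loop. While the table grows, the old and new bucket arrays coexist, so resizing a nearly full multi-gigabyte table can transiently need roughly twice the memory, which is a common way to hit std::bad_alloc. The loop below mirrors the upsert-based counting pattern from count_freq.cc; the key/value types are placeholders, and in newer libcuckoo releases the class lives in the libcuckoo namespace.

```cpp
#include <cstdint>
#include <iostream>
#include <libcuckoo/cuckoohash_map.hh>

int main() {
    // As in the issue: ~1 billion insertions.
    const size_t total_inserts = 1000000000;

    cuckoohash_map<uint64_t, uint64_t> table;

    // Reserve capacity up front so the insert loop never has to grow a
    // nearly full table mid-run (the resize itself is the usual source
    // of the std::bad_alloc reported above).
    table.reserve(total_inserts);

    for (uint64_t i = 0; i < total_inserts; ++i) {
        // Insert 1 for a new key, otherwise increment the existing count.
        table.upsert(i, [](uint64_t &count) { ++count; }, 1);
    }

    std::cout << "final size: " << table.size() << "\n";
    return 0;
}
```

Even with the reservation, make sure the machine actually has enough RAM for the fully populated table plus overhead before running a workload of this size.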

But in theory, with enough memory, the hash table should be able to support such large workloads.