
Automatically exported from code.google.com/p/sparsehash

dense_hash_map fails for large data sets #47

Closed by GoogleCodeExporter 9 years ago

GoogleCodeExporter commented 9 years ago
What steps will reproduce the problem?
1. Populate a dense_hash_map<const char*, const char*> with 30,000 entries.
2. Each key is a 21-character string (e.g. "ODBC456789.12/45-BA91").
3. Call map.find() with one of the inserted keys.

What is the expected output? What do you see instead?
Calling map.find() on a key I know was inserted should return an iterator to the entry; instead, find() returns map.end().

What version of the product are you using? On what operating system?
sparsehash-1.5.2 (dense_hash_map) on MSVC 7.1.6030

Please provide any additional information below.
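
A minimal sketch of the failing pattern (hypothetical code, not from the report; the default equality functor, std::equal_to<const char*>, compares pointer values, so a lookup through a different buffer than the one inserted fails even if the hash happens to match):

    #include <cstring>
    #include <google/dense_hash_map>

    int main() {
      // Default functors: equality compares addresses, so only the exact
      // inserted pointer ever matches.
      google::dense_hash_map<const char*, const char*> m;
      m.set_empty_key(NULL);       // required before any insert

      m["ODBC456789.12/45-BA91"] = "value";    // key: the literal's address

      char buf[32];
      std::strcpy(buf, "ODBC456789.12/45-BA91");
      return (m.find(buf) == m.end()) ? 1 : 0;  // returns 1: not found
    }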

Original issue reported on code.google.com by rags.iye...@gmail.com on 11 Dec 2009 at 9:26

GoogleCodeExporter commented 9 years ago
Also, map.size() seems to have a ceiling of ~844.
I tried resize(100000) at startup but this only affects the load factor and has no bearing on the size.

Is there a way of increasing the number of buckets?
I'm only concerned with fast lookups; I don't really care about memory usage.

Original comment by rags.iye...@gmail.com on 11 Dec 2009 at 9:36

GoogleCodeExporter commented 9 years ago
You need to specify a hash function for char*.  Otherwise, you're hashing pointers, which will not work as you want.  See the first example in
   http://google-sparsehash.googlecode.com/svn/trunk/doc/dense_hash_map.html
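
A sketch of the fix (hypothetical code: the comparator follows the eqstr example in the doc linked above, while the FNV-1a hash functor is just one workable string hash among many):

    #include <cstring>
    #include <iostream>
    #include <google/dense_hash_map>

    // Hash the characters of the string, not the pointer value.
    // (FNV-1a, chosen here for brevity; any string hash works.)
    struct CStrHash {
      size_t operator()(const char* s) const {
        size_t h = 2166136261u;
        for (; *s; ++s) {
          h ^= static_cast<unsigned char>(*s);
          h *= 16777619u;
        }
        return h;
      }
    };

    // Compare string contents, not pointer values.
    struct CStrEq {
      bool operator()(const char* a, const char* b) const {
        return (a == b) || (a && b && std::strcmp(a, b) == 0);
      }
    };

    typedef google::dense_hash_map<const char*, const char*,
                                   CStrHash, CStrEq> StrMap;

    int main() {
      StrMap m;
      m.set_empty_key(NULL);   // required before any insert
      m["ODBC456789.12/45-BA91"] = "value";

      char buf[32];
      std::strcpy(buf, "ODBC456789.12/45-BA91");
      StrMap::const_iterator it = m.find(buf);  // different pointer, same contents
      if (it != m.end())
        std::cout << it->second << std::endl;   // found
      return 0;
    }

Note that the map stores raw pointers and does not copy the strings; the caller must keep the key strings alive for the lifetime of the map.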

} I tried resize(100000) at startup but this only affects the load factor and has no bearing on the size.

That's not correct.  It actually does affect the size, but not the load factor.

You can also try using the one-arg constructor where you pass in the size, rather than having to call resize after creating the object.
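
For example (reusing the hypothetical StrMap typedef sketched above):

    StrMap m(100000);        // expected item count, passed at construction
    m.set_empty_key(NULL);
    // ...same effect as default-constructing m and then calling m.resize(100000).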

Original comment by csilv...@gmail.com on 11 Dec 2009 at 9:58