I have a large number of key/value pairs, around twenty million, to
insert into a dense_hash_map. I noticed that several re-allocations
happen during insertion, and the peak memory allocation is double
what is actually required.
To reduce the number of re-allocations and gain efficiency, I'm
considering preallocation.
So I wrote:
#include <sparsehash/dense_hash_map>  // older releases install it as <google/dense_hash_map>

google::dense_hash_map<K, V> m;  // K/V are integer-like types in my case
m.resize(20000000);              // request room for ~20M entries up front
m.set_empty_key(0);
m.set_deleted_key(-1);
to do the preallocation. After the call to set_empty_key I saw a single
large allocation of about 700 MB, so far so good.
But after the first few (say 4 to 5) insert/erase calls, that big memory
block gets freed, and dense_hash_map goes back to re-allocating step by
step until it reaches 1.2 GB (with a peak of 2.3 GB).
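Here is a minimal sketch of the pattern I am describing. The key/value
types (int64_t), the synthetic data, and the exact insert/erase sequence
are simplifications of my real workload, and the header path may differ
depending on the sparsehash version:

#include <sparsehash/dense_hash_map>  // assumed path; may be <google/dense_hash_map>
#include <cstdint>
#include <iostream>

int main() {
  google::dense_hash_map<int64_t, int64_t> m;
  m.resize(20000000);     // preallocation request for ~20M entries
  m.set_empty_key(0);     // after this call I see the single ~700 MB allocation
  m.set_deleted_key(-1);  // needed because I also call erase()

  // A handful of early insert/erase calls; after these the preallocated
  // block appears to be released again.
  for (int64_t i = 1; i <= 5; ++i) {
    m[i] = i;
    m.erase(i);
  }

  // Bulk insertion of ~20M pairs; the table now re-grows step by step,
  // peaking well above the final working-set size.
  for (int64_t i = 1; i <= 20000000; ++i) {
    m[i] = i;
  }

  std::cout << "size = " << m.size()
            << ", bucket_count = " << m.bucket_count() << "\n";
  return 0;
}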
What should I do, please?
Original issue reported on code.google.com by AmanJI...@gmail.com on 25 Jun 2014 at 5:43