symisc / unqlite

An Embedded NoSQL, Transactional Database Engine
https://unqlite.symisc.net

Huge insert #50

Closed mayankgaur019 closed 6 years ago

mayankgaur019 commented 6 years ago

I am new to UnQLite. I don't know the right way to store a million records in one go. When I try to insert a million records, memory usage keeps growing until I call unqlite_close(). Any help or sample code for inserting millions of records would be appreciated.

symisc commented 6 years ago

Yes, call unqlite_commit() periodically, say after every 100000 insertions, so that the working buffers can be released.

mayankgaur019 commented 6 years ago

Thanks for replying. I am reading data from a file which contains millions of keys. So do I need to maintain a counter of lines, and once it reaches a maximum value, say 100000, call unqlite_commit()? Am I correct? If not, please tell me the exact way.

symisc commented 6 years ago

Yes, a simple loop counter should be fine. Once it reaches the threshold value (i.e. 100000 in our case), commit everything to disk and reset the counter back to zero.
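For illustration, here is a minimal C sketch of this pattern. The file name `keys.txt`, the one-key-per-line format, the placeholder value, and the error handling are assumptions for the example, not part of the thread:

```c
#include <stdio.h>
#include <string.h>
#include "unqlite.h"

int main(void)
{
    unqlite *pDb;
    FILE *pIn;
    char zKey[1024];
    int nPending = 0; /* insertions since the last commit */
    int rc;

    /* Hypothetical input file: one key per line. */
    pIn = fopen("keys.txt", "r");
    if (pIn == 0) return 1;

    rc = unqlite_open(&pDb, "bulk.db", UNQLITE_OPEN_CREATE);
    if (rc != UNQLITE_OK) { fclose(pIn); return 1; }

    while (fgets(zKey, sizeof(zKey), pIn)) {
        zKey[strcspn(zKey, "\r\n")] = 0; /* strip trailing newline */
        /* Placeholder value; real code would store the actual payload. */
        rc = unqlite_kv_store(pDb, zKey, -1, "value", 5);
        if (rc != UNQLITE_OK) break;
        if (++nPending >= 100000) {
            /* Threshold reached: flush working buffers to disk,
             * then reset the counter back to zero. */
            rc = unqlite_commit(pDb);
            if (rc != UNQLITE_OK) break;
            nPending = 0;
        }
    }

    fclose(pIn);
    /* unqlite_close() commits any remaining records automatically. */
    unqlite_close(pDb);
    return rc == UNQLITE_OK ? 0 : 1;
}
```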

mayankgaur019 commented 6 years ago

I did the same thing. I have a file with more than a million (10 lakh) keys, but it is still taking approximately 400MB of memory.

symisc commented 6 years ago

400MB seems correct for a million keys. If you disable transactions, memory usage should decrease.
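As an aside not spelled out in the thread: UnQLite's public API lets you omit journaling, and with it transaction support, at open time via the UNQLITE_OPEN_OMIT_JOURNALING flag. A hedged sketch, with the trade-off noted in the comment:

```c
/* Sketch: open with journaling omitted, which disables transactions and
 * should reduce memory usage; a crash mid-write may corrupt the database. */
rc = unqlite_open(&pDb, "bulk.db",
                  UNQLITE_OPEN_CREATE | UNQLITE_OPEN_OMIT_JOURNALING);
```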