Closed tonyalaribe closed 7 years ago
Hi Tony,
How does buntdb perform on data too large to fit into RAM? Would I get good performance with a dataset of over 5 GB (a mongodb dump) on a server with 500 MB or 1 GB of RAM?
Assuming that the server has virtual memory, it'll start paging to disk when all the RAM is used up. This could cause the Go app to run very slowly, 10x to 100x slower than reading directly from RAM, depending on disk speed. If the server is not using virtual memory, then the Go program will panic with an out-of-memory error.
I don't recommend that you do this. If memory is an issue, then use an on-disk database like BoltDB or leveldb. For BuntDB, to be safe, the server should have at least 25% more memory than the in-memory working dataset. I suggest loading the data into a test BuntDB program and seeing how much memory it uses.
Can I query with an array of keys, or do I have to get each key from the array individually?
You'll need to query each key individually.
Thanks Josh. You helped me clear my mind about what to use. And when buntdb is a good choice.
You're welcome, and best of luck.
I'm currently evaluating BuntDB against BoltDB as a replacement for a system that currently runs on mongodb. I intend to use bleve for indexing and search as well. Can I query with an array of keys, or do I have to get each key from the array individually?