awb99 opened this issue 2 years ago
Konserve does not have a batch-processing framework yet (e.g. map-reduce); mostly it is used as a low-level storage-layer interface for simple needs. Whenever you need to do selective but large update operations on its values, or sophisticated queries, it is beneficial to use Datahike or another index-based data management solution, e.g. by rolling your own hitchhiker-tree or persistent sorted set index. What these indices basically do for you is automatically balance the number of read and write IO operations you have to perform. In your case you might be able to put many more invoices than one into each konserve value and keep them in memory most of the time.
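A minimal sketch of that idea, assuming a file-system store and konserve's channel-returning API (the `:invoices` key, the store path and the map layout are just illustrative):

```clojure
(ns invoices.batch
  (:require [clojure.core.async :refer [<!!]]
            [konserve.core :as k]
            [konserve.filestore :refer [new-fs-store]]))

;; assumption: a file-system backed store; the path is illustrative
(def store (<!! (new-fs-store "/tmp/invoices-store")))

;; Build one map of all invoices in memory, keyed by invoice number,
;; and write it with a single konserve operation instead of 5000.
(defn save-all-invoices!
  [invoices]                                   ; invoices: {n invoice-n, ...}
  (<!! (k/assoc-in store [:invoices] invoices)))

;; Reading or updating a single invoice then only touches that one value.
(defn get-invoice [n]
  (<!! (k/get-in store [:invoices n])))

(defn upsert-invoice! [n invoice]
  (<!! (k/update-in store [:invoices] #(assoc % n invoice))))
```

Whether one big value or a handful of chunked values works best depends on how large the invoices are and how often you update them; the point is simply to trade many small writes for a few larger ones.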
I tried to have this answered in the Datahike Slack chat, but nobody knew the answer. I want to store 5000 values, each under its own key in the konserve store. From the documentation I am not able to determine how this works. Currently I call (save [:invoice n] invoice-n) 5000 times, and this takes about 30 minutes. This is my save function:
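Roughly like this (simplified sketch; the exact code is not reproduced here, and the file store and blocking take on the returned channel are assumptions):

```clojure
(ns invoices.store
  (:require [clojure.core.async :refer [<!!]]
            [konserve.core :as k]
            [konserve.filestore :refer [new-fs-store]]))

;; assumption: a file-system backed store at an illustrative path
(def store (<!! (new-fs-store "/tmp/invoices-store")))

(defn save
  "Write a single value under the given key path and block until the write completes."
  [key-path value]
  (<!! (k/assoc-in store key-path value)))

;; called 5000 times, once per invoice:
;; (save [:invoice n] invoice-n)
```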
How would I do it if I want konserve to update it all in one go? Do I have to do this via core.async primitives, or do I have to call (apply k/assoc-in) with multiple key/value pairs?
Thanks a lot. In case there is documentation for this and I just didn't find it, I am sorry :-)