Closed by adamreichold 2 years ago
As an example, storing all of govdata, doris-bfs and stadt-leipzig takes about 250 MB (which already rounds each dataset up to page size). The index with compression is 15 MB; without compression it is 16 MB. Typical cloud machines have tens of GB of memory.
This is a counterpoint to #16: instead of applying compression to the datasets as well, it disables compression for the index. This is reasonable insofar as both our index and our datasets currently fit comfortably into the main memory of our server, which means compression does not actually avoid any additional IO operations and instead needlessly decompresses page cache contents on every access.
Fixes #3