Open justmars opened 4 years ago
I've run into a similar problem with Google Cloud Run: beyond a certain size of database file I find myself needing to run instances there with more RAM assigned to them.
I haven't yet figured out a method to estimate the amount of RAM needed to successfully serve a database file of a specific size; so far I've been using trial and error.
5GB is quite a big database file, so it doesn't surprise me that it may need a bigger instance. I recommend trying it on a Digital Ocean instance with 1GB or 2GB of RAM (their default is 512MB) and seeing if that works.
Let me know what you find out!
I think I tried the same db size on several different Digital Ocean instance configurations.
All of those attempts conked out with "out of memory" errors.
I wonder if setting a soft memory limit within Datasette would help here: https://www.sqlite.org/malloc.html#_setting_memory_usage_limits
> If attempts are made to allocate more memory than specified by the soft heap limit, then SQLite will first attempt to free cache memory before continuing with the allocation request.
https://www.sqlite.org/pragma.html#pragma_soft_heap_limit

> PRAGMA soft_heap_limit
> PRAGMA soft_heap_limit=N
>
> This pragma invokes the sqlite3_soft_heap_limit64() interface with the argument N, if N is specified and is a non-negative integer. The soft_heap_limit pragma always returns the same integer that would be returned by the sqlite3_soft_heap_limit64(-1) C-language function.
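For reference, here's a minimal sketch of how that pragma could be applied to every Datasette connection using the `prepare_connection` plugin hook. This is just an illustration of the idea, not a tested fix, and the 256 MB figure is an arbitrary example value, not a recommendation:

```python
# soft_heap_limit_plugin.py - a sketch, not a tested plugin
from datasette import hookimpl

# Illustrative value only: ask SQLite to keep its heap usage
# at roughly 256 MB before it starts freeing cache memory.
SOFT_HEAP_LIMIT_BYTES = 256 * 1024 * 1024


@hookimpl
def prepare_connection(conn):
    # Runs for each new SQLite connection Datasette opens;
    # sets the soft heap limit described in the SQLite docs above.
    conn.execute(f"PRAGMA soft_heap_limit={SOFT_HEAP_LIMIT_BYTES}")
```

Whether freeing cache memory under pressure is enough to keep a 5GB database servable on a small instance is still an open question, but it might be a cheap experiment to run.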