Closed: FNTwin closed this issue 2 months ago
Yes, I ran into this as well last night while debugging some other issues. It turns out that SQLite in-memory databases have a size limit (1 GB? https://sqlite.org/forum/forumpost/d2e3a2d862).
For now, you can work around this by manually specifying a path to a directory when you create the client: `PortalClient(..., cache_dir="some/directory")`. Just make sure there is enough disk space wherever you point it.
I am thinking of a better solution, but that should work. Let me know how it goes!
Thank you so much @bennybp for the quick workaround.
While trying to overcome this problem I also dug around the SQLite forum a bit, which is mostly why I decided to override the TMPDIR/SQLITE_TMPDIR folders, but maybe qcportal is caching files in a different way.
As it is now, I'm able to proceed with reading all the records.
Again, thanks a lot for the help!
While reading the records of a computed dataset, I'm getting a `sqlite3.OperationalError: database or disk is full`. I'm pretty sure there is enough disk space on the cluster I'm using, and I tried changing both the `TMPDIR` and `SQLITE_TMPDIR` variables to point away from the default tmp folder just to be sure, but I keep getting the error. More importantly, the dataset entries I'm trying to read are subsets of the entire dataset: I decomposed the big dataset into multiple parts of 200k entries each (dataset_1, dataset_2, ...), and this bug seems to appear only when reading from one of those subsets.
Traceback of the sqlite3 error:
Do you know what could cause this issue?