Hi! This will depend on your chunk size and data type. I suspect something is set a bit oddly, as usually 3.5 GB is enough for even very large data sets.
However, you can make individual tasks larger by setting memory_target=int(300e9) (300 GB, for example).
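For example, if you are creating the downsample tasks with igneous, the call would look roughly like this (a sketch only; the layer path is a placeholder):

```python
# Rough sketch, assuming the igneous/taskqueue downsampling workflow;
# the layer path is a placeholder.
import igneous.task_creation as tc
from taskqueue import LocalTaskQueue

layer_path = "file:///path/to/precomputed/layer"  # placeholder

tq = LocalTaskQueue(parallel=True)
tasks = tc.create_downsampling_tasks(
    layer_path,
    mip=0,                     # start from the highest-resolution mip level
    memory_target=int(300e9),  # allow each task ~300 GB instead of the ~3.5 GB default
)
tq.insert(tasks)
tq.execute()
```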
Thanks, it did solve the issue!
Hi, following on from https://github.com/seung-lab/cloud-volume/issues/635, I've now switched to a Linux workstation with 1 TiB of RAM, but I can't compute downscaled versions of a dataset; it fails with this > 3.5 GiB error:
This is the output of free -h:

Have I perhaps made a mistake on the line tq = LocalTaskQueue(parallel=True)?
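For context, the script follows the usual igneous downsampling pattern, roughly like this (a sketch; the layer path is a placeholder):

```python
# Sketch of the setup in question; the layer path is a placeholder.
import igneous.task_creation as tc
from taskqueue import LocalTaskQueue

layer_path = "file:///path/to/precomputed/layer"  # placeholder

tq = LocalTaskQueue(parallel=True)  # the line I suspect is wrong
tasks = tc.create_downsampling_tasks(layer_path, mip=0)
tq.insert(tasks)
tq.execute()
```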