seung-lab / igneous

Scalable Neuroglancer compatible Downsampling, Meshing, Skeletonizing, Contrast Normalization, Transfers and more.
GNU General Public License v3.0

Memory limit of 3.5 GiB on 965 GiB RAM device #180

Closed: chourroutm closed this issue 1 month ago

chourroutm commented 1 month ago

Hi, following on from https://github.com/seung-lab/cloud-volume/issues/635, I've switched to a Linux workstation with 1 TiB of RAM, but I still can't compute downscaled versions of a dataset; the run emits this warning about a 3.5 GiB memory limit:


from taskqueue import LocalTaskQueue
import igneous.task_creation as tc
from pathlib import Path

# Create the output directory and convert it to a file:// URI.
output_dir = Path("output/mask.precomputed/")
output_dir.mkdir(parents=True, exist_ok=True)

output_dir = output_dir.absolute().as_uri() + "/"

print(output_dir)

tq = LocalTaskQueue(parallel=True)
tasks = tc.create_downsampling_tasks(output_dir, mip=0, num_mips=3, factor=(2, 2, 2), fill_missing=True, delete_black_uploads=True)
tq.insert(tasks)
tq.execute()
print("Done!")
This is the console output:

file:///home/me/convert_with_cloudvolume/output/mask.precomputed/
WARNING: Memory limit (3500000000 bytes) too low to compute 3 mips at a time. 2 mips possible.
Volume Bounds:  Bbox([0, 0, 0],[1169, 1169, 1345], dtype=np.int32, unit='vx')
Selected ROI:   Bbox([0, 0, 0],[1169, 1169, 1345], dtype=np.int32, unit='vx')

This is the output of free -h:

               total        used        free      shared  buff/cache   available
Mem:           1.0Ti        26Gi       965Gi       1.0Gi        15Gi       974Gi
Swap:          476Gi          0B       476Gi

Have I made a mistake on the line tq = LocalTaskQueue(parallel=True)?

william-silversmith commented 1 month ago

Hi! This will depend on your chunk size and data type. I suspect something is configured a bit oddly, as 3.5 GB is usually enough for even very large datasets.
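To make that concrete, here is a minimal back-of-the-envelope sketch of how per-task memory scales; the task shape and dtype below are hypothetical placeholders, not values read from your dataset. With a 2x2x2 factor, each successive mip is 1/8 the size of the level above it, so the mips form a geometric series on top of the source region.

# Hedged sketch: rough per-task memory estimate.
# The task shape and dtype are hypothetical, not taken from the dataset.
import numpy as np

sx, sy, sz = 1024, 1024, 512            # hypothetical task region at mip 0
bytes_per_voxel = np.dtype("uint8").itemsize
num_mips = 3

base = sx * sy * sz * bytes_per_voxel   # source region held in memory
# each 2x2x2 mip is 1/8 the size of the level above it
total = base * sum((1 / 8) ** i for i in range(num_mips + 1))
print(f"~{total / 1e9:.2f} GB per task")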

However, you can make individual tasks larger by setting memory_target=int(300e9) (300 GB, for example).
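For instance, a minimal sketch reusing the parameters from the snippet above (memory_target comes from the suggestion here; the 300 GB value is only an illustration, so size it to your machine):

tasks = tc.create_downsampling_tasks(
    output_dir,
    mip=0,
    num_mips=3,
    factor=(2, 2, 2),
    fill_missing=True,
    delete_black_uploads=True,
    memory_target=int(300e9),  # ~300 GB per task; adjust to your machine
)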

chourroutm commented 1 month ago

Thanks, that solved the issue!