Hi,
I have a 3D dataset (4000, 3000, 100) that I'm rendering with fury. The size of the dataset is about 1.2 GB.
When I call the actor.slicer function as:
image_actor_z = actor.slicer(volume, affine)
the memory usage after the call increases by approximately 6 GB. Is this expected? I'm trying to use fury to eventually render datasets of 50-60 GB on a workstation with 128 GB of RAM, so would this be possible? It seems not if the memory usage scales linearly with dataset size.
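For context, here is a minimal sketch of how I'm making the call, with a synthetic stand-in volume (the uint8 dtype is my assumption; it matches the ~1.2 GB size of the real data, which I load separately):

import numpy as np
from fury import actor, window

# Synthetic stand-in: (4000, 3000, 100) uint8 is ~1.2 GB, like my data.
volume = np.zeros((4000, 3000, 100), dtype=np.uint8)
affine = np.eye(4)  # identity affine just for this sketch

# With my real dataset, memory grows by ~6 GB across this call.
image_actor_z = actor.slicer(volume, affine)

scene = window.Scene()
scene.add(image_actor_z)
window.show(scene)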
I tried debugging through the actor.slicer function and found that memory usage changed significantly at the following lines in actor.py; a sketch of how I measured the per-line deltas follows the list. Any help would be appreciated.
Line 89 (vol = data): increased by 1.2 GB
Line 91 (im = ImageData()): decreased by 1.2 GB
Line 110 (vol = np.ascontiguousarray(vol)): increased by 1.1 GB
Line 138 (image_resliced.AutoCropOn()): increased by 1.1 GB
Line 256 (plane_colors.Update()): increased by 3.5 GB
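For reference, the deltas above came from watching the process's resident set size while stepping through in a debugger, roughly like the sketch below (psutil-based and illustrative only; rss_gb is my own helper, and the ascontiguousarray example just shows how a full copy shows up as a delta, as at actor.py line 110):

import os

import numpy as np
import psutil

_proc = psutil.Process(os.getpid())

def rss_gb():
    # Resident set size of this process, in GB.
    return _proc.memory_info().rss / 1e9

# A non-contiguous array forces np.ascontiguousarray to copy;
# np.ones commits the source pages so only the copy shows up below.
vol = np.ones((4000, 3000, 100), dtype=np.uint8).transpose(1, 0, 2)
before = rss_gb()
vol = np.ascontiguousarray(vol)  # full copy allocates another ~1.2 GB
print("delta: %+.2f GB" % (rss_gb() - before))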
Thanks, Chaitanya