OpenChemistry / tomviz

Cross-platform, open-source application for the processing, visualization, and analysis of 3D tomography data
https://tomviz.org/
BSD 3-Clause "New" or "Revised" License

Downsample x2 not efficient. Memory fills up for 1GB datasets #405

Closed: Hovden closed this issue 8 years ago

Hovden commented 8 years ago

We need a more efficient Python approach to downsampling by 2.

Solution 1: Change the Python code from the default order-3 spline to order-1 (linear) interpolation and turn the prefilter off:

```python
import scipy.ndimage

# array is the loaded tomography volume (a 3D NumPy array).
result = scipy.ndimage.interpolation.zoom(
    array, (0.5, 0.5, 0.5), output=None, order=1,
    mode='constant', cval=0.0, prefilter=False)
```
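The prefilter flag is likely where most of the memory goes: for spline orders above 1, scipy first computes spline coefficients over the whole volume, and that pass produces a float64 copy by default, so a 2 GB float32 volume gains a 4 GB double-precision intermediate before any resampling starts. A minimal sketch of that behavior (the shape here is illustrative):

```python
import numpy as np
import scipy.ndimage

vol = np.zeros((64, 64, 64), dtype=np.float32)

# zoom() runs this internally for order > 1 when prefilter=True; the
# default output dtype is float64, a double-precision copy of the volume.
coeffs = scipy.ndimage.spline_filter(vol)
print(coeffs.dtype)  # float64
```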

Solution 2: FFT the volume and discard the high-frequency half of the spectrum along each axis.
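A minimal sketch of Solution 2 using NumPy's FFT, assuming a 3D array with even dimensions along each axis; fourier_downsample_x2 is a hypothetical helper name, not existing tomviz API:

```python
import numpy as np

def fourier_downsample_x2(volume):
    """Downsample a 3D volume by 2 per axis by cropping the spectrum
    to its central (low-frequency) half along each dimension."""
    spectrum = np.fft.fftshift(np.fft.fftn(volume))
    nz, ny, nx = volume.shape
    # Keep the central half of the (shifted) spectrum in every axis.
    cropped = spectrum[nz // 4 : 3 * nz // 4,
                       ny // 4 : 3 * ny // 4,
                       nx // 4 : 3 * nx // 4]
    # Inverse transform; divide by 8 (= 2^3) so intensities keep the
    # original scale after the grid shrinks by 2 in each dimension.
    return np.real(np.fft.ifftn(np.fft.ifftshift(cropped))) / 8.0
```

Spectral truncation like this acts as an ideal low-pass filter at the new Nyquist limit, so it preserves frequency content well but can introduce ringing near sharp edges.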

Problem: When I load a large 1 GB dataset (e.g. "LARGE_PtCu_NanoParticles.tif", also known as the tomo_5 recon), it takes up 2 GB of system memory (probably 16-bit to 32-bit casting, which is OK). However, when I run a Python transform to downsample by a factor of 2, tomviz uses 10 GB of memory.

If a volume render is also in place, it takes up an additional 5 GB, and I run out of memory.

I think it is OK to have large memory requirements (>32 GB) when users want to work with large datasets, but it is worrisome when downsampling requires 5x the memory of the original dataset.

Hovden commented 8 years ago

Changed the spline interpolation to order 1. It is faster and uses less memory; 1024^3 datasets no longer crash a Mac laptop when downsampling.
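For reference, a quick standalone check of the order-1 path; modern SciPy exposes the same function directly as scipy.ndimage.zoom, and the volume here is shrunk so it runs anywhere:

```python
import numpy as np
import scipy.ndimage

# Stand-in volume; swap in (1024, 1024, 1024) to reproduce the real test.
array = np.random.rand(128, 128, 128).astype(np.float32)

# Order-1 (linear) resampling with the prefilter disabled: no float64
# coefficient pass, so peak memory stays near input-plus-output size.
small = scipy.ndimage.zoom(array, (0.5, 0.5, 0.5), order=1,
                           mode='constant', cval=0.0, prefilter=False)
print(small.shape, small.dtype)  # (64, 64, 64) float32
```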