TuragaLab / DECODE

This is the official implementation of our publication "Deep learning enables fast and dense single-molecule localization with high accuracy" (Nature Methods)
GNU General Public License v3.0

Loading the TIFF files #150

Closed. AnshulToshniwal closed this issue 2 years ago.

AnshulToshniwal commented 2 years ago

Hello, before the model can generate the emitter set, the TIFF image stacks must be loaded into the Python environment. The TIFF image stacks I am dealing with total about 50 GB, so they cannot all be loaded at once before localisation. Is there any way to generate the emitter set without loading the full TIFF stack into the Python environment?

ASpeiser commented 2 years ago

Here is a simple loop for inferring the emitter set from multiple files:

em_list = []

for p in frame_paths:
    # load one TIFF stack into memory
    frames = decode.utils.frames_io.load_tif(p)
    # run inference on this stack and collect the resulting EmitterSet
    em_list.append(infer.forward(frames))

# concatenate the per-file emitter sets; step_frame_ix offsets the frame
# indices of each subsequent set so they refer to the global frame order
emitter = decode.EmitterSet.cat(em_list, step_frame_ix=100)

step_frame_ix is the number of frames in each file.
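
If the files do not all contain the same number of frames, a single step_frame_ix cannot be used. A minimal sketch that tracks a running offset instead (assuming frame_paths and infer are set up as above, and that the frame_ix attribute of an EmitterSet can be shifted in place, which is an assumption rather than documented behaviour):

em_list = []
offset = 0  # running global frame index offset across files

for p in frame_paths:
    frames = decode.utils.frames_io.load_tif(p)
    em = infer.forward(frames)
    em.frame_ix += offset        # assumption: frame_ix is a mutable tensor of per-emitter frame indices
    offset += frames.size(0)     # advance by the number of frames in this file
    em_list.append(em)

# frame indices are already globally consistent, so no step_frame_ix is needed here
emitter = decode.EmitterSet.cat(em_list)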

AnshulToshniwal commented 2 years ago

Yes, I already have a loop for this, but the bottleneck is loading the TIFF files, which takes most of the time. Is there any way to make predictions on a TIFF file without loading it fully into the Python environment?

Haydnspass commented 2 years ago

Dear @AnshulToshniwal, I am guessing you are not talking about multiple TIFF files but rather a single, large multi-page TIFF file? We have prepared something for this, which is, however, not yet thoroughly tested.

If you want to, try out using the TIFF tensor:

frames = decode.utils.frames_io.TIFFTensor("path_to_your_large_tiff")  # lazily reads pages on access

# ...

This will only put the portion of the TIFF into memory that is actually needed. I haven't tried it for a while, but in principle it should work. Make sure to have the TIFF on a fast volume though (a local SSD). Let me know ;)
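
For reference, a minimal sketch of chunked inference over such a lazily loaded tensor. It assumes the object supports len() and slicing that materialises only the requested pages, and that shifting frame_ix in place is valid; none of this is confirmed in the thread, so treat it as a starting point rather than tested usage:

frames = decode.utils.frames_io.TIFFTensor("path_to_your_large_tiff")

chunk = 1000  # number of frames to hold in memory at a time
em_list = []

for start in range(0, len(frames), chunk):
    block = frames[start:start + chunk]   # assumption: slicing loads only these pages
    em = infer.forward(block)
    em.frame_ix += start                  # assumption: shift to global frame indices
    em_list.append(em)

emitter = decode.EmitterSet.cat(em_list)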

AnshulToshniwal commented 2 years ago

Thanks for your replies. I have multiple TIFF files in the directory, each containing 4247 frames and about 4.3 GB in size. I will make sure to load the TIFFs from an SSD.