jychoi0616 opened 2 months ago
Hi Joy,
It is true that changing the number of CPUs does not help. Deconv requires a large amount of memory to read the tomogram and perform the Fourier transform.
I recommend using binned tomograms for deconvolution. Alternatively, deconv has a chunk-size parameter that might help, though it was last benchmarked about four years ago, so it may or may not work; I will try to look into it. It breaks the tomogram into smaller pieces, processes each, and stitches them back together.
Best,
Yuntao
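For intuition, here is a minimal sketch of the chunk-and-stitch idea described above — split the volume into overlapping z-slabs, filter each, and crop the overlap back off before reassembly. This is not IsoNet's actual implementation; `process_in_chunks` and its parameters are hypothetical placeholders.

```python
import numpy as np

def process_in_chunks(vol, chunk, overlap, fn):
    """Apply fn to overlapping z-slabs of vol and stitch the results.

    Each slab is padded by `overlap` slices on both sides (where possible)
    so edge artifacts from fn fall in the margin, which is cropped off.
    """
    out = np.empty_like(vol)
    nz = vol.shape[0]
    for z0 in range(0, nz, chunk):
        z1 = min(z0 + chunk, nz)          # slab to fill in the output
        lo = max(z0 - overlap, 0)         # padded read window
        hi = min(z1 + overlap, nz)
        piece = fn(vol[lo:hi])            # filter the padded slab
        out[z0:z1] = piece[z0 - lo : (z0 - lo) + (z1 - z0)]
    return out

# Demo with an identity "filter": the stitched output equals the input.
vol = np.random.rand(64, 32, 32).astype(np.float32)
res = process_in_chunks(vol, chunk=20, overlap=4, fn=lambda v: v)
assert np.array_equal(res, vol)
```

Peak memory is then set by the padded slab size rather than the full tomogram, at the cost of possible seam artifacts if the overlap is smaller than the filter's support.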
Dear IsoNet team,
Is there a file-size limit for IsoNet's deconv function?
When I ran IsoNet on a tomogram with a pixel size of 2.48 and dimensions of 4096×4096×1200, the run gets "killed" during the deconv step. When I tested the same tomogram cropped to 401×401×201, it worked well. The same tomogram at higher binning (pixel size of 4.96) with dimensions of 2048×2048×601 also worked well.
I also tried various numbers of CPUs (i.e. ncpu = 3, 10, 15, 20, 30, 50, 60), but none of them helped for the full tomogram at a pixel size of 2.48. Do you have a suggestion on this, please?
Thank you so much,
Joy
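The "killed" behavior is consistent with the process running out of memory: back-of-the-envelope arithmetic (assuming a float32 tomogram and a same-size complex64 FFT buffer — an assumption, since the actual dtypes and intermediates depend on the implementation) shows the full volume is roughly eight times larger than the bin-2 one.

```python
import numpy as np

# Rough memory footprint of the full-size vs. bin-2 tomograms from the
# report above: float32 input plus a complex64 FFT buffer of the same size.
for shape in [(4096, 4096, 1200), (2048, 2048, 601)]:
    voxels = np.prod(shape, dtype=np.int64)
    real_gb = voxels * 4 / 1e9   # float32 volume
    fft_gb = voxels * 8 / 1e9    # complex64 transform of the same size
    print(shape, f"volume ~{real_gb:.0f} GB, FFT buffer ~{fft_gb:.0f} GB")
```

The full 4096×4096×1200 volume alone is ~81 GB as float32 before any FFT workspace, which easily exceeds typical workstation RAM, whereas the 2048×2048×601 version is ~10 GB.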