YangLabHKUST / SpatialScope

A unified approach for integrating spatial and single-cell transcriptomics data by leveraging deep generative models
https://spatialscope-tutorial.readthedocs.io/en/latest/
GNU General Public License v3.0

Memory requirement for Nuclei Segmentation step #2

Open PietroAndrei opened 6 months ago

PietroAndrei commented 6 months ago

Hi,

I'm trying to run the Nuclei Segmentation step on a Visium image with shape (9847, 10090), but I receive the following error message:


terminate called after throwing an instance of 'std::bad_alloc'

I am currently using 60GB to run the script. In your experience, what is the ideal amount of memory for this step? I believe the image I am using is roughly the same size as the one recommended in the documentation.

Thank you for your help!

Pietro

JiaShun-Xiao commented 6 months ago


Hi, in our experience, the ideal amount of memory for Nuclei segmentation on a Visium image is at least 100GB, because the original high-resolution histological image is required as input. For reference, in our tutorial on the human heart Visium data, Nuclei segmentation used 83GB of memory.

Jiashun
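
For a rough sense of scale, here is a minimal back-of-the-envelope sketch (editor's addition). The (9847, 10090) shape comes from the question above; the per-copy overhead is an illustrative assumption about segmentation pipelines in general, not a measurement of SpatialScope's internals:

```python
# Rough memory estimate for nuclei segmentation on a high-resolution
# histology image. Shape taken from the question above; the number of
# intermediate copies is an illustrative assumption, not a measurement
# of SpatialScope's pipeline.
h, w, channels = 9847, 10090, 3

raw_gb = h * w * channels / 1e9  # uint8 image as loaded
float64_gb = raw_gb * 8          # one float64 working copy

# Segmentation pipelines typically hold several full-resolution arrays
# at once (normalized image, per-pixel probability maps, label masks),
# so peak usage can be an order of magnitude or more above the raw image.
print(f"raw uint8 image:  {raw_gb:.2f} GB")
print(f"one float64 copy: {float64_gb:.2f} GB")
```

If memory is still tight, downsampling the image or tiling the segmentation are common workarounds, though either may require adjusting downstream coordinates.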