Open newbieMars opened 2 weeks ago
While the function simulation_training_ZINB
is running, different kinds of temporary files are generated:
Among these files, the CSV one, after 24 hours of computation, has reached a size of 231 GB and continues to grow. Could this be the cause of the error message? Is the size of my spatial count object the problem? Should I down-sample it?
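A quick way to see which files under the temp area are actually growing (a generic shell sketch; `/tmp` here is only a default, substitute whatever path `tempdir()` reports in your R session):

```shell
# List the ten largest files/directories under the target directory.
# TARGET defaults to /tmp; set it to the path tempdir() prints in R.
TARGET="${TARGET:-/tmp}"
du -ah "$TARGET" 2>/dev/null | sort -rh | head -n 10
```

Re-running this every few minutes shows whether the CSV is the only file ballooning or whether other Stan temporaries grow alongside it.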
Hi,
I apologize for the delay in getting back to you; I have been traveling for the past few weeks.
The size of your data is not huge. What seems fishy is the large CSV file. I don't remember seeing large CSV files when running simulation_training_ZINB.
There are a few things we can check:
Hello,
Thank you again for your excellent package! I have encountered an issue when running the simulation_training_ZINB function, and I would like to bring it to your attention.
Context: I am executing the function with the following input dimensions:
sc_count: 17409 x 2175
sc_cluster: 2175 x 2
spatial_count: 377 x 48521
spatial_cluster: 48521 x 2
overlap_gene: 283
unique_cluster_label: 7
The function is called as follows:
Output and Error: The function begins running, and the following messages are displayed during execution:
However, shortly after this, the following error occurs:
Issue: It seems that the function runs out of memory when trying to allocate 2048 Mb. I suspect this might be related to the large dataset size or the number of iterations specified. I would appreciate any guidance on whether this is an expected limitation, or whether there are steps I could take to resolve it (e.g., adjusting parameters or environment settings). I have already tried launching my script with unlimited memory on an HPC (using a Singularity container), without success:
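For what it's worth, limits set on the host do not always propagate into the container; a quick sanity check from inside the Singularity shell (cgroup mount paths vary by host, so both the v1 and v2 locations are tried):

```shell
# What memory limits does the process actually see inside the container?
ulimit -v                                                    # per-process virtual memory (kB)
cat /sys/fs/cgroup/memory/memory.limit_in_bytes 2>/dev/null  # cgroup v1, if present
cat /sys/fs/cgroup/memory.max 2>/dev/null                    # cgroup v2, if present
true
```

If `ulimit -v` reports "unlimited" but a cgroup file still shows a cap, the scheduler or container runtime is enforcing the limit regardless of your ulimit settings.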
Could it be linked to temporary file writing by the rstan package? By the way, I am able to execute the tutorial without any error on my laptop (in a Docker container); the problem only occurs with my data, with more iterations and cores.
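If the temporary files turn out to be the culprit, one workaround is to point R's temp directory at roomier storage before launching the job. A sketch, where `/scratch/$USER` is a placeholder for whatever large filesystem your HPC provides and `my_script.R` is a hypothetical script name:

```shell
# Point R/Stan temporaries at a roomier filesystem before launching R.
SCRATCH="${SCRATCH:-/scratch/$USER}"   # placeholder: use your HPC's big scratch area
export TMPDIR="$SCRATCH/r_tmp"
mkdir -p "$TMPDIR" 2>/dev/null
echo "TMPDIR=$TMPDIR"
# Rscript my_script.R                  # hypothetical launch command
```

R picks up `TMPDIR` at startup, so `tempdir()` (and anything Stan writes beneath it) lands on the scratch filesystem instead of the node-local `/tmp`.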
Environment:
Matrix products: default
Attached packages:
Thank you very much for your help, and please let me know if you need any additional information!