Sorry for a silly question here. So far I have only worked on a normal workstation with up to 8k cells.
Soon we are planning big experiments with 15-20 samples of ~10k cells each (150k-200k cells total, ~18,000 features).
Is 128 GB of RAM sufficient for the pipeline (SCT and integration included)? Seeing in this thread that @davemcg used 300 GB of RAM and still could not process 400k cells, I am not sure what to buy.
We cannot advise on hardware, but if you read the rest of that thread, we provide an alternative approach for running extremely large datasets even in the absence of a high-memory machine.
Originally posted by @davemcg in https://github.com/satijalab/seurat/issues/1720#issuecomment-516948860
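For context, the alternative approach is not restated in this issue, but for large datasets Seurat's documentation describes reference-based integration, where anchors are found only against a small set of reference samples instead of all pairwise comparisons, which substantially reduces memory use. Below is a minimal sketch of that workflow with SCT normalization, assuming a merged object `seurat_obj` with a `sample` metadata column; the object name, metadata column, and choice of reference samples are hypothetical, so treat this as an illustration rather than the exact approach recommended in the linked thread.

```r
library(Seurat)

# Split the merged object into per-sample objects and normalize each with SCT
obj.list <- SplitObject(seurat_obj, split.by = "sample")
obj.list <- lapply(obj.list, SCTransform, verbose = FALSE)

# Select shared variable features and prepare the objects for SCT integration
features <- SelectIntegrationFeatures(object.list = obj.list, nfeatures = 3000)
obj.list <- PrepSCTIntegration(object.list = obj.list, anchor.features = features)

# Reference-based anchoring: anchors are computed only between each sample and
# the chosen references (here the first two samples, a hypothetical choice),
# rather than between every pair of samples, cutting memory and runtime
anchors <- FindIntegrationAnchors(
  object.list = obj.list,
  normalization.method = "SCT",
  anchor.features = features,
  reference = c(1, 2)
)

# Integrate all samples using the reference-anchored anchor set
integrated <- IntegrateData(anchorset = anchors, normalization.method = "SCT")
```

With 15-20 samples, all-pairwise anchoring scales quadratically in the number of dataset pairs, so fixing one or two references changes the cost from roughly 190 pairwise comparisons to 20-40, which is the main source of the memory savings.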