satijalab / seurat

R toolkit for single cell genomics
http://www.satijalab.org/seurat

R Session Aborts When Running IntegrateData() #5330

Closed ts687 closed 2 years ago

ts687 commented 2 years ago

Hi,

I'm currently learning scRNA-seq analysis with Seurat and have run into an issue when trying to integrate two 10X runs. For background, each run consists of 8 samples, with 6 experimental conditions across the 16 samples. Each run is contained in a filtered Seurat object, and the two objects have been merged into one. When I plot a UMAP of the merged object and group by run, the clusters do not overlap well, indicating a batch effect that needs to be removed by integration. I therefore split the merged object by sample, normalised with SCTransform, and followed the integration steps in the tutorial you provide. All steps run fine until the final IntegrateData() call, which causes the R session to abort. I have updated RStudio and R, but this hasn't made a difference. Do you have any suggestions on how to overcome this issue? The code I used is below:

split_seurat <- SplitObject(seurat_merged, split.by = "orig.ident")

for (i in 1:length(split_seurat)) {
  # Log-normalise first so that cell cycle scores can be computed
  split_seurat[[i]] <- NormalizeData(split_seurat[[i]], verbose = TRUE)
  split_seurat[[i]] <- CellCycleScoring(split_seurat[[i]],
                                        g2m.features = cc.genes.updated.2019$g2m.genes,
                                        s.features = cc.genes.updated.2019$s.genes)
  # SCTransform, regressing out mitochondrial percentage
  split_seurat[[i]] <- SCTransform(split_seurat[[i]],
                                   vars.to.regress = "percent.mito",
                                   variable.features.n = 5000)
}

integ_features <- SelectIntegrationFeatures(object.list = split_seurat, nfeatures = 5000)

split_seurat_prep <- PrepSCTIntegration(object.list = split_seurat, anchor.features = integ_features)

integ_anchors <- FindIntegrationAnchors(object.list = split_seurat_prep, normalization.method = "SCT", anchor.features = integ_features)

saveRDS(integ_anchors, "saved_variables/integration_anchors.Rdata")

seurat_integrated <- IntegrateData(anchorset = integ_anchors)

Thanks very much, any advice is much appreciated!

timoast commented 2 years ago

How much RAM is available and how big are the objects you're trying to integrate? Some tips for low-memory integration here: https://satijalab.org/seurat/articles/integration_large_datasets.html. You can also try IntegrateEmbeddings() rather than IntegrateData() to conserve memory.
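For anyone following along, here is a minimal sketch of the lower-memory, reference-based reciprocal-PCA workflow described in that vignette, adapted to the object names used earlier in this issue. The reference sample indices and dims = 1:30 are placeholders for illustration, not recommendations for this dataset.

# PCA on each SCT-normalised object is required before reciprocal-PCA anchoring
split_seurat_prep <- lapply(split_seurat_prep, RunPCA,
                            features = integ_features, verbose = FALSE)

integ_anchors <- FindIntegrationAnchors(object.list = split_seurat_prep,
                                        normalization.method = "SCT",
                                        anchor.features = integ_features,
                                        reference = c(1, 2),  # placeholder: indices of the objects used as references
                                        reduction = "rpca",   # reciprocal PCA instead of the default CCA
                                        dims = 1:30)

seurat_integrated <- IntegrateData(anchorset = integ_anchors,
                                   normalization.method = "SCT",
                                   dims = 1:30)

Anchoring against one or two reference objects and using reciprocal PCA avoids computing all pairwise CCAs, which is where most of the memory goes with many samples.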

no-response[bot] commented 2 years ago

This issue has been automatically closed because there has been no response to our request for more information from the original author. With only the information that is currently in the issue, we don't have enough information to take action. Please reach out if you have or find the answers we need so that we can investigate further.