Closed xingzhis closed 1 year ago
Same issue on my end.
sessionInfo()
R version 4.2.0 (2022-04-22 ucrt)
Platform: x86_64-w64-mingw32/x64 (64-bit)
Running under: Windows 10 x64 (build 22000)

Matrix products: default

locale:
[1] LC_COLLATE=English_United States.utf8  LC_CTYPE=English_United States.utf8    LC_MONETARY=English_United States.utf8
[4] LC_NUMERIC=C                           LC_TIME=English_United States.utf8

attached base packages:
[1] stats     graphics  grDevices utils     datasets  methods   base

loaded via a namespace (and not attached):
[1] compiler_4.2.0  fastmap_1.1.0   cli_3.3.0       htmltools_0.5.2 tools_4.2.0     rstudioapi_0.13 yaml_2.3.5      rmarkdown_2.14
[9] knitr_1.39      xfun_0.31       digest_0.6.29   rlang_1.0.4     evaluate_0.15
This can happen when running CCA on machines with low to medium amounts of memory. To analyze these datasets going forward, we recommend the Seurat integration workflow (if you continue to run into memory issues, you can use the sketch-based integration vignettes).
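For readers landing here, the recommended integration workflow looks roughly like the sketch below. This is a hedged outline, not the exact vignette code: `obj.list` is a placeholder for your list of Seurat objects, and the parameter values are the common defaults. The `reduction = "rpca"` option shown in the comment is the lower-memory alternative to CCA that Seurat's integration documentation describes.

```r
library(Seurat)

# obj.list: a placeholder list of per-dataset Seurat objects (not from this thread)
obj.list <- lapply(obj.list, function(x) {
  x <- NormalizeData(x)
  FindVariableFeatures(x, selection.method = "vst", nfeatures = 2000)
})

# Pick shared features, find anchors, and integrate.
features <- SelectIntegrationFeatures(object.list = obj.list)
anchors <- FindIntegrationAnchors(object.list = obj.list,
                                  anchor.features = features)
# If CCA still exhausts memory, reciprocal PCA is lighter:
# anchors <- FindIntegrationAnchors(object.list = obj.list,
#                                   anchor.features = features,
#                                   reduction = "rpca")
combined <- IntegrateData(anchorset = anchors)
```

RPCA avoids the dense cross-dataset CCA decomposition, which is typically where low-memory machines segfault or get killed.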
Hi,
I was running the "Integrating scRNA-seq and scATAC-seq data" demo with Seurat v3, and R quits with a segmentation fault.
This is the line of code where the error occurs:
https://github.com/satijalab/seurat/blob/3bee84a8d710b7ee0022b929dab1df2f6ba26fdb/vignettes/atacseq_integration_vignette.Rmd#L128-L136
and this is the error message:
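(For context, the linked vignette lines are the CCA-based transfer-anchor step; a rough paraphrase, with object names as used in that vignette:)

```r
# Paraphrase of the failing vignette step, not verbatim:
# find anchors between the scRNA-seq reference and the scATAC-seq query
# using CCA, which is the memory-heavy part.
transfer.anchors <- FindTransferAnchors(
  reference = pbmc.rna,
  query = pbmc.atac,
  features = VariableFeatures(object = pbmc.rna),
  reference.assay = "RNA",
  query.assay = "ACTIVITY",
  reduction = "cca"
)
```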
My session info:
The demo works with my own data, though.
Thank you!