welch-lab / liger

R package for integrating and analyzing multiple single-cell datasets
GNU General Public License v3.0

Error in subsetLiger and/or seuratToLiger #176

Closed: samuel-marsh closed this issue 4 years ago

samuel-marsh commented 4 years ago

Hi All,

So I just ran into another error that I've never seen before. I've used subsetLiger many times without any issues, but recently, when trying to subset a particular cluster from a larger Liger object, I ran into the error messages below. I believe it might be an issue with the underlying data for this particular cluster, but I don't really understand where the error is occurring. For reference, this is a Liger object processed with the online branch but stored in memory (~80K nuclei).

When I try to subset the particular cluster of interest, I get this error:

obj_liger <- subsetLiger(object = obj_liger, clusters.use = 19)
Error in base::colSums(x, na.rm = na.rm, dims = dims, ...) : 
  'x' must be an array of at least two dimensions
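
For context, this colSums() error in base R usually means the matrix being summed has been dropped to a plain vector, e.g. when a subset ends up with a single column and drop = TRUE (the default) strips the dim attribute. A minimal sketch of the symptom, not specific to liger:

m <- matrix(1:6, nrow = 3)
colSums(m)                  # works: m has two dimensions
v <- m[, 1]                 # drop = TRUE by default, so v is a plain vector
colSums(v)                  # Error: 'x' must be an array of at least two dimensions
m1 <- m[, 1, drop = FALSE]  # keep the dim attribute
colSums(m1)                 # works: m1 is still a 3 x 1 matrix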

The same error also occurs if I try to subset using cell barcodes instead of cluster identity.

I wondered whether this was some weird error from processing the data through Liger, so I tried importing the data into Seurat V3 and subsetting based on barcodes there. This works and produces a Seurat object, but when I then try to convert back to Liger I get this error:

obj_liger <- seuratToLiger(obj, combined.seurat = T, meta.var = "orig.ident")
Error in base::colSums(x, na.rm = na.rm, dims = dims, ...) : 
  'x' must be an array of at least two dimensions
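
This looks like the same dimension-dropping symptom as above. One quick check worth running, assuming obj is the subset Seurat object: see how many cells each dataset contributes, since a dataset reduced to a single cell (or none) after subsetting is a likely culprit.

table(obj$orig.ident)   # cells per dataset remaining in the subset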

If I specify remove.missing = FALSE, I get a different error:

obj_liger <- seuratToLiger(obj, combined.seurat = T, meta.var = "orig.ident", remove.missing = FALSE)
Error in `.rowNamesDF<-`(x, value = value) : invalid 'row.names' length
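
The row.names error suggests a length mismatch between the cell metadata and the data matrices. One hedged workaround sketch, assuming the raw counts can be recovered from the Seurat object: split the counts matrix by orig.ident and call createLiger() directly, bypassing seuratToLiger(). The object names here mirror the commands above; adjust for your data.

library(Seurat)
library(liger)

counts <- GetAssayData(obj, slot = "counts")      # raw counts, genes x cells
cells_by_id <- split(colnames(obj), as.character(obj$orig.ident))
raw_list <- lapply(cells_by_id, function(cells) counts[, cells, drop = FALSE])
obj_liger <- createLiger(raw_list)                # build the liger object directly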

I can confirm that every nucleus in this subset has >200 UMIs and >100 detected genes.
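
For reference, that check can be reproduced with something like the following, assuming counts is the raw counts matrix (genes x cells) for the subset:

summary(Matrix::colSums(counts))        # total UMIs per nucleus; all > 200
summary(Matrix::colSums(counts > 0))    # detected genes per nucleus; all > 100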

Any thoughts or help would be greatly appreciated!

Best, Sam

samuel-marsh commented 4 years ago

This error is fixed with the new code in #180. Closing this issue.