Hi @bnprks, thanks in advance for continuing to maintain such a useful package!
I've recently run into an edge case when indexing an IterableMatrix with a zero-length feature vector. I do this, for example, when computing the percentage of cells that express mitochondrial features. With a standard sparse matrix this returns a matrix with no rows, and although the printed IterableMatrix output suggests the same thing is happening, running downstream functions on the object shows that it is not.
For example, I create a test IterableMatrix with 20 rows and 20 columns (cells):
> test
20 x 20 IterableMatrix object with class MatrixSubset
Row names: MIR1302-2HG, FAM138A ... RNF223
Col names: cell1, cell2 ... cell20
Data type: float
Storage order: column major
Queued Operations:
1. Load compressed matrix from directory ~/test
2. Select rows: 1, 2 ... 37 and cols: 1, 2 ... 20
Subsetting the matrix with 0 features appears to correctly give a 0 x 20 matrix. However, colSums() of the result is greater than 0, and converting the subsetted matrix to a sparse matrix produces a strange result.
> features <- grep("^MT-", rownames(test), value = T)
> features
character(0)
> test[features,]
0 x 20 IterableMatrix object with class MatrixSubset
Row names: unknown names
Col names: cell1, cell2 ... cell20
Data type: float
Storage order: column major
Queued Operations:
1. Load compressed matrix from directory ~/test
2. Select rows: all and cols: 1, 2 ... 20
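For comparison, here is the behavior I would expect, sketched with a plain sparse matrix from the Matrix package (the toy matrix and its row names below are made up for illustration, not the data from my test object):

```r
library(Matrix)

# Toy 20 x 20 sparse matrix standing in for the real data
m <- rsparsematrix(20, 20, density = 0.3)
rownames(m) <- paste0("gene", 1:20)
colnames(m) <- paste0("cell", 1:20)

features <- grep("^MT-", rownames(m), value = TRUE)  # character(0)
sub <- m[features, , drop = FALSE]

dim(sub)      # 0 20
colSums(sub)  # named vector of 20 zeros: summing over zero rows gives 0
```

Subsetting with a zero-length index gives a genuine 0 x 20 matrix here, and colSums() is all zeros, which is what I expected from the IterableMatrix as well.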
Indexing with 1 or more features works fine; the problem only occurs with 0. Thanks for any advice or solutions you may have!
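In case it helps anyone else hitting this, the workaround I'm using for now is to short-circuit before the zero-length index ever reaches the IterableMatrix subset. The helper name `pct_feature_counts()` and its exact logic are my own sketch, not part of the BPCells API:

```r
# Hypothetical workaround: return an all-zero vector when no features match,
# instead of subsetting with a zero-length index.
pct_feature_counts <- function(mat, pattern) {
  features <- grep(pattern, rownames(mat), value = TRUE)
  if (length(features) == 0) {
    # No matching features: every cell gets 0 percent
    return(setNames(rep(0, ncol(mat)), colnames(mat)))
  }
  colSums(mat[features, , drop = FALSE]) / colSums(mat) * 100
}
```

With a plain sparse matrix both branches agree; with an IterableMatrix the early return simply avoids the buggy path.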
Best, Gesi