LieberInstitute / spatialLIBD

Code for the spatialLIBD R/Bioconductor package and shiny app
http://LieberInstitute.github.io/spatialLIBD/

Add support for non-Visium data and/or SpatialFeatureExperiment #61

Open lcolladotor opened 9 months ago

lcolladotor commented 9 months ago

Right now spatialLIBD is built upon the SpatialExperiment container class. There is also https://bioconductor.org/packages/SpatialFeatureExperiment/ that can support non-Visium spatially-resolved data.

As new technologies come online, we will see which container class(es) adapt best and become widely adopted for these data types. We will likely need to refactor and/or update spatialLIBD to support these other classes, which might involve rewriting several functions to use S4 generics and methods.
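One hedged sketch of what such a refactor could look like: a hypothetical generic (`vis_prepare` is an invented name, not part of the spatialLIBD API) that dispatches on the container class, so Visium-specific logic and SpatialFeatureExperiment-specific logic live in separate methods. This assumes both the SpatialExperiment and SpatialFeatureExperiment packages are installed so their classes are defined.

```r
## Hypothetical S4 dispatch sketch -- NOT current spatialLIBD code.
## Assumes SpatialExperiment and SpatialFeatureExperiment are installed.
suppressPackageStartupMessages({
    library(SpatialExperiment)
    library(SpatialFeatureExperiment)
})

## An invented generic standing in for a refactored spatialLIBD function.
setGeneric("vis_prepare", function(spe, ...) standardGeneric("vis_prepare"))

## Method for the current Visium-oriented container.
setMethod("vis_prepare", "SpatialExperiment", function(spe, ...) {
    ## existing SpatialExperiment-based code path would go here
    spe
})

## Method for SpatialFeatureExperiment, which additionally carries
## sf geometries (e.g., cell or nucleus segmentations).
setMethod("vis_prepare", "SpatialFeatureExperiment", function(spe, ...) {
    ## geometry-aware handling would go here before falling back to
    ## the shared SpatialExperiment logic
    callNextMethod()
})
```

Since SpatialFeatureExperiment extends SpatialExperiment, `callNextMethod()` lets the specialized method reuse the shared code path after handling the geometry columns.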

Once these internal updates are complete, we should make a new guide (vignette) describing how to use spatialLIBD for each of the new data types that we add support for.

lcolladotor commented 9 months ago

Some examples of non-Visium data include Xenium, a new platform by 10x Genomics that is currently being rolled out.

cathalgking commented 4 months ago

FYI, SpatialExperimentIO::readXeniumSXE() will construct a SpatialExperiment (SPE) object from 10x Xenium data. However, I am not sure whether that would work with this package.

If I construct my SPE manually (with a count matrix, image, and spatial coordinates), can I use spatialLIBD::run_app()? If not, what is the bare minimum I need to add to the SPE in order for it to run? https://www.bioconductor.org/packages/release/bioc/vignettes/SpatialExperiment/inst/doc/SpatialExperiment.html#21_Manually
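For reference, a minimal sketch of building an SPE by hand, following the SpatialExperiment vignette linked above, with simulated counts and coordinates. Whether this bare object is enough for spatialLIBD::run_app() is untested here; spatialLIBD may expect additional colData columns or assays (that is an assumption, not a documented requirement).

```r
## Minimal manual SpatialExperiment construction (simulated data).
## Assumes the SpatialExperiment Bioconductor package is installed.
suppressPackageStartupMessages(library(SpatialExperiment))

set.seed(42)
n_genes <- 100
n_spots <- 50

## Simulated count matrix with gene and spot names.
counts <- matrix(
    rpois(n_genes * n_spots, lambda = 5),
    nrow = n_genes, ncol = n_spots,
    dimnames = list(paste0("gene", seq_len(n_genes)),
                    paste0("spot", seq_len(n_spots)))
)

## Spatial coordinates: a numeric matrix, one row per spot.
coords <- cbind(
    x = runif(n_spots, 0, 1000),
    y = runif(n_spots, 0, 1000)
)
rownames(coords) <- colnames(counts)

spe <- SpatialExperiment(
    assays = list(counts = counts),
    colData = DataFrame(sample_id = rep("sample01", n_spots)),
    spatialCoords = coords
)
```

One could then try `spatialLIBD::run_app(spe)` on such an object; if it errors, the error message should point at whichever colData columns or assays the app still expects.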

@lcolladotor

lcolladotor commented 4 months ago

Hi @cathalgking,

We haven't yet worked on adding support for Xenium data to spatialLIBD. This issue is part of the future plans we proposed in a grant application, which was unfortunately unsuccessful.

We will likely circle back to adding Xenium support once my team starts analyzing Xenium data, but that's currently not the case. So you might figure things out before we do.

Best, Leo