edward130603 / BayesSpace

Bayesian model for clustering and enhancing the resolution of spatial gene expression experiments.
http://edward130603.github.io/BayesSpace

Clustering on univariate data #118

Open boyiguo1 opened 9 months ago

boyiguo1 commented 9 months ago

Thanks for the awesome algorithm and package.

I wonder whether the algorithm could be used to cluster on a single variable/feature.

For example, if I use d=1 in the spatialCluster function, would the model still work in theory?

library(BayesSpace)
melanoma <- getRDS(dataset="2018_thrane_melanoma", sample="ST_mel1_rep2")

set.seed(102)
melanoma <- spatialPreprocess(melanoma, platform="ST", 
                              n.PCs=7, n.HVGs=2000, log.normalize=FALSE)
melanoma <- spatialCluster(melanoma, q=2, platform="ST", d=1)

So far, I'm running into an error when trying the code above.

Neighbors were identified for 293 out of 293 spots.
Error in `[<-`(`*tmp*`, g, modelName, value = bic(modelName = modelName,  :
  subscript out of bounds

I'm not sure whether this is an implementation problem or a corner case that the model is not meant to handle. I'm currently using version 1.12.0.
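
Since the traceback mentions bic() and modelName, the failure looks like it happens during the mclust-based initialization rather than in the spatial model itself. One way to check whether the single-feature case is fundamentally problematic might be to run mclust directly on the first PC. This is only a sketch continuing from the code above, and it assumes the PCs computed by spatialPreprocess are stored in reducedDim(melanoma, "PCA"):

library(mclust)
# Continuing from the code above: extract the single PC as a plain vector and
# fit a univariate mixture; mclust selects its one-dimensional models ("E"/"V").
pc1 <- reducedDim(melanoma, "PCA")[, 1]
fit <- Mclust(pc1, G = 2)
table(fit$classification)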

Thank you very much.

edward130603 commented 9 months ago

Thanks for trying the method. It seems that in this case the mclust initialization fails. If we give it a new init, there's another implementation problem:

> melanoma <- spatialCluster(melanoma, q=2, platform="ST", d=1, init = sample(c(1,2), size = ncol(melanoma), replace = T))
Neighbors were identified for 293 out of 293 spots.
Error in colMeans(Y) : 'x' must be an array of at least two dimensions
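
For context, this second error is the usual base R behavior where subsetting a matrix down to a single column drops it to a plain vector, which colMeans() then rejects. A minimal illustration in plain base R (not BayesSpace code):

Y <- matrix(rnorm(10), ncol = 1)   # a d = 1 matrix of PCs
y <- Y[1:5, ]                      # default drop = TRUE returns a plain vector
is.matrix(y)                       # FALSE
try(colMeans(y))                   # Error: 'x' must be an array of at least two dimensions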

We can put a fix in for this in a future release, but it may take some time.
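
As a rough sketch of the kind of change such a fix might involve (hypothetical, not the package's actual code): any subsetting of the d-column PC matrix would need drop = FALSE so that colMeans() still receives a matrix when d = 1.

# Hypothetical helper, not BayesSpace source: keep matrix dimensions when
# averaging a subset of spots, so the same code works for d = 1 and d > 1.
subset_col_means <- function(Y, idx) {
  colMeans(Y[idx, , drop = FALSE])
}
subset_col_means(matrix(rnorm(10), ncol = 1), 1:5)   # d = 1
subset_col_means(matrix(rnorm(70), ncol = 7), 1:5)   # d = 7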

boyiguo1 commented 9 months ago

Thanks for your prompt response. That's a great insight to have.

I acknowledge this is a corner case from a development perspective, and I'm not sure how many users would actually run into it, if any at all.

If you/the team decides to fix it, I would personally very much appreciate that!

Again, thank you very much!