stephenslab / mashr

An R package for multivariate adaptive shrinkage.
https://stephenslab.github.io/mashr

Error in extreme_deconvolution_rcpp when using the strong tests to set up data-driven covariances. #125

Open Gongmian784 opened 3 months ago

Gongmian784 commented 3 months ago

Hi, I am trying to use the strong tests to set up data-driven covariances, but I am getting the error below:

> U.pca = cov_pca(data.strong,5)
> U.ed = cov_ed(data.strong, U.pca)

Error in extreme_deconvolution_rcpp(ydata, tycovar, projection, logweights,  : 
  long vectors not supported yet: ../../src/include/Rinlinedfuns.h:537
Calls: cov_ed ... extreme_deconvolution -> extreme_deconvolution_rcpp
Execution halted

Could you please explain what might be causing this and how to fix it?

Thanks, Mian

pcarbo commented 3 months ago

@Gongmian784 It sounds like the call to cov_pca worked, and the issue is with the call to cov_ed? How large is data.strong? That is, how many rows and columns are in your Bhat matrix?
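If you're not sure, something like this should tell you, since mash_set_data stores the effect estimates in the Bhat element of the data object:

dim(data.strong$Bhat)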

Gongmian784 commented 3 months ago

Thank you for your reply. I create data.strong using the following command:

data.strong = mash_set_data(Bhat = zval_data.strong, alpha = 1, V = Vhat, zero_Bhat_Shat_reset = 1e6)

The Bhat matrix contains 879,516 rows and 51 columns. The missing values were set to zero.

pcarbo commented 3 months ago

@Gongmian784 Unfortunately, cov_ed does not work with very large data sets due to some specifics of the implementation. This could probably be fixed, but the code is not originally ours; it is borrowed from the original "extreme deconvolution" R package. What I would suggest is to take a random subset of the rows of Bhat and see whether cov_ed works on that subset, as in the sketch below. I'm not exactly sure how much smaller the subset needs to be, so this may involve some trial and error.
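For example, something like this (an untested sketch, assuming zval_data.strong and Vhat are still in your workspace; the subset size of 20,000 is an arbitrary starting point to adjust up or down):

set.seed(1)                                       # for a reproducible subset
rows = sample(nrow(zval_data.strong), 20000)      # pick 20,000 rows at random
data.sub = mash_set_data(Bhat = zval_data.strong[rows, ], alpha = 1,
                         V = Vhat, zero_Bhat_Shat_reset = 1e6)
U.pca = cov_pca(data.sub, 5)
U.ed  = cov_ed(data.sub, U.pca)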