Open Gongmian784 opened 4 months ago
@Gongmian784 It sounds like the call to cov_pca succeeded? So the issue is with the call to cov_ed? How large is data.strong? That is, how many rows and columns are in your Bhat matrix?
Thank you for your reply. I created data.strong using the following command:
data.strong = mash_set_data(Bhat=zval_data.strong, alpha=1, V=Vhat, zero_Bhat_Shat_reset=1e6)
The Bhat matrix contains 879,516 rows and 51 columns. The missing values were set to zero.
@Gongmian784 I think unfortunately cov_ed does not work with very large data sets due to some specifics of the implementation. It could probably be fixed, but this is not originally our code; it is borrowed from the original "extreme deconvolution" R package. What I would suggest is to take a random subset of the rows of Bhat, and see if it works with this random subset. I'm not exactly sure how much smaller the subset needs to be, so this may involve some trial and error.
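The suggestion above can be sketched as follows. This is a minimal, untested sketch assuming the usual mashr workflow (cov_pca followed by cov_ed, as in the mashr vignettes); the names zval_data.strong and Vhat come from the command posted earlier in this thread, and the subset size of 20,000 rows is an arbitrary starting point, not a recommendation:

```r
library(mashr)

# Take a random subset of the rows of the strong-signal matrix.
# The subset size is arbitrary; adjust by trial and error.
set.seed(1)
n <- 20000
rows <- sample(nrow(zval_data.strong), n)

# Re-create the mash data object from the subset only,
# using the same settings as the original call.
data.sub <- mash_set_data(Bhat = zval_data.strong[rows, ],
                          alpha = 1, V = Vhat,
                          zero_Bhat_Shat_reset = 1e6)

# Estimate data-driven covariances on the subset.
U.pca <- cov_pca(data.sub, 5)
U.ed  <- cov_ed(data.sub, U.pca)
```

If this runs without error, the resulting U.ed list can then be used when fitting mash to the full data, as in the standard mashr workflow.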
Hi, I am trying to use the strong tests to set up data-driven covariances but am getting the error below:
Can you please explain what the issue could be here and how to fix it?
Thanks, Mian