jcarlen opened this issue 6 years ago (status: Open)
Attempting prcomp_irlba crashes R (3.3.2). I updated to the latest version of irlba (2.3.2) to make sure that wasn't the issue.

```r
prcomp_irlba(x.sparse, n = 50, retx = FALSE, center = TRUE, scale = FALSE)
```

where x.sparse is a 940498 x 8713 sparse matrix of dummy variables.

This works fine:

```r
irlba(x.sparse, nv = 50, nu = 0, center = colMeans(x.sparse), right_only = TRUE)
```

If this is a memory issue, is there a way to issue a warning and end the computation instead of crashing?

Thanks!
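For anyone hitting the same wall: a sketch of how prcomp-style output might be recovered from that working irlba call. This is not from the thread itself, and the names fit, rotation, and sdev are just illustrative:

```r
library(irlba)
library(Matrix)

# Sketch: recover prcomp-style pieces from the implicitly centered irlba call.
fit <- irlba(x.sparse, nv = 50, nu = 0,
             center = colMeans(x.sparse), right_only = TRUE)
rotation <- fit$v                          # principal axes (loadings)
sdev <- fit$d / sqrt(nrow(x.sparse) - 1)   # prcomp-style standard deviations
```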
thanks for this report, working on it...

This seems to be the same problem that was fixed by #47. Indeed, a dense double-precision matrix of this size would occupy 940498 * 8713 * 8 / 1e9 GB, i.e. about 65 GB of memory, so it's no wonder the computer gave up.
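That figure is just the element count times eight bytes per double:

```r
# Dense double-precision footprint of a 940498 x 8713 matrix
940498 * 8713 * 8 / 1e9   # ~65.6 GB
```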
In general, it's pretty hard to warn reliably about memory issues. A function that keeps quitting because it thinks it doesn't have enough memory would be annoying, when in reality the OS might have found the space, e.g. if other processes terminate or free large blocks of memory.
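A caller-side guard is still possible, though. The sketch below is hypothetical and not part of irlba's API; the name check_dense_footprint and the budget_gb default are made up for the example:

```r
# Hypothetical guard: estimate the dense footprint and stop() with a
# clean error instead of letting R run out of memory and crash.
check_dense_footprint <- function(x, budget_gb = 16) {
  need_gb <- prod(dim(x)) * 8 / 1e9
  if (need_gb > budget_gb) {
    stop(sprintf("densifying would need ~%.1f GB, over the %.1f GB budget",
                 need_gb, budget_gb))
  }
  invisible(need_gb)
}

# check_dense_footprint(x.sparse)  # errors cleanly in the 65 GB case
```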
Aaron's bug fixes may have fixed this for you... Sorry it took so long!