Closed florianhartig closed 3 years ago
The problem occurs in the importance function: only there is the association matrix actually calculated, and together with the rowSums call it can blow up memory.

Possible solutions:
a) move the importance function to torch -> use single precision (32-bit) or even half-precision (16-bit) + internal parallelization
b) use https://cran.r-project.org/web/packages/bigmemory/index.html + bigalgebra
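As a rough illustration of option (a): for a dense n x n association matrix, dropping from double to single precision halves the footprint, and half precision quarters it. A minimal NumPy sketch (hypothetical matrix size; the actual package code is in R/torch):

```python
import numpy as np

n = 2000  # hypothetical matrix dimension
# Dense association matrix in double precision (R's default numeric type)
assoc64 = np.random.rand(n, n).astype(np.float64)
# Single precision (32-bit) halves the memory; half precision (16-bit) quarters it
assoc32 = assoc64.astype(np.float32)
assoc16 = assoc64.astype(np.float16)

print(assoc64.nbytes // 2**20, "MiB")  # 64-bit footprint
print(assoc32.nbytes // 2**20, "MiB")  # half of the above
print(assoc16.nbytes // 2**20, "MiB")  # a quarter of the 64-bit footprint

# The rowSums-equivalent reduction works the same at lower precision
row_sums = assoc32.sum(axis=1)
```

Half precision trades accuracy for memory, so it is only safe if the downstream summaries (like the row sums) tolerate the reduced significand.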
Solved in #65.
Question from a user (redacted for conciseness and privacy):