Closed stnorton closed 3 years ago
Thanks for your interest in the 'amen' package. I suspect the issue is with performing an SVD on a matrix that large. You might first check whether the base R function 'svd' works on your sociomatrix on its own, without using the 'amen' package.
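A minimal standalone check along those lines might look like the sketch below. The matrix here is a small random stand-in for the real sociomatrix (the true test would use the full 44,007 x 44,007 matrix), and the truncated-rank arguments are just an illustration:

```r
# Sketch: test whether base R's svd() succeeds on a dense matrix,
# independently of 'amen'. n is deliberately small here; scale it up
# toward the real node count to find where svd() starts to fail.
n <- 500
Y <- matrix(rnorm(n * n), n, n)

s <- tryCatch(
  svd(Y, nu = 2, nv = 2),  # only a few singular vectors are needed for a low-rank check
  error = function(e) e
)

if (inherits(s, "error")) {
  message("svd failed: ", conditionMessage(s))
} else {
  message("svd succeeded; top singular value = ", round(s$d[1], 2))
}
```

If this already fails (or crashes) at the full matrix size, the problem is in the underlying linear algebra rather than in 'amen' itself.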
That being said, 'amen' uses an iterative MCMC algorithm, and some linear algebra must be done at every step. I suspect your sociomatrix is too large for 'amen' to handle. You might try a moment-based approach instead.
I am trying to run
ame()
on a large weighted network, but the model always segfaults before burn-in begins. This has happened across 5 different attempts, and the one informative error I've gotten suggests it has something to do with the singular value decomposition in the multiplicative effects portion of the model. The network has 44,007 nodes, giving an adjacency matrix that takes up ~16 GB in memory. Edge weights are positive real values. I believe this is a function of the size of the model, because the same model will begin sampling on a subgraph of 8,000 nodes. The process does not appear to be running out of memory: if I give it 1 TB of RAM, it never uses more than around 400 GB.
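For reference, the ~16 GB figure is consistent with a dense double-precision matrix of that order, which a quick back-of-envelope check confirms:

```r
# Back-of-envelope memory estimate for a dense adjacency matrix of
# 44,007 nodes stored as doubles (8 bytes per entry, as base R does).
n <- 44007
bytes <- n^2 * 8
gb <- round(bytes / 1e9, 1)  # approximately 15.5 GB
```

So the matrix itself fits easily in 400 GB; the crash is more plausibly in the decomposition step than in simple storage.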
The
ame()
call that segfaults:

The error:
And the output from
sessionInfo()
- the CPU architecture is not the same on every node, so libopenblas may be compiled against a different architecture on different runs: