Closed HQJo closed 3 years ago
Hi,
Thanks for your interest.
So there are two separate things: 1. coarsening and 2. sparsification. For coarsening, our paper shows how to preserve key graph spectral properties (i.e., the first few Laplacian eigenvectors) by smoothing random vectors. We leverage the LAMG solver for the actual implementation; for more LAMG details, see the paper we cited: https://arxiv.org/abs/1108.1310. Sparsification is only useful when the coarsest graph is very dense, which is more likely for large graphs with a large reduction ratio (e.g., the Friendster dataset). Thus, we later found that it is not necessary to use GRASS to sparsify the graph for small graphs (e.g., Cora, Citeseer, and Pubmed). If you are working with a very dense coarsest graph, please try GRASS, which is open sourced: https://sites.google.com/mtu.edu/zhuofeng-graphspar/home.
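To illustrate the smoothing idea mentioned above, here is a minimal sketch (not the authors' code) of how a few damped-Jacobi sweeps on L x = 0 damp the high-frequency components of random test vectors, so that what survives approximates the first few Laplacian eigenvectors. The graph, the number of sweeps, and the damping weight `omega` are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy example: a path graph on n nodes; build adjacency and Laplacian L = D - A.
n = 20
A = np.zeros((n, n))
for i in range(n - 1):
    A[i, i + 1] = A[i + 1, i] = 1.0
D = np.diag(A.sum(axis=1))
L = D - A

# Start from a few random test vectors and smooth them.
X = rng.standard_normal((n, 4))
omega = 0.5  # damped-Jacobi weight (an assumed, typical choice)
Dinv = np.diag(1.0 / np.diag(D))
for _ in range(10):
    # One damped-Jacobi sweep toward L x = 0: high-eigenvalue (high-frequency)
    # components shrink fast, low-eigenvalue ones survive almost unchanged.
    X = X - omega * Dinv @ L @ X

# Rayleigh quotients x'Lx / x'x fall toward the small end of L's spectrum,
# indicating the smoothed vectors are now dominated by the first eigenvectors.
rq = [float(x @ L @ x / (x @ x)) for x in X.T]
print(rq)
```

After smoothing, the Rayleigh quotients are far below the average eigenvalue of L, which is the property a spectral coarsening scheme relies on when it aggregates nodes based on these smoothed vectors.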
Thanks for your reply! I understand it now.
Regarding the graph coarsening step, the paper states that
which refers to this paper: [1] Zhuo Feng. Similarity-aware spectral sparsification by edge filtering. Design Automation Conference (DAC), pp. 1–6, 2018.
The README of this repo says that the LAMG algorithm is used, but I have not found much information about LAMG in the above paper [1].
Where can I find more information about the LAMG algorithm used in the program?