Agnes007 closed this issue 6 years ago
Your issue sounds very similar to issue https://github.com/stingergraph/stinger/issues/170, which was fixed with PR https://github.com/stingergraph/stinger/pull/208. If you're inserting a lot of edges at once, you can use stinger_batch_insert_edges instead of stinger_insert_edge to get a large performance improvement.
Thank you for your advice. I have another question: I found that 1 GB of memory can handle 94584 vertices (the upper bound), but my data contains 1 million vertices and the program keeps computing anyway. I wonder how STINGER handles vertices that exceed the upper bound. Does it drop them?
If you compile with assertions enabled (Debug mode), stinger_insert_edge will check for out-of-bounds vertices. stinger_batch_insert_edges does not check, so those edges are probably not being inserted. To be safe, you should check vertex IDs against stinger_max_nv before inserting.
Any other questions here?
No, thank you very much!
I made STINGER process a 10 GB dataset on a machine with 32 GB of memory. At the beginning it ran fast, about 3 MB/s, but as the amount of processed data grew, the speed decreased; after about 1.5 GB had been processed, the speed dropped to 0.5 MB/s and stayed there. No matter which algorithms I ran or how large the data eventually was, the speed did not change anymore. I am very interested in why the speed slows down and why it then stays stable. Are there any methods I can use to keep STINGER running at high speed?