ryansmcgee / seirsplus

Models of SEIRS epidemic dynamics with extensions, including network-structured populations, testing, contact tracing, and social distancing.
MIT License

Scalability and complexity? #26

Closed: XinyiYS closed this issue 4 years ago

XinyiYS commented 4 years ago

Very interesting and thorough implementation of SEIRS modeling; I find the stochastic approach particularly promising for complex real-life scenarios.

One question, though: since the simulation operates on the nodes of the graph, I suppose the complexity depends on the number of nodes as well. May I ask how the complexity (memory and runtime) scales with the number of nodes, or for some guidelines on how many nodes can be included before the experiment becomes computationally infeasible (on a decent PC)?

Edit: according to the Specifying Interaction Networks section of the wiki, ~10,000 nodes is suggested as a trade-off between low stochastic volatility and realism.
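
A rough way to gauge feasibility on a particular machine is to time short runs at increasing network sizes. The sketch below assumes the `SEIRSNetworkModel` constructor and `run()` call shown in the project README (the parameter names `beta`, `sigma`, `gamma`, `p`, `initI` are taken from there); the parameter values are placeholders and may need adjusting for your scenario:

```python
# Timing sketch: measure how runtime grows with network size.
# Assumes the SEIRSNetworkModel API shown in the seirsplus README.
import time
import networkx
from seirsplus.models import *

for numNodes in [1000, 5000, 10000, 20000]:
    G = networkx.barabasi_albert_graph(n=numNodes, m=9)
    model = SEIRSNetworkModel(G=G, beta=0.155, sigma=1/5.2, gamma=1/12.39,
                              p=0.5, initI=int(numNodes / 100))
    start = time.time()
    model.run(T=100)
    print("N=%d: %.1f s" % (numNodes, time.time() - start))
```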

ryansmcgee commented 4 years ago

Thank you for your question, and I apologize for the very delayed response. Runtime does scale superlinearly with network size, which can lead to infeasibly long run times for large networks (more than ~50k nodes); this is something I've hoped to improve for a while. The stochastic simulation uses the Gillespie method, and I'm aware that "adaptive tau" and other adaptive methods can often greatly speed up Gillespie simulations, but I haven't had a chance to explore the feasibility of these methods for these particular models. This is a ripe area for contribution from an adaptive Gillespie expert :)
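
For context on why this scaling arises, here is a minimal, illustrative direct-method Gillespie loop for a network SEIR process (a simplified sketch, not the seirsplus implementation): per-node propensities are recomputed after every event, and the number of events itself grows with the network size, so total work grows superlinearly in N.

```python
# Minimal direct-method Gillespie sketch for a network SEIR process.
# States: 0=S, 1=E, 2=I, 3=R. Not the seirsplus implementation.
import numpy as np
import networkx as nx

def gillespie_seir(G, beta=0.15, sigma=1/5.2, gamma=1/12.39, T=300, seed=0):
    rng = np.random.default_rng(seed)
    N = G.number_of_nodes()
    state = np.zeros(N, dtype=int)
    state[rng.choice(N, size=10, replace=False)] = 2   # seed 10 infectious nodes
    A = nx.adjacency_matrix(G)                         # sparse adjacency
    t = 0.0
    while t < T:
        infectious = (state == 2).astype(float)
        # Per-node event rates, recomputed every event -> O(N) work per step
        rate_SE = beta * (state == 0) * (A @ infectious)   # exposure of susceptibles
        rate_EI = sigma * (state == 1)                     # E -> I progression
        rate_IR = gamma * (state == 2)                     # I -> R recovery
        rates = np.concatenate([rate_SE, rate_EI, rate_IR])
        total = rates.sum()
        if total == 0:
            break
        t += rng.exponential(1.0 / total)                  # time to next event
        event = rng.choice(rates.size, p=rates / total)    # which event fires
        node, kind = event % N, event // N
        state[node] = kind + 1                             # S->E, E->I, or I->R
    return t, np.bincount(state, minlength=4)

if __name__ == "__main__":
    t_end, counts = gillespie_seir(nx.barabasi_albert_graph(2000, 5))
    print("ended at t=%.1f, S/E/I/R counts:" % t_end, counts)
```

Adaptive tau-leaping methods reduce the cost by firing many events per step when rates change slowly, which is the kind of speedup alluded to above.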

XinyiYS commented 4 years ago

Thanks for the response!