For very large models, memory requirements become demanding. Would it be possible to formally state the scaling laws in the documentation or even provide functions to estimate the absolute memory usage? Of particular interest is the impact of the total number of bands to be solved, as that came as a surprise to us. If this is not possible analytically, I would be happy to contribute some empirical results on a research cluster for a proposed test suite.
Scaling laws are discussed in Section 3.4-3.5 of the MPB Optics Express paper.
I also think it would be nice to have a simple variant of those scaling laws in the documentation, for quick reference.
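As a starting point for such a quick reference, here is a rough sketch of what an estimator could look like. It assumes storage is dominated by the block of eigenvectors, roughly O(p N) complex doubles for p bands on an N-point grid with two transverse field components per point, times a fudge factor for the eigensolver's auxiliary blocks. The constants (`bytes_per_complex`, `workspace_factor`) are assumptions for illustration, not MPB internals, and would need to be calibrated against measurements.

```python
def estimate_mpb_memory_gib(grid_points, num_bands,
                            bytes_per_complex=16, workspace_factor=4.0):
    """Back-of-envelope memory estimate for an MPB-style eigensolve.

    grid_points: iterable of grid dimensions, e.g. (256, 256, 256)
    num_bands:   number of bands p to solve for

    Assumes the dominant cost is the block of p eigenvectors, each
    with 2 transverse complex-double components per grid point, scaled
    by workspace_factor to cover the solver's temporary blocks.
    (Both constants are illustrative assumptions.)
    """
    n = 1
    for d in grid_points:
        n *= d  # total number of grid points N
    field_bytes = 2 * num_bands * n * bytes_per_complex
    return workspace_factor * field_bytes / 1024**3


# Example: the estimate scales linearly in both N and the band count,
# which matches the surprise noted above about the number of bands.
print(estimate_mpb_memory_gib((256, 256, 256), 40))
```

A table of such estimates for a few representative resolutions and band counts might be all the documentation needs, with the empirical cluster results used to pin down the constant factors.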