For instance, the EUR model has 302_990_625 realizations, and trying to build them causes cole to run out of memory (512 GB) after 2.5 hours. Now building the weights takes only 25 minutes and requires < 100 GB of RAM.
However, the performance is dominated by `full_lt.get_rlzs_by_gsim`, which is slow and memory-consuming.
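The memory saving comes from computing the weights without materializing every realization. As a minimal sketch (not the engine's actual code; the toy logic tree and function names here are hypothetical), one can iterate over the Cartesian product of the branch sets and yield one (path, weight) pair at a time, so peak memory stays O(1) per item instead of O(number of realizations):

```python
# Hypothetical sketch: lazily computing realization weights from a
# logic tree, instead of building all realizations up front.
import itertools

# Toy logic tree: each level is a list of (branch_id, weight) pairs.
# A realization is one branch per level; its weight is the product
# of the branch weights along the path.
levels = [
    [("b1", 0.6), ("b2", 0.4)],
    [("g1", 0.5), ("g2", 0.3), ("g3", 0.2)],
]

def iter_weights(levels):
    """Yield (path, weight) pairs one at a time, without storing
    all realizations in memory."""
    for combo in itertools.product(*levels):
        path = tuple(branch_id for branch_id, _ in combo)
        weight = 1.0
        for _, w in combo:
            weight *= w
        yield path, weight

# The weights of a well-formed logic tree sum to 1.
total = sum(w for _, w in iter_weights(levels))
```

With 302_990_625 realizations the enumeration is still long, but nothing proportional to that count is ever held in memory at once.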
Here is the time spent in `full_lt.init()` for New Zealand with ~1M realizations:
| operation | time_sec | memory_mb | counts |
|--------------------------|----------|-----------|--------|
| building full_lt (old) | 54.5 | 639.6 | 1 |
| building full_lt (new) | 52.9 | 365.8 | 1 |
The memory is nearly halved.