gem / oq-engine

OpenQuake Engine: software for Seismic Hazard and Risk Analysis
https://github.com/gem/oq-engine/#openquake-engine
GNU Affero General Public License v3.0

Augmenting rupture rates for classical PSHA with aftershocks #6692

Closed cossatot closed 1 year ago

cossatot commented 3 years ago

We are working on an implementation of classical PSHA with aftershocks.

The current approach we are taking is to parse the source model and place all of the aftershocks for each mainshock rupture on the other ruptures already in the SSM, yielding an additional rate increment for each rupture, conditional on the occurrence of the mainshock (see https://gitlab.openquake.org/hazard/projects/2020/metis/-/blob/master/t442/openquake/aft/aftershock_probabilities.py). These rate increments are then converted to unconditional rates and summed over all of the mainshock ruptures. The result is an extra rate increment describing the rate at which each rupture in the SSM will act as an aftershock. This value is to be added to the primary (mainshock) rupture rate before the PoEs are computed in a typical classical PSHA.

All of the steps before the addition of the aftershock and mainshock rupture rates are performed during a pre-processing stage, with code that for now won't be included in the Engine (among other reasons, it introduces Numba as a dependency; whether to include it in the future is a separate discussion).
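The conditional-to-unconditional conversion described above can be sketched as follows. This is only an illustration of the bookkeeping, not the Metis code; the array names and shapes are my own assumptions:

```python
import numpy as np

def unconditional_increments(mainshock_rates, cond_rates):
    """
    Convert aftershock rates conditional on each mainshock into
    unconditional rate increments for every rupture in the SSM.

    :param mainshock_rates: array of shape (M,) with the annual
        occurrence rates of the M mainshock ruptures
    :param cond_rates: array of shape (M, R); cond_rates[i, j] is the
        expected number of occurrences of rupture j as an aftershock,
        given that mainshock i occurs
    :returns: array of shape (R,) with the unconditional rate
        increment to add to each rupture's primary rate
    """
    # weight each conditional row by the rate of its mainshock,
    # then sum over all mainshocks
    return mainshock_rates @ cond_rates

# tiny example: 2 mainshocks, 2 ruptures
rates = np.array([0.01, 0.002])
cond = np.array([[0.5, 0.1],
                 [0.2, 0.4]])
delta = unconditional_increments(rates, cond)
# delta[0] = 0.01 * 0.5 + 0.002 * 0.2 = 0.0054
```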

The additional rupture rates are stored in a CSV file, with a format like:

rate,source,rup_num
0.00045937636367182145,i17,0
0.00046102771506415717,i17,1
0.0004443637497429468,i17,2
0.00046275940241261834,i17,3
0.0004931585637198938,i17,4

where source is the source_id of the source that the rupture belongs to, and rup_num is the number of the rupture when the source is enumerated with iter_ruptures().

No configuration parameters are needed in the job.ini except the file path to the CSV file; the necessary scientific parameters are used during the pre-processing step. So the job.ini just needs an entry like:

[aftershocks]
rupture_occurrence_rate_adjustment_file = ./rup_adjustments.csv 
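For illustration, the fragment above can be read with the standard library's configparser (the Engine has its own job-file machinery; this just shows the shape of the entry):

```python
import configparser

INI = """\
[aftershocks]
rupture_occurrence_rate_adjustment_file = ./rup_adjustments.csv
"""

cfg = configparser.ConfigParser()
cfg.read_string(INI)
path = cfg['aftershocks']['rupture_occurrence_rate_adjustment_file']
```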

I have a very crude reference implementation in the Engine here: https://github.com/gem/oq-engine/compare/rup_prob_adjustment. The code reads the rupture rate adjustments file, and when any source is processed in openquake.hazardlib.contexts.ContextMaker.gen_ctxs(), the occurrence rate for each rupture is adjusted just before the contexts are created (https://github.com/gem/oq-engine/blob/99ae182965ddeb7d22e978eb365c9507650a0d3a/openquake/hazardlib/contexts.py#L403). This was the way that made the most sense to me, and it works fine, except that because I don't understand how the sources are partitioned when they are split for parallel processing with Starmap, I had to turn off source splitting (so that the ruptures can be correctly indexed by rup_num); as a result it is quite inefficient. I don't expect this code to be used in production; it is just a crude reference implementation.
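In outline, the adjustment step amounts to something like the following (all names here are hypothetical stand-ins; the real hook sits in `ContextMaker.gen_ctxs`):

```python
from types import SimpleNamespace

def adjust_rates(src_id, ruptures, adjustments):
    """
    Add the aftershock increment to each rupture's occurrence rate,
    relying on iter_ruptures() order so that the enumeration index
    matches rup_num in the adjustments CSV.

    :param src_id: source_id of the source being processed
    :param ruptures: an iterable of ruptures in iter_ruptures() order
    :param adjustments: dict (source_id, rup_num) -> extra rate
    """
    for rup_num, rup in enumerate(ruptures):
        rup.occurrence_rate += adjustments.get((src_id, rup_num), 0.0)
        yield rup

# toy stand-ins for real rupture objects
rups = [SimpleNamespace(occurrence_rate=0.010),
        SimpleNamespace(occurrence_rate=0.020)]
adjusted = list(adjust_rates('i17', rups, {('i17', 1): 0.001}))
```

This also makes the splitting problem concrete: if a source is split, the enumeration index inside each chunk no longer matches the global rup_num, which is why source splitting had to be disabled.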

I am happy to answer any questions and provide source models and rupture adjustment files for testing.

Thanks!

micheles commented 1 year ago

After 18 months I am back on this project.

The engine has changed a lot in this time and now there is a viable mechanism to implement the logic that you want.

Here is my plan:

  1. We need to implement an AftershockCalculator able to compute the rate adjustments. This requires moving your code inside hazardlib, @cossatot, and then calling something like this in the calculator:
```python
import numpy
import pandas

def build_rates(srcs):
    """
    :param srcs: a list of sources with the same TRT
    :returns: a DataFrame with fields (src_id, rup_id, delta)
    """
    out = {'src_id': [], 'rup_id': [], 'delta': []}
    for src in srcs:
        for i, rup in enumerate(src.iter_ruptures()):
            out['src_id'].append(src.id)
            out['rup_id'].append(src.offset + i)
            delta = calc_aftershock_delta_rate(src, rup)
            out['delta'].append(delta)
    out['src_id'] = numpy.uint32(out['src_id'])
    out['rup_id'] = numpy.uint32(out['rup_id'])
    out['delta'] = numpy.float32(out['delta'])
    return pandas.DataFrame(out)
```

calc_aftershock_delta_rate will go somewhere in hazardlib. Alternatively, a single function returning the whole DataFrame with fields (src_id, rup_id, delta) would work.
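Downstream, a DataFrame in that form can be joined back onto the per-rupture base rates, e.g. with a pandas left merge (a sketch under my own column naming; ruptures without an increment keep delta = 0):

```python
import pandas as pd

# hypothetical base rates and computed deltas
base = pd.DataFrame({'src_id': [0, 0, 1],
                     'rup_id': [0, 1, 0],
                     'rate': [0.010, 0.020, 0.005]})
deltas = pd.DataFrame({'src_id': [0, 1],
                       'rup_id': [1, 0],
                       'delta': [0.001, 0.0005]})

# left join keeps every rupture; missing deltas become NaN -> 0
df = base.merge(deltas, on=['src_id', 'rup_id'], how='left')
df['delta'] = df['delta'].fillna(0.0)
df['total_rate'] = df['rate'] + df['delta']
```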

  2. To use the AftershockCalculator we will need a pre_job.ini file like the following:

```ini
[general]
description = Aftershock test
calculation_mode = aftershock

[logic_tree]
source_model_logic_tree_file = source_model_logic_tree.xml
gsim_logic_tree_file = gmpe_logic_tree.xml
number_of_logic_tree_samples = 0

[source_params]
width_of_mfd_bin = 0.5
rupture_mesh_spacing = 20
area_source_discretization = 25
investigation_time = 50
maximum_distance = 300
```
  3. After running the aftershock calculation, we will run a classical calculation starting from it by using the `--hc` syntax; an .ini file like the following will suffice:

```ini
[general]
description = Classical with aftershocks
calculation_mode = classical

[calculation]
truncation_level = 3
investigation_time = 50
maximum_distance = 200.0
intensity_measure_types_and_levels = {"PGA": logscale(1E-2, 1, 25)}
site_model_file = site_model.csv
```


I will change the classical calculator so that it uses the `delta_rates` if they are present in the parent calculation (i.e. the one referred to by `--hc`).
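Once the deltas are applied, the classical PoEs follow from the usual Poissonian formula with the augmented rate (my sketch, assuming Poissonian temporal occurrence):

```python
import numpy as np

def poe(rate, delta, investigation_time):
    """
    Probability of at least one occurrence within the investigation
    time, using the aftershock-augmented rate (rate + delta).
    """
    return 1.0 - np.exp(-(rate + delta) * investigation_time)

# e.g. base rate 0.01/yr, aftershock increment 0.001/yr, 50 yr
p = poe(0.01, 0.001, 50)
```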