adamkucharski opened this issue 1 year ago
To give a tangible case study of where this kind of requirement has been needed in reality, early COVID scenario modelling collated a distribution of R0 from published estimates (see below), then simulated trajectories over a distribution of values for R0 (based on a summary parametric distribution) to provide uncertainty bounds for projections.
As well as calculating a profile likelihood, it would also be possible to numerically calculate the posterior probability for the 1- or 2-parameter likelihoods considered in `quickfit`.

A couple of potential use cases:

- A `datadelay` underascertainment analysis could combine uncertainty in the existing 'true' CFR estimate (perhaps extracted from a mean and CI using `epiparameter` functionality) with uncertainty in the estimated real-time CFR, adjusting for delays, to get a more robust estimate of underascertainment (see the sketch just after this list).
- A posterior for R from `bpmodels`, or a posterior for R from `episoap`, which could then be passed to `epidemics` and/or `scenarios` to simulate different outbreak dynamics.
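As a rough base-R sketch of how the first use case might combine the two sources of uncertainty numerically: treat the existing 'true' CFR estimate as a prior, the delay-adjusted real-time estimate as a likelihood, and multiply them on a grid. The Beta summaries below are purely illustrative placeholders, not actual output from `epiparameter` or `datadelay`.

```r
# Grid of possible CFR values
cfr_grid <- seq(0.001, 0.2, by = 0.0005)
dx <- 0.0005

# Existing 'true' CFR estimate, treated as a prior
# (illustrative Beta summary, e.g. as might be extracted via epiparameter)
prior_true <- dbeta(cfr_grid, shape1 = 30, shape2 = 1970)

# Delay-adjusted real-time CFR estimate, treated as a likelihood
# (again an illustrative Beta summary, standing in for datadelay output)
lik_realtime <- dbeta(cfr_grid, shape1 = 12, shape2 = 588)

# Combine and normalise numerically to get a posterior for the CFR
post_cfr <- prior_true * lik_realtime
post_cfr <- post_cfr / sum(post_cfr * dx)

# Posterior mean and 95% credible interval from the grid
post_mean <- sum(cfr_grid * post_cfr) * dx
post_cdf  <- cumsum(post_cfr) * dx
ci_95 <- cfr_grid[c(which.min(abs(post_cdf - 0.025)),
                    which.min(abs(post_cdf - 0.975)))]
```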
Then there's the question of how best to store such a posterior, e.g. whether as a numerical vector, or perhaps as a fitted kernel `density()` object?
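On the storage question, a minimal sketch of the two options, assuming posterior draws for R are already in hand (the draws below are simulated placeholders):

```r
# Placeholder posterior draws for R (e.g. from an MCMC or grid sample)
r_draws <- rgamma(5000, shape = 40, rate = 16)

# Option 1: store the draws themselves as a numeric vector
posterior_vec <- r_draws
r_resampled   <- sample(posterior_vec, 1000, replace = TRUE)

# Option 2: store a fitted kernel density() object
posterior_kde <- density(r_draws)
# Evaluate the approximate posterior density at arbitrary R values
dens_at <- approx(posterior_kde$x, posterior_kde$y, xout = c(2, 2.5, 3))$y
# Sample from the KDE: resample the draws, then jitter by the KDE bandwidth
r_from_kde <- sample(r_draws, 1000, replace = TRUE) +
  rnorm(1000, sd = posterior_kde$bw)
```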
Some example code of an implementation that could align with existing functions in `quickfit` (Poisson has a conjugate prior, of course, so that could make the calculation simpler for this specific example, but the idea is to show how this might work for a general likelihood function and prior):
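A sketch along those lines, using only base R: the data and lognormal prior are illustrative, and the standalone `loglik()` function is just a stand-in for however `quickfit` exposes its likelihoods (an assumption about that interface, not its actual API).

```r
# Illustrative daily case counts (Poisson model with rate lambda)
case_counts <- c(4, 7, 5, 6, 8, 3, 5)

# Generic log-likelihood: any single-parameter model could be slotted in here
loglik <- function(lambda) {
  sum(dpois(case_counts, lambda = lambda, log = TRUE))
}

# General (non-conjugate) prior on lambda, here an illustrative lognormal
logprior <- function(lambda) {
  dlnorm(lambda, meanlog = log(5), sdlog = 0.5, log = TRUE)
}

# Evaluate the log-posterior over a grid and normalise numerically
lambda_grid <- seq(0.5, 15, by = 0.01)
logpost <- vapply(lambda_grid, function(l) loglik(l) + logprior(l), numeric(1))
post <- exp(logpost - max(logpost))   # subtract max to avoid underflow
post <- post / sum(post * 0.01)       # normalise so it integrates to ~1

# Posterior summaries, plus draws that could be stored or passed downstream
post_mean  <- sum(lambda_grid * post) * 0.01
post_draws <- sample(lambda_grid, 5000, replace = TRUE, prob = post)
post_kde   <- density(post_draws)     # or store as a fitted density() object
```

With a conjugate Gamma(a, b) prior the posterior is available in closed form as Gamma(a + sum(case_counts), b + length(case_counts)), which gives a convenient cross-check on the grid approximation.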