Data4DM / BayesSD

Data for Decision, Affordable Analytics for All

Experiment on prior tail #22

Closed hyunjimoon closed 5 months ago

hyunjimoon commented 1 year ago

Don't use lognormal for the deviation parameter.

neg_binomial seems to be less scale-dependent.

With lognormal and different driving-data scales (initial order rate):
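One way to see why neg_binomial_2 can be less scale-dependent (a sketch, not from the issue): with a fixed dispersion `phi`, its coefficient of variation is sqrt(1/mu + 1/phi), which flattens toward 1/sqrt(phi) as the driving-data scale `mu` grows, so the relative noise stays roughly stable across scales. The `phi` value below is illustrative, not a fitted one.

```python
import numpy as np

def neg_binomial_2_cv(mu, phi):
    # neg_binomial_2 parameterization: mean = mu, variance = mu + mu**2 / phi,
    # so CV = sqrt(variance) / mu = sqrt(1/mu + 1/phi)
    return np.sqrt(1.0 / mu + 1.0 / phi)

phi = 10.0  # illustrative dispersion
for mu in (1.0, 10.0, 100.0, 1000.0):
    print(mu, neg_binomial_2_cv(mu, phi))  # CV shrinks toward 1/sqrt(phi)
```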

[figure: distribution]

    ### 1) ODE parameter priors
    model.set_prior("inventory_adjustment_time", "normal", 2, 0.4)  # sd = mu/5 heuristic
    model.set_prior("minimum_order_processing_time", "normal", 0.05, 0.01)

    ### 2) sampling distribution parameter (measurement error) prior
    model.set_prior("phi", "inv_gamma", 2, 0.1)  # mean = beta / (alpha - 1) = 0.1

    ### 3) measurement \tilde{y}_{1..t} ~ f(\theta, t)_{1..t}
    model.set_prior("work_in_process_inventory_obs", "neg_binomial_2", "work_in_process_inventory", "phi")
    model.set_prior("inventory_obs", "neg_binomial_2", "inventory", "phi")
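
For reference (a sketch, not part of the issue): Stan's `neg_binomial_2(mu, phi)` has mean `mu` and variance `mu + mu**2 / phi`, and maps onto scipy's `nbinom` via `n = phi`, `p = phi / (phi + mu)`. The values below are illustrative:

```python
from scipy import stats

mu, phi = 46.0, 0.1  # illustrative values echoing the scales discussed in this issue
dist = stats.nbinom(n=phi, p=phi / (phi + mu))

print(dist.mean())  # equals mu
print(dist.var())   # equals mu + mu**2 / phi
```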

This returns draws on the scale of mean 46, standard error 5.

normal(mu, (mu/5)^2) for parameter priors (heuristic)

lognormal for the measurement distribution (likelihood; similar to neg_binom, except it handles continuous data)

inverse-gamma(2, 0.1) for sigma's prior (anything but lognormal; zero- and extremity-avoiding)
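The mu/5 heuristic above can be sketched as a small helper (hypothetical, not the package's API):

```python
from scipy import stats

def mu_over_5_normal(mu):
    # hypothetical helper: normal prior centered at mu with sd = mu / 5,
    # i.e. normal(mu, (mu/5)^2) in variance notation
    return stats.norm(loc=mu, scale=mu / 5.0)

prior = mu_over_5_normal(2.0)  # matches normal(2, 0.4) used for inventory_adjustment_time
print(prior.mean(), prior.std())  # 2.0 0.4
```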


The mean becomes 3.8e+27 with inverse-gamma(2, 5), whose mean is 5 (= beta / (alpha - 1)).
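An inverse-gamma(alpha, beta) has mean beta / (alpha - 1) for alpha > 1; a quick scipy check (a sketch; scipy's `scale` plays the role of beta) confirms the prior means for the two settings discussed here:

```python
from scipy import stats

# scipy parameterizes inverse-gamma as invgamma(a, scale=beta); mean = beta / (a - 1)
print(stats.invgamma(a=2, scale=5).mean())    # 5.0
print(stats.invgamma(a=2, scale=0.1).mean())  # 0.1
```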

hyunjimoon commented 1 year ago

#67 is relevant, as the convolution of prior and likelihood can be designed to be a single likelihood.

hyunjimoon commented 1 year ago

Tom's #91 might be relevant.