Closed: hyunjimoon closed this issue 5 months ago
With a lognormal measurement distribution and a different driving-data scale (initial order rate):
### 1) ODE parameter prior

```python
model.set_prior("inventory_adjustment_time", "normal", 2, 0.4)  # sigma = mu/5 heuristic
model.set_prior("minimum_order_processing_time", "normal", 0.05, 0.01)
```

### 2) Sampling distribution parameter (measurement error) prior

```python
model.set_prior("phi", "inv_gamma", 2, 0.1)  # mean = beta / (alpha - 1)
```

### 3) Measurement \tilde{y}_{1..t} ~ f(\theta, t)_{1..t}

```python
model.set_prior("work_in_process_inventory_obs", "neg_binomial_2", "work_in_process_inventory", "phi")
model.set_prior("inventory_obs", "neg_binomial_2", "inventory", "phi")
```
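The neg_binomial_2 choice above can be sanity-checked with a quick sketch. Assuming Stan's parameterization, neg_binomial_2(mu, phi) has variance mu + mu^2/phi, so the observation noise implied at the prior mean phi = 0.1 is easy to compute (plain Python, independent of the model API; the mu values are just illustrative state magnitudes):

```python
import math

# Stan's neg_binomial_2(mu, phi) parameterization: variance = mu + mu^2 / phi,
# so a small phi means heavy overdispersion on top of Poisson-like noise.
def neg_binomial_2_sd(mu, phi):
    return math.sqrt(mu + mu ** 2 / phi)

# phi centered near its prior mean: inv_gamma(2, 0.1) -> 0.1 / (2 - 1) = 0.1
phi = 0.1
for mu in (2.0, 46.0):
    print(f"mu = {mu}: implied sd = {neg_binomial_2_sd(mu, phi):.1f}")
```

With phi this small, the implied sd is several times the mean, which is worth keeping in mind when interpreting the fitted scale below.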
returns a scale parameter with mean 46, standard error 5
the mean becomes 3.8e+27 with inverse-gamma(2, 5), whose prior mean is 5
Tom's #91 might be relevant.
Don't use lognormal for the deviation parameter (sigma).
neg_binomial_2 seems to be less scale-dependent.
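One way to see why neg_binomial_2 is less scale-dependent: its relative standard deviation, sqrt(1/mu + 1/phi), flattens out to 1/sqrt(phi) as the driving-data scale mu grows, so the relative noise level stays roughly constant across scales. A small sketch (phi = 10 is a hypothetical value here, not taken from the model above):

```python
import math

def rel_sd_neg_binomial_2(mu, phi):
    # sd / mean = sqrt(mu + mu^2 / phi) / mu = sqrt(1 / mu + 1 / phi)
    return math.sqrt(1 / mu + 1 / phi)

phi = 10.0  # hypothetical concentration; larger phi = closer to Poisson
for mu in (5.0, 50.0, 5000.0):
    print(f"mu = {mu}: relative sd = {rel_sd_neg_binomial_2(mu, phi):.3f}")
```

The relative sd decreases toward 1/sqrt(phi) as mu grows, rather than exploding when the driving data (e.g. initial order rate) is rescaled.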
Distribution recommendations:
- normal(mu, (mu/5)^2) for parameters' priors (heuristic)
- lognormal for the measurement distribution (likelihood; similar to neg_binomial_2, except it handles continuous data)
- inverse-gamma(2, 0.1) for sigma's prior (anything but lognormal; zero- and extremity-avoiding)
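The normal(mu, (mu/5)^2) heuristic can be checked quickly: with sigma = mu/5 the prior puts only about Phi(-5) of its mass below zero, so positivity is effectively preserved without truncation. A sketch using the inventory_adjustment_time prior (mu = 2, sigma = 0.4) from the code above:

```python
import math

def normal_cdf(x, mu, sigma):
    # standard normal CDF via the error function
    return 0.5 * (1 + math.erf((x - mu) / (sigma * math.sqrt(2))))

# sigma = mu / 5, as in set_prior("inventory_adjustment_time", "normal", 2, 0.4)
mu = 2.0
sigma = mu / 5  # 0.4
print(normal_cdf(0.0, mu, sigma))  # prior mass below zero, ~2.9e-7
```

This holds for any mu > 0, since P(X < 0) depends only on the ratio mu/sigma = 5.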