Harry-Westwood / Y4-Project-InterNeuralStellar

Bayesian Hierarchical Modelling and Machine Learning of Stellar Populations

HBM prob 3: getting Mm and Ms in the model #7

Closed HinLeung622 closed 4 years ago

HinLeung622 commented 4 years ago

@grd349 In problem 3 of the HBM problems, K varies with some function of M_true, and the premise of the problem is that we already know M well enough that we do not care to constrain it with proper hyperpriors and such, the way we are doing with the K values, correct?

If that is the case, then when setting up the model,

import numpy as np
import pymc3 as pm

with pm.Model() as model:
    # hyperpriors on the K-M relation and its intrinsic scatter
    alpha = pm.Normal('alpha', -1.61, 0.5)
    beta = pm.Normal('beta', 0, 10)
    sigma = pm.Lognormal('sigma', np.log(0.03), 0.5)

    # latent true masses drawn from the population, plus the mass observation model
    Mtrue = pm.Normal('Mtrue', Mm, Ms, shape=N)
    Mobs = pm.Normal('Mobs', Mtrue, M_obs_sigma, observed=M_obs)

    # latent true K values centred on the mass-dependent relation
    Ktrue = pm.Normal('Ktrue', trueK(alpha, beta, Mtrue), sigma, shape=N)
    Kobs = pm.Normal('Kobs', Ktrue, K_obs_sigma, observed=K_obs)

we will need to acquire the population mean Mm and spread Ms from somewhere other than the "true values only known to us because this is a synthesized population". My guess is to get Mm by taking the mean of M_obs, but how could we get the spread Ms? (Taking the std of M_obs would also include the observational uncertainty.) One guess of mine is that we treat Mm and Ms as values that would already be known to us, if this were a real batch of data, from previous studies of the dataset, and that we are confident enough in them not to apply the whole HBM constraint to them. Perhaps this is the point?
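As a toy check of the deconvolution idea above (a sketch only; the variable names mirror the issue, but the numbers and the moment-matching estimator are my own assumptions, valid for Gaussian errors):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical synthetic population: true Mm = 1.0, true Ms = 0.15.
N = 500
M_true = rng.normal(1.0, 0.15, N)
M_obs_sigma = np.full(N, 0.05)          # per-star observational uncertainty
M_obs = rng.normal(M_true, M_obs_sigma)  # noisy observed masses

# The sample mean of M_obs estimates Mm directly.
Mm_hat = M_obs.mean()

# The observed variance is population variance + mean measurement variance,
# so subtract the latter before taking the square root to recover Ms.
Ms_hat = np.sqrt(max(M_obs.var(ddof=1) - np.mean(M_obs_sigma**2), 0.0))
```

Here `Ms_hat` comes out smaller than the raw std of `M_obs`, since the measurement variance has been removed, which is exactly the correction the naive std would miss.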

For the full notebook, see https://github.com/Harry-Westwood/Fourth-Year-Project/blob/master/Hin's_files/pymc3_test/testpymc3_3.ipynb

grd349 commented 4 years ago

Hi @HinLeung622

I'm not sure I understand - let's chat at our meeting today.

HinLeung622 commented 4 years ago

Solution discussed in the meeting today (week 6, autumn).