julywater / WeakLensing


mh_step() should take and return parameters as huge vector #5

Open davidwhogg opened 11 years ago

davidwhogg commented 11 years ago

Basically the input and output of the mh code should be identical to the emcee code.

For example, the returned chain should be nsamples x nparameters in size / shape.

davidwhogg commented 11 years ago

I want the call

X_new = mh_step(X, inds, sigmas)

- X is the current parameter vector
- inds is a list of integers saying which of the parameters should be sampled in this step
- sigmas is a list of root-variances of the Gaussians for the proposal distribution for the parameters listed in inds

This function will try to take a step just in the parameters listed in inds.

If you iterate over choices for inds, you will be Gibbs sampling. Does that make sense?
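A minimal sketch of this interface could look like the following. This is not the repo's actual code: the `ln_post` callable (the log-posterior of the full parameter vector) and the NumPy parameterization are assumptions added for illustration.

```python
import numpy as np

def mh_step(X, inds, sigmas, ln_post, rng=None):
    """One Metropolis-Hastings step that updates only the parameters in inds.

    X       : current parameter vector (1-D ndarray)
    inds    : indices of the parameters to update in this step
    sigmas  : proposal standard deviations, one per entry of inds
    ln_post : callable returning the log-posterior of a full parameter vector
              (hypothetical; stands in for whatever the project uses)
    """
    if rng is None:
        rng = np.random.default_rng()
    X_new = X.copy()
    # Gaussian proposal, touching only the requested parameters
    X_new[inds] = X[inds] + rng.normal(0.0, sigmas)
    # Metropolis rule: accept with probability min(1, p_new / p_old)
    if np.log(rng.uniform()) < ln_post(X_new) - ln_post(X):
        return X_new
    return X  # rejected: keep the current vector
```

Iterating over different choices of `inds` (e.g. each patch's gamma indices in turn, then the shape-parameter indices) then gives the Gibbs-style scheme described above.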

julywater commented 11 years ago

I thought about doing it this way. But the inconvenient part is that while sampling Gamma in patch i, only the data in patch i are used, whereas while sampling P (the shape parameter) all the data are used. I don't know yet how to handle both cases with one general function.

julywater commented 11 years ago

Oh, maybe I can do it by making some changes in postfunction(); I'll think about it tomorrow.

davidwhogg commented 11 years ago

I don't think it matters that you are using only part of the data. The passing of data is not what is making the code slow.

julywater commented 11 years ago

I wrote a Gibbs sampler based on a general mh_step() function as you described. Then I found a big problem: I can't get rid of the loop `for i in xrange(NP)` (one MH_step() per patch). Every approach I came up with has conflicts between sampling the gammas and sampling alpha...

So I thought it may be better to write separate MH samplers for the gamma and alpha sampling, as I did last time. That is more direct and won't be slower.

I have both versions in https://github.com/julywater/WeakLensing/blob/master/code/MH.py

davidwhogg commented 11 years ago

A for loop that goes over a big piece of code is not as bad as an "inner loop". But if you can do the MH loop with a map() operation, that's good, because then we can use multiprocessing to parallelize it.
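A toy sketch of that map() shape, not the repo's code: `step_patch`, its argument tuple, and the per-patch posterior are all made up for illustration. The point is that the per-patch update is a pure function of one patch's state and data, so the serial `map()` can be swapped for `multiprocessing.Pool.map` unchanged.

```python
import numpy as np
from multiprocessing import Pool  # drop-in replacement for the serial map()

def step_patch(args):
    """Hypothetical per-patch MH update: propose a new gamma for one patch
    and accept/reject it using only that patch's data."""
    gamma, data, sigma, seed = args
    rng = np.random.default_rng(seed)
    ln_post = lambda g: -0.5 * np.sum((data - g) ** 2)  # toy per-patch posterior
    g_new = gamma + rng.normal(0.0, sigma)
    if np.log(rng.uniform()) < ln_post(g_new) - ln_post(gamma):
        return g_new
    return gamma

rng = np.random.default_rng(0)
patches = [rng.normal(size=10) for _ in range(4)]  # fake data, one array per patch
gammas = [0.0] * 4
args = [(g, d, 0.3, i) for i, (g, d) in enumerate(zip(gammas, patches))]

# Serial version: one MH step per patch via map()
gammas = list(map(step_patch, args))

# Parallel version, same call shape:
# with Pool() as pool:
#     gammas = pool.map(step_patch, args)
```

Each patch gets its own seed so the parallel version stays reproducible; any shared parameters (like alpha) would still have to be updated in a separate, serial step between these per-patch sweeps.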