JakobRobnik / MicroCanonicalHMC

MCHMC: sampler from an arbitrary differentiable distribution
GNU General Public License v3.0

Sampling with constrained parameter #43

Open HanWang2021 opened 9 months ago

HanWang2021 commented 9 months ago

What should I do when sampling models whose parameters have flat priors? MCHMC needs to know the gradient, and in this case the gradient will go to infinity...

JakobRobnik commented 9 months ago

Do you mean flat priors or flat posteriors?

The prior is really not relevant, and there is no problem with the prior being flat: we initialize the particle from the prior. If you don't like this for some reason, you can easily use a different initialization via the x_initialize parameter of the sampling function.
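As a concrete sketch, drawing a starting point from a box-shaped flat prior is a one-liner; the resulting vector could then be handed to the sampler via the x_initialize parameter mentioned above (the helper name here is hypothetical, not part of the repository):

```python
import numpy as np

def initialize_from_prior(rng, lo, hi, d):
    """Draw a starting position uniformly from the box-shaped flat prior
    with per-dimension bounds lo and hi, in d dimensions."""
    return lo + (hi - lo) * rng.random(d)

rng = np.random.default_rng(0)
lo = np.zeros(3)
hi = np.array([1.0, 2.0, 5.0])
x0 = initialize_from_prior(rng, lo, hi, 3)
# x0 lies inside the prior box and can serve as the custom initialization.
```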

If the posterior is flat, its gradient will be zero, and MCHMC and HMC particles will just fly in straight lines, bouncing off the domain walls. In this case it doesn't make much sense to use such samplers, as there are much easier ways to sample. If your prior domain is a cube, just use a uniform random number generator. If it is more complicated, other methods are available.
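For the cube case, independent uniform draws are already exact samples from the flat posterior, so no gradient-based sampler is needed at all:

```python
import numpy as np

# Exactly flat posterior on the cube [lo, hi]^d: independent uniform
# draws are exact i.i.d. samples, with no burn-in or autocorrelation.
rng = np.random.default_rng(1)
lo, hi, d, n = -1.0, 1.0, 4, 1000
samples = rng.uniform(lo, hi, size=(n, d))
```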

Jakob


HanWang2021 commented 9 months ago

Hi,

thanks! I mean flat priors: just a few parameters in the model have them. The posterior is high-dimensional and its shape should be quite complicated. My question is what happens if the sampler reaches the edge of a flat prior while the likelihood there is not zero. Outside the prior support, the posterior suddenly drops to zero. Doesn't that cause an issue for the MCHMC sampling?

I am trying to use a bijector from tensorflow_probability.substrates.jax to transform the unconstrained parameters into the constrained space. But after adding this, I got

TypeError: If shallow structure is a sequence, input must also be a sequence. Input has type: <class 'jax._src.interpreters.batching.BatchTracer'>

So the bijectors in tensorflow_probability.substrates.jax seem to be incompatible with MCHMC?
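For reference, the transformation a bijector performs can be sketched in plain numpy: map an unconstrained u in R to x in (lo, hi) with a sigmoid, and add the log-Jacobian of the map to the log-density so that sampling in u targets the intended distribution in x. This is a generic illustration of the technique, not tfp's or MCHMC's API:

```python
import numpy as np

def sigmoid(u):
    return 1.0 / (1.0 + np.exp(-u))

def to_constrained(u, lo, hi):
    """Map an unconstrained value u in R to x in the interval (lo, hi)."""
    return lo + (hi - lo) * sigmoid(u)

def log_det_jacobian(u, lo, hi):
    """log |dx/du| for the map above; add this term to the log-density
    evaluated at x so the sampler in u-space targets the right posterior."""
    s = sigmoid(u)
    return np.log(hi - lo) + np.log(s) + np.log(1.0 - s)
```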


JakobRobnik commented 9 months ago

No, our code is not compatible with tfp. We have just implemented support for adding constraints by imposing periodic or reflective boundary conditions at the prior borders (the code is not on pip yet, but you can clone it directly from git; see the tutorial). You can check out the tutorial here: https://github.com/JakobRobnik/MicroCanonicalHMC/blob/master/notebooks/tutorials/Constraints.ipynb . Please let us know how this solution works for you.

Best, Jakob
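The reflective boundary condition mentioned above can be sketched generically: positions that leave the box are folded back inside and the corresponding velocity components are flipped. This is an illustrative sketch of the idea, not the repository's actual implementation:

```python
import numpy as np

def reflect(x, v, lo, hi):
    """One reflective-boundary step for a particle in the box [lo, hi]:
    fold any coordinate that crossed a wall back inside and flip the
    sign of the matching velocity component."""
    x = np.asarray(x, dtype=float).copy()
    v = np.asarray(v, dtype=float).copy()
    over = x > hi
    under = x < lo
    x[over] = 2.0 * hi - x[over]    # mirror across the upper wall
    x[under] = 2.0 * lo - x[under]  # mirror across the lower wall
    v[over | under] *= -1.0         # bounce: reverse velocity at walls
    return x, v
```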
