Half-Cauchy is a common prior, so it'd be great to have this explicitly. The implementation should be fairly straightforward, following the half-normal distribution.
As a workaround, when defining log-densities for inference you can generally just use the Cauchy density, assuming you're already using variable transformations to constrain your inferences to be positive (which typically has nicer properties than relying on -inf's in the log density).
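As a rough sketch of that workaround (assuming a Cauchy(0, 1) prior on a positive scale parameter, an Exp bijector as the constraining transformation, and a hypothetical target_log_prob_fn; the factor of 2 separating the half-Cauchy from the Cauchy on the positive half-line is a constant that unnormalized inference ignores):

import tensorflow_probability as tfp

tfd = tfp.distributions
tfb = tfp.bijectors

def target_log_prob_fn(unconstrained_sigma):
  # Work in unconstrained space and map to (0, inf) with Exp.
  bijector = tfb.Exp()
  sigma = bijector.forward(unconstrained_sigma)
  # Cauchy(0, 1) on sigma > 0 matches HalfCauchy(0, 1) up to a constant log(2).
  log_det = bijector.forward_log_det_jacobian(unconstrained_sigma, event_ndims=0)
  return tfd.Cauchy(loc=0., scale=1.).log_prob(sigma) + log_det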
How about:
import tensorflow_probability as tfp

class HalfCauchy(tfp.distributions.TransformedDistribution):
  def __init__(self, loc, scale, validate_args=False):
    # Push a Cauchy base distribution through AbsoluteValue.
    super(HalfCauchy, self).__init__(
        distribution=tfp.distributions.Cauchy(loc, scale, validate_args=validate_args),
        bijector=tfp.bijectors.AbsoluteValue(validate_args=validate_args))
Wouldn't you need to apply the AbsoluteValue in a standardized space? Something like,
bijector = tfp.bijectors.Chain([
    tfp.bijectors.AffineScalar(shift=loc),    # shift back by loc
    tfp.bijectors.AbsoluteValue(),            # fold about zero
    tfp.bijectors.AffineScalar(shift=-loc)])  # applied first: center at zero
Yes, but the problem is that Chain doesn't know how to work with non-injective bijectors such as AbsoluteValue. It might be possible to add this capability, and if so it would be a very nice improvement.
In my comment, I should have indicated that this isn't HalfCauchy, but it should have similar enough properties, especially if all you need is a prior.
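If an exact half-Cauchy log-density is needed in the meantime, a minimal sketch (assuming only the existing Cauchy distribution; half_cauchy_log_prob is a hypothetical helper name) is to add log(2) to the Cauchy log-density and assign -inf below loc:

import math
import numpy as np
import tensorflow as tf
import tensorflow_probability as tfp

tfd = tfp.distributions

def half_cauchy_log_prob(x, loc, scale):
  # The half-Cauchy density on [loc, inf) is twice the Cauchy density,
  # so its log-density is the Cauchy log-density plus log(2).
  lp = tfd.Cauchy(loc=loc, scale=scale).log_prob(x) + math.log(2.)
  # Support is [loc, inf); everything below gets -inf.
  return tf.where(x >= loc, lp, -np.inf * tf.ones_like(lp))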
FYI, we have a PR for this internally coming soon.
The Half-Cauchy distribution is required for the well-known centered eight schools model (a rough sketch follows below).
pymc4 issue: https://github.com/pymc-devs/pymc4/issues/12
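For context, here is a rough sketch of where a half-Cauchy prior enters the centered eight schools model, written against tfd.HalfCauchy and tfd.JointDistributionSequential from later TFP releases (both assumptions relative to this thread), with the standard Rubin (1981) data:

import numpy as np
import tensorflow_probability as tfp

tfd = tfp.distributions

# Estimated treatment effects and standard errors for the eight schools.
y = np.array([28., 8., -3., 7., -1., 1., 18., 12.], dtype=np.float32)
sigma = np.array([15., 10., 16., 11., 9., 11., 10., 18.], dtype=np.float32)

# Centered parameterization: the group-level scale tau takes the half-Cauchy prior.
eight_schools = tfd.JointDistributionSequential([
    tfd.Normal(loc=0., scale=5.),                       # mu: population mean
    tfd.HalfCauchy(loc=0., scale=5.),                    # tau: population scale
    lambda tau, mu: tfd.Sample(                          # theta: per-school effects
        tfd.Normal(loc=mu, scale=tau), sample_shape=8),
    lambda theta: tfd.Independent(                       # y: observed effects
        tfd.Normal(loc=theta, scale=sigma),
        reinterpreted_batch_ndims=1),
])

# Joint log-density of a prior draw together with the observed data.
mu, tau, theta, _ = eight_schools.sample()
log_joint = eight_schools.log_prob([mu, tau, theta, y])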