Open deepakdalakoti opened 4 years ago
I'm no expert, but if I understand correctly, it looks to me as if you could use tfp.distributions.JointDistributionSequential
for this. It seems to implement the product rule, and given that your statistical model is p(mu, sigma)=p(mu|sigma) x p(sigma)
with the two probabilities given by a normal distribution and a Gamma distribution, this would be exactly what you need.
So all you would need to do is define your model as something like

    def statistical_model(params):
        mu, sigma = tf.split(params, 2, axis=1)
        # List p(sigma) first; the lambda after it receives that sample,
        # so the joint factorizes as p(mu, sigma) = p(mu | sigma) * p(sigma)
        joint = tfd.JointDistributionSequential([
            tfd.Gamma(concentration=a1, rate=b1),           # p(sigma)
            lambda sigma: tfd.Normal(loc=mu, scale=sigma),  # p(mu | sigma)
        ])
        return joint

instead of normal_sp(params).
I haven't really tried this, though, and I'm not sure whether tf.split() is the right idea here (note that the sigma from the split is shadowed by the sampled one).
/edit: I think my brain was some place else when I wrote this. Fixed several errors in what I wrote...
@deepakdalakoti Did you manage to use a hierarchical prior? If yes, I would be interested in how you did this.
Hi,
I am trying to use TensorFlow Probability to learn a Bayesian neural network. I want to learn the responses y_t based on input features x_t, i.e.
y_t = f(x_t) + eps
where f(x_t) is the output of the neural network and eps models aleatoric uncertainty. As a first step, I assume all weights have a Gaussian prior with zero mean and unit variance, while eps is modelled as zero-mean, unit-variance noise. This can be achieved using the following example taken from https://colab.research.google.com/github/tensorchiefs/dl_book/blob/master/chapter_08/nb_ch08_03.ipynb
I can then train this network. I, however, want a gamma prior for the variance of the noise parameter eps, i.e.
eps ~ N(0, sigma), sigma ~ Gamma(a1, b1)
How can I implement this in the TensorFlow Probability framework? I think I need to add another neuron to the last DenseFlipout layer and change the prior and posterior functions to one that samples from the product of a normal and a gamma distribution. However, I am not sure exactly how to implement this.