mpetteno opened 4 months ago
@srvasude it looks like the `power` parameter of the `PowerTransform` bijector is not permitted to be a `Variable`. Do you know why?
Probably a combination of being an ancient bijector and this line: https://github.com/tensorflow/probability/blob/65f265c62bb1e2d15ef3e25104afb245a6d52429/tensorflow_probability/python/bijectors/power_transform.py#L96
I guess from a performance perspective you'd still want to maintain the `power == 0` static branch.
Indeed, the `Exp` bijector is just implemented as `PowerTransform` with `power=0`. We could decouple these; `Exp` would be very simple on its own. Or we could keep the static path for efficiency but also allow a tensor input. I started removing the static path but then realized `Exp` was using it... will take another look later.
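For context, the linked line chooses between two closed-form branches. A pure-Python sketch of both formulas (the function names here are mine, not TFP's API) shows why a single tensor-valued code path is plausible: the general `power > 0` branch converges to the `Exp` branch as `power -> 0`.

```python
import math

def power_transform_forward(x, power):
    """Forward map of a Box-Cox-style power transform.

    power == 0 is the static Exp branch: y = exp(x).
    Otherwise the general formula y = (1 + x * power) ** (1 / power) applies.
    """
    if power == 0:
        return math.exp(x)
    return (1.0 + x * power) ** (1.0 / power)

def power_transform_fldj(x, power):
    """Forward log-determinant of the Jacobian of the map above.

    For power == 0: log|dy/dx| = log(exp(x)) = x.
    Otherwise: dy/dx = (1 + x * power) ** (1/power - 1),
    so log|dy/dx| = (1/power - 1) * log1p(x * power).
    """
    if power == 0:
        return x
    return (1.0 / power - 1.0) * math.log1p(x * power)

# As power -> 0, the general branch approaches the Exp branch:
print(power_transform_forward(1.0, 1e-6))  # close to math.exp(1.0)
print(power_transform_fldj(2.0, 1e-6))     # close to 2.0
```

This also suggests that if the static branch were dropped in favor of a tensor-valued `power`, the `power == 0` case would need numerical care (e.g. `tf.where` guarding against division by zero) rather than a Python-level `if`.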
Hi everyone, I'm trying to find the optimal lambda parameter of the `PowerTransform` bijector with maximum likelihood estimation. To do so, I had to modify the bijector's constructor to allow `power` to be a trainable `tf.Variable`. The code is the following:

The estimation with the MLE from the scipy library gives `lambda = 0.25`, but mine gives `lambda = 0.64`. If I use the bijector with a static value of 0.25, I can recover a distribution that is closer to the original exponential, so I believe there might be a problem with the training procedure or with the computation of the forward Jacobian in the `PowerTransform` bijector, but I can't find it. Can anyone help with this?