MartinusR opened this issue 5 years ago
Hi. I came across the same situation. Look at the code below:
```python
import tensorflow as tf
import tensorflow_probability as tfp

tfk = tf.keras
tfkl = tf.keras.layers
tfpl = tfp.layers

input_dim = 3
output_dim = 1
model = tfk.Sequential([
    tfkl.InputLayer([input_dim]),
    tfkl.Dense(tfpl.IndependentNormal.params_size(output_dim)),
    tfpl.IndependentNormal(output_dim)
])
```
The model's `__call__` method returns a `tfd.Distribution`, in this case a `tfd.Independent`. So in the code below,
```python
X = tf.constant([[3., -4., 1.],
                 [2., 2., 2.],
                 [-5., 3., 1.]])
Y = model(X)
```
`Y` is an instance of `tfd.Independent`, and we expect to get a sample from this distribution with its `sample()` method:
```python
Y.sample()
# <tf.Tensor: id=555, shape=(3, 1), dtype=float32, numpy=
# array([[-9.038388 ],
#        [ 1.0602483],
#        [ 7.4560723]], dtype=float32)>
```
This is the expected behavior. Of course, the `mean()` method also behaves correctly, and with the `numpy()` method we get a `numpy.array`.
However, using `tfp.layers.DenseFlipout` (or, as in the example below, `tfpl.DenseReparameterization`):
```python
input_dim = 3
output_dim = 1
model = tfk.Sequential([
    tfkl.InputLayer([input_dim]),
    tfpl.DenseReparameterization(tfpl.IndependentNormal.params_size(output_dim)),
    tfpl.IndependentNormal(output_dim)
])

X = tf.constant([[3., -4., 1.],
                 [2., 2., 2.],
                 [-5., 3., 1.]])
Y = model(X)
```
`Y` becomes an instance of `tfd.Independent` (this is expected), but the `Y.sample()` method returns a `tf.Tensor` that is NOT executed eagerly. In other words, I think this `tf.Tensor` is a graph tensor, like in a `tf.Graph` with TF 1.X:
```python
Y.sample()
# <tf.Tensor 'Reshape_9:0' shape=(3, 1) dtype=float32>
```
I don't know what causes this. Likewise, the `mean()` method returns a `tf.Tensor` that is NOT executed, so this `tf.Tensor` doesn't have a `numpy()` method (just as `tf.Tensor` did not have a `numpy()` method in TF 1.X).
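One way to tell the two kinds of tensors apart (a diagnostic sketch, not part of the original thread) is to probe for the `numpy` attribute on an eager tensor versus a symbolic tensor traced inside a `tf.function`:

```python
import tensorflow as tf

eager_t = tf.constant([1.0, 2.0])
print(tf.executing_eagerly())       # True by default in TF 2.x
print(hasattr(eager_t, 'numpy'))    # True: eager tensors carry a value

traced = {}

@tf.function
def probe(x):
    # At trace time, x is a symbolic graph tensor, as in TF 1.x,
    # so it has no concrete value and no numpy attribute.
    traced['has_numpy'] = hasattr(x, 'numpy')
    return x

probe(eager_t)
print(traced['has_numpy'])          # False
```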
@hellocybernetics is right: `<some distribution>.mean().numpy()` is only expected to work in eager mode or in TensorFlow 2.0 (which is eager by default). In fact, I'm surprised @MartinusR found the given example worked with the `DenseVariational` layer; perhaps a `tf.enable_v2_behavior()` or `tf.enable_eager_execution()` got lost somewhere?
Anyway, closing this as "working as designed". If you still have an issue, feel free to reopen, or start a new one.
@axch I don't understand: you say it is supposed to work in TF 2.0, but I did use TF 2.0?
@MartinusR You're right, that is weird. This bears further investigation.
As @hellocybernetics described, calling `mean()` on the result of `tfp.layers.DenseFlipout` does not return an executed tensor with a value and a `numpy()` method, as I expected in TF 2.0, but instead returns a non-executed tensor, as in TF 1.X. Maybe this has not been updated to TF 2.0 yet, or it requires enabling something in TF Probability?
It does seem so. I am not sure how that could happen, though, which is what bears investigation.
`DenseFlipout` and `DenseReparameterization` are part of the older layers API, and we haven't spent much effort on getting it to work in TF 2. However, we have started writing a new variational layers API that does work in TF 2, so if possible I'd recommend using it instead. It is not yet feature complete, but it is where our future work will be concentrated. See my comment here for some details: https://github.com/tensorflow/probability/issues/409#issuecomment-492870964
Is this issue still outstanding? Or is `Flipout` now deprecated?
Hi, I am trying to replace the `DenseVariational` layer with `DenseFlipout` in the Probabilistic Layers Regression notebook. However, this causes `model(x_tst).mean().numpy()` to fail with `AttributeError: 'Tensor' object has no attribute 'numpy'`.

- With `DenseVariational` or `keras.layers.Dense`, everything works fine.
- With `DenseReparameterization`, `DenseLocalReparameterization` and `DenseFlipout`, I get the `AttributeError`.

Am I missing something?
I am using the nightly versions:
Here is a minimal example (to run in the Colab):