rickhg12hs opened this issue 6 years ago
I ran into a similar issue, and traced it down to the following. The tutorial gives this line of code:
julia> correct_prediction = indmax(y,2) .== indmax(y_, 2)
false
But as you can see, this doesn't return a Tensor but the constant value false. The intended line seems to have been:
julia> correct_prediction = equal(indmax(y,2), indmax(y_, 2))
<Tensor Equal:1 shape=unknown dtype=Bool>
and indeed, this allows me to reproduce the stated accuracy for the softmax model.
Recent changes to . (the dot-broadcast syntax) and broadcast may be to blame, but I haven't dug into that (yet).
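For reference, here is roughly the downstream accuracy computation I used to check this; a minimal sketch, assuming cast and reduce_mean in TensorFlow.jl mirror their Python counterparts as in the rest of the tutorial:

correct_prediction = equal(indmax(y, 2), indmax(y_, 2))    # Tensor of Bools, one entry per example
accuracy = reduce_mean(cast(correct_prediction, Float32))  # fraction of correct predictions over the batch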
Thoughts?
Ah ya, that's possible. A PR fixing the documentation would be welcome.
Or is the true intention to overload ==? ... Or something like:
julia> import Base.==
julia> ==(x::T, y::T) where T<:TensorFlow.Tensor = equal(x, y)
== (generic function with 275 methods)
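With that overload in place, the tutorial line could be written without broadcasting at all; a rough sketch, assuming both indmax calls return Tensors of the same concrete type so the method above is the one that dispatches:

correct_prediction = indmax(y, 2) == indmax(y_, 2)   # hypothetical usage: builds an Equal op and returns a Tensor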
Same issue reported at #364.
Both the MNIST softmax regression model and the multi-layer convolutional network fail with accuracies of 0.0. Interestingly, both the Basic usage and Logistic regression examples seem to work fine.
In addition, with my libtensorflow.so, both mnist_softmax.py and mnist_deep.py from TensorFlow's Python examples work fine with Python 2.7 and Python 3.6.