tensorlayer / TensorLayer

Deep Learning and Reinforcement Learning Library for Scientists and Engineers
http://tensorlayerx.com

Some errors about cross_entropy #103

Closed Aceb1shmael closed 7 years ago

Aceb1shmael commented 7 years ago

Hi, I want to try the network from tutorial_mnist_simple.py on different data, and some errors appear when it reaches cost = tl.cost.cross_entropy(y, y_, name='xentropy'):

Traceback (most recent call last):
  File "D:\Applications\Anaconda3\lib\site-packages\tensorlayer\cost.py", line 36, in cross_entropy
    return tf.reduce_mean(tf.nn.sparse_softmax_cross_entropy_with_logits(logits=output, targets=target))
TypeError: sparse_softmax_cross_entropy_with_logits() got an unexpected keyword argument 'targets'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "monster.py", line 45, in <module>
    cost = tl.cost.cross_entropy(y, y_, name='xentropy')
  File "D:\Applications\Anaconda3\lib\site-packages\tensorlayer\cost.py", line 39, in cross_entropy
    return tf.reduce_mean(tf.nn.sparse_softmax_cross_entropy_with_logits(labels=target, logits=output, name=name))
  File "D:\Applications\Anaconda3\lib\site-packages\tensorflow\python\ops\nn_ops.py", line 1709, in sparse_softmax_cross_entropy_with_logits
    (labels_static_shape.ndims, logits.get_shape().ndims))
ValueError: Rank mismatch: Rank of labels (received 2) should equal rank of logits minus 1 (received 2).

I'm not sure why this happens. Thanks for any help.

zsdonghao commented 7 years ago

Hi, which TF version are you using? It should work for both TF 1.0 and other versions. How did you install TL? Could you try reinstalling TL?

Aceb1shmael commented 7 years ago

I'm using TF 1.0 and TL 1.3.10, and I used pip to install both packages. I think both packages work fine, because tutorial_mnist_simple.py works. The problem might be that my data doesn't conform to the expected format, but I don't know why it stops at this line or how to fix it.

zsdonghao commented 7 years ago

@Aceb1shmael I see. If tutorial_mnist_simple.py works well, it is probably a problem with your network output. Can you check the shape of the outputs with print(network.outputs)?

As tl.cost.cross_entropy uses tf.nn.sparse_softmax_cross_entropy_with_logits, you should make sure the output and target shapes match its requirements.
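For reference, tf.nn.sparse_softmax_cross_entropy_with_logits expects integer class labels whose rank is one less than the logits: logits of shape (N, C) pair with labels of shape (N,). A minimal NumPy sketch of the same shape contract (the helper name sparse_softmax_xent is illustrative, not a TensorFlow API):

```python
import numpy as np

def sparse_softmax_xent(logits, labels):
    """Per-example cross entropy; logits (N, C) floats, labels (N,) int class ids."""
    # numerically stable log-softmax
    shifted = logits - logits.max(axis=1, keepdims=True)
    log_probs = shifted - np.log(np.exp(shifted).sum(axis=1, keepdims=True))
    # pick the log-probability of the true class for each row
    return -log_probs[np.arange(len(labels)), labels]

logits = np.array([[2.0, 0.5, 0.1],
                   [0.2, 3.0, 0.4]])   # shape (2, 3): rank 2
labels = np.array([0, 1])              # shape (2,): rank 1, NOT one-hot (2, 3)
loss = sparse_softmax_xent(logits, labels).mean()
```

Passing rank-2 one-hot labels here is exactly what triggers the "Rank of labels (received 2) should equal rank of logits minus 1" error in the traceback above.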

Aceb1shmael commented 7 years ago

@zsdonghao
For the output, print(network.outputs) gives Tensor("output_layer/Identity:0", shape=(?, 3), dtype=float32), and for the labels, print(y_train.shape, y_val.shape) gives (278, 3) (93, 3). Is this correct?

boscotsang commented 7 years ago

@Aceb1shmael The labels y_train and y_val should have shapes (278,) and (93,), i.e. integer class indices rather than one-hot vectors.
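If the labels are currently one-hot encoded with shape (278, 3), converting them to integer class indices with np.argmax gives the (278,) shape the loss expects. A quick sketch (the one-hot array here is synthetic, standing in for the poster's data):

```python
import numpy as np

# synthetic one-hot labels, shape (278, 3), standing in for the real y_train
rng = np.random.default_rng(0)
y_train_onehot = np.eye(3)[rng.integers(0, 3, size=278)]

# convert to integer class indices, shape (278,)
y_train = np.argmax(y_train_onehot, axis=1).astype(np.int64)

print(y_train.shape)  # (278,)
```

The same np.argmax call applied to y_val would yield the (93,) shape.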

VisanXJ commented 7 years ago

Have you worked it out? @Aceb1shmael