amritasaha1812 / CSQA_Code

`decoder_loss` runtime error #10

Closed: hugochan closed this issue 6 years ago

hugochan commented 6 years ago

Hi,

I got a runtime error when training the model. The error was caused by this statement: https://github.com/amritasaha1812/CSQA_Code/blob/0b297bd78937de11747399144b750e8284c6f3cd/hierarchy_model.py#L267

It turns out that the variable `logits` is a list of tensors rather than a single tensor; as a result, applying `tf.nn.softmax` to the list raises an error.
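
For reference, here is a minimal sketch (hypothetical shapes, not the repo's actual graph) of the structure that seems to trigger it: the seq2seq-style decoder returns one `[batch, vocab]` logits tensor per decoding step, so `logits` is a Python list.

```python
import tensorflow as tf

batch_size, vocab_size, num_steps = 2, 5, 3

# Mimic the decoder output: a Python list with one [batch, vocab] logits
# tensor per decoding step, not a single stacked tensor.
logits = [tf.zeros([batch_size, vocab_size]) for _ in range(num_steps)]

# The linked line in hierarchy_model.py effectively does:
# prob = tf.nn.softmax(logits)  # raises here, since `logits` is a list of tensors
```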

Could you please tell me how to fix this? Thank you so much!

vardaan123 commented 6 years ago

This code is written for TF 0.10.0. Please use that version to run it.

hugochan commented 6 years ago

Hi @vardaan123, I was actually using TF 0.10.0 to run this. I'm not sure any version of TF supports applying softmax to a list of tensors.

hugochan commented 6 years ago

Well, after digging into the code a bit, I think the fix is to apply the softmax op to each element of the list, i.e. `prob = [tf.nn.softmax(x) for x in logits]`. That solved my problem.
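
For anyone hitting the same error, here is a self-contained sketch of that fix (hypothetical shapes; the variable names follow this thread, not necessarily the exact ones in `hierarchy_model.py`):

```python
import tensorflow as tf

batch_size, vocab_size, num_steps = 2, 5, 3
logits = [tf.zeros([batch_size, vocab_size]) for _ in range(num_steps)]

# Apply softmax to each per-step logits tensor individually; `prob` is then
# a list of [batch, vocab] probability tensors, one per decoding step.
prob = [tf.nn.softmax(step_logits) for step_logits in logits]
```

This keeps the per-step list structure that the downstream loss computation presumably expects, rather than stacking everything into a single tensor first.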