Conchylicultor / DeepQA

My TensorFlow implementation of "A neural conversational model", a deep-learning-based chatbot
Apache License 2.0

parameter of tf.contrib.legacy_seq2seq.sequence_loss() #143

Open iamabug opened 7 years ago

iamabug commented 7 years ago

I am reading model.py and the following code confuses me (lines 197-203):

```python
self.lossFct = tf.contrib.legacy_seq2seq.sequence_loss(
    decoderOutputs,
    self.decoderTargets,
    self.decoderWeights,
    self.textData.getVocabularySize(),
    softmax_loss_function=sampledSoftmax if outputProjection else None
)
```

And the definition of the function is:

```python
def sequence_loss(logits, targets, weights,
                  average_across_timesteps=True,
                  average_across_batch=True,
                  softmax_loss_function=None,
                  name=None)
```

As I read it, `self.textData.getVocabularySize()` (a scalar) would be bound to `average_across_timesteps`, since it is passed positionally as the fourth argument. Is this okay, or do I misunderstand something?
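To make the concern concrete, here is a minimal stand-in (same parameter list as the definition above, with a dummy body I made up purely to show the binding) demonstrating where a positional fourth argument lands:

```python
# Hypothetical stand-in for sequence_loss, used only to illustrate
# Python's positional-argument binding; the body is not the real loss.
def sequence_loss(logits, targets, weights,
                  average_across_timesteps=True,
                  average_across_batch=True,
                  softmax_loss_function=None,
                  name=None):
    # Return the parameter in question so we can see what it received.
    return average_across_timesteps

# A vocabulary size (e.g. 20000) passed as the fourth positional
# argument is bound to average_across_timesteps, not to any
# vocabulary-size parameter.
print(sequence_loss("logits", "targets", "weights", 20000))  # -> 20000
```

So unless an older TensorFlow version accepted a vocabulary-size argument in that position, the scalar would just be treated as a truthy `average_across_timesteps` flag.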