DongjunLee / conversation-tensorflow

TensorFlow implementation of Conversation Models

Bug fix : Cloning encoder state to decoder #13

Closed yanghoonkim closed 6 years ago

yanghoonkim commented 6 years ago
DongjunLee commented 6 years ago

Thanks for your pull request. Now I know there is a bug with cloning the state. :)

I have a question: why did you also edit this code?

self.num_layers = int(self.num_layers / 2)
if self.num_layers == 0:
    self.num_layers = 1
yanghoonkim commented 6 years ago

Because the encoder and decoder states must have the same shape; otherwise an error is raised. self.decoder_initial_state.clone(...) by itself won't raise an error in your case, but if you assign the result back, as I wrote it: self.decoder_initial_state = self.decoder_initial_state.clone(...), it will. In the bidirectional encoder case, if num_layers == 2, the model sets the encoder's num_layers to 2 / 2 == 1, while the decoder keeps its original layer count. That is why I edited that code.
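
For anyone hitting the same error, here is a minimal sketch of the pattern being discussed (TensorFlow 1.x with tf.contrib.seq2seq; this is not the repository's exact code, and the sizes and zero-tensor encoder state are placeholders): the encoder's final state is cloned into the AttentionWrapper's zero state, which only works when the encoder state tuple and the decoder cell have the same number of layers.

import tensorflow as tf  # TensorFlow 1.x (tf.contrib.seq2seq)

batch_size, num_units, num_layers = 32, 128, 2

# Stand-ins for the encoder outputs and final state; only the shapes matter here.
encoder_outputs = tf.zeros([batch_size, 10, num_units])
encoder_final_state = tuple(
    tf.zeros([batch_size, num_units]) for _ in range(num_layers))

# The decoder cell must have the same number of layers as the encoder state tuple,
# otherwise cloning the encoder state into it fails with a shape/structure error.
decoder_cell = tf.nn.rnn_cell.MultiRNNCell(
    [tf.nn.rnn_cell.GRUCell(num_units) for _ in range(num_layers)])

attention = tf.contrib.seq2seq.BahdanauAttention(num_units, memory=encoder_outputs)
attn_cell = tf.contrib.seq2seq.AttentionWrapper(decoder_cell, attention)

# The pattern discussed in this PR: build the wrapper's zero state, clone the
# encoder's final state into it, and use the result as the decoder initial state.
decoder_initial_state = attn_cell.zero_state(batch_size, tf.float32).clone(
    cell_state=encoder_final_state)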

DongjunLee commented 6 years ago

Oh, I get it. :) Thank you very much.