JustinLin610 opened this issue 6 years ago
@JustinLin610 Hello, did you solve this issue? I'm having the same problem.
@nave01314 Make sure your decoder_cell's number of layers is the same as the number of states you pass into it. It seems your example has two encoder layers (1 forward and 1 backward); however, your decoder only has 1 layer. A minimal sketch of what that means in code is below.
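A rough sketch of the idea, assuming TF 1.x and made-up sizes (`num_units`, the placeholder shapes, and variable names are all hypothetical, not from the original poster's code):

```python
import tensorflow as tf

# Hypothetical sizes -- adjust to your model.
num_units = 128
batch_size = 32
max_time = 20
embed_dim = 64

encoder_inputs = tf.placeholder(tf.float32, [batch_size, max_time, embed_dim])

# Bidirectional encoder: one forward and one backward cell.
fw_cell = tf.nn.rnn_cell.LSTMCell(num_units)
bw_cell = tf.nn.rnn_cell.LSTMCell(num_units)
bi_outputs, (fw_state, bw_state) = tf.nn.bidirectional_dynamic_rnn(
    fw_cell, bw_cell, encoder_inputs, dtype=tf.float32)

# The bidirectional encoder yields two LSTMStateTuples (forward and backward),
# so the decoder needs two layers to accept them as its initial state.
decoder_cell = tf.nn.rnn_cell.MultiRNNCell(
    [tf.nn.rnn_cell.LSTMCell(num_units) for _ in range(2)])
decoder_initial_state = (fw_state, bw_state)  # one state per decoder layer
```

With a single-layer decoder, the two encoder states cannot be matched to the decoder's state structure, which is what triggers the shape/structure mismatch error.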
@oahziur I notice that the encoder_state is directly copied to decoder_initial_state. Does it make sense that the states from the bi-directional LSTM, which yields two of them (forward and backward), can be used as the initial state for two layers of a uni-directional LSTM?
@liuyujia1991 It should be possible, although I haven't tested it myself.
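If you would rather not spread the forward and backward states across two decoder layers, one common alternative (untested here, shown only as a sketch continuing the snippet above) is to concatenate them and use a single decoder layer of twice the size:

```python
import tensorflow as tf

num_units = 128  # must match the encoder cells above

# fw_state / bw_state are the LSTMStateTuples returned by
# tf.nn.bidirectional_dynamic_rnn in the earlier sketch.
decoder_initial_state = tf.nn.rnn_cell.LSTMStateTuple(
    c=tf.concat([fw_state.c, bw_state.c], axis=-1),
    h=tf.concat([fw_state.h, bw_state.h], axis=-1))

# Single-layer decoder sized to accept the concatenated state.
decoder_cell = tf.nn.rnn_cell.LSTMCell(2 * num_units)
```

Either way, the structure of decoder_initial_state has to line up exactly with what decoder_cell expects.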
I ran into this error when building the graph. Below is my code for the encoding layer, copied from Stack Overflow, but it does not work for me...