farizrahman4u / seq2seq

Sequence to Sequence Learning with Keras
GNU General Public License v2.0
3.17k stars 845 forks

Add embedding layer to Seq2Seq model #206

Open azinnai opened 7 years ago

azinnai commented 7 years ago

Hi, I'm trying to add an embedding layer at the beginning of the encoder, but I can't figure out an effective way to do this.

Does someone have an idea?

Thanks

fredtcaroli commented 6 years ago

Define the embedding layer yourself, then call the model on its output:

inp = Input(...)                 # integer token ids, shape (timesteps,)
emb = Embedding(...)(inp)        # maps each id to a dense vector
output = Seq2Seq(...)(emb)       # seq2seq model consumes the dense sequence
model = Model(inp, output)
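For intuition about why this works, here is a minimal stdlib-only sketch of what an `Embedding` layer computes: a trainable lookup table that turns integer ids of shape `(batch, timesteps)` into dense vectors of shape `(batch, timesteps, embed_dim)`. The table values below are illustrative placeholders, not trained weights.

```python
# Illustrative embedding table: vocab_size = 4 rows, embed_dim = 3 columns.
# In Keras these entries would be trainable weights.
table = [
    [0.0, 0.1, 0.2],  # id 0
    [1.0, 1.1, 1.2],  # id 1
    [2.0, 2.1, 2.2],  # id 2
    [3.0, 3.1, 3.2],  # id 3
]

def embed(batch):
    """Map (batch, timesteps) token ids to (batch, timesteps, embed_dim)."""
    return [[table[token] for token in seq] for seq in batch]

x = [[0, 2, 1]]  # one sequence of three token ids
print(embed(x))  # [[[0.0, 0.1, 0.2], [2.0, 2.1, 2.2], [1.0, 1.1, 1.2]]]
```

The `Seq2Seq` model then operates on these dense vectors instead of raw ids, which is why the `Embedding` layer must sit between the `Input` and the model.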
zhongluwang commented 6 years ago

@fredtcaroli If input_Y has shape (batch_size, timesteps) and the output has shape (batch_size, timesteps, output_dim), how do you calculate the loss?
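One common way to handle that shape mismatch (assuming the output_dim axis is a softmax distribution over the vocabulary, which is what Keras's `sparse_categorical_crossentropy` loss expects) is to index each timestep's predicted distribution by the integer target and average the negative log-probabilities. A stdlib-only sketch of that computation:

```python
import math

def sparse_categorical_crossentropy(y_true, y_pred):
    """y_true: (batch, timesteps) integer target ids.
    y_pred: (batch, timesteps, output_dim) softmax probabilities.
    Returns the mean negative log-likelihood over all timesteps."""
    total, count = 0.0, 0
    for true_seq, pred_seq in zip(y_true, y_pred):
        for target, dist in zip(true_seq, pred_seq):
            total += -math.log(dist[target])  # pick the prob of the true id
            count += 1
    return total / count

# One sequence, two timesteps, vocabulary of 3 symbols.
y_true = [[0, 2]]
y_pred = [[[0.7, 0.2, 0.1],
           [0.1, 0.1, 0.8]]]
loss = sparse_categorical_crossentropy(y_true, y_pred)
# loss = -(log 0.7 + log 0.8) / 2
```

With this loss the integer targets never need to be one-hot encoded to `(batch_size, timesteps, output_dim)`; Keras does the per-timestep indexing internally when you compile with `loss='sparse_categorical_crossentropy'`.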