koustuvsinha / hred-py

Pytorch implementation of Hierarchical Encoder Decoder Sequence to Sequence Model

Batch Feed ? #3

Closed gmftbyGMFTBY closed 5 years ago

gmftbyGMFTBY commented 5 years ago

Hi, thanks for your contribution and your wonderful code for HRED. While reading your model file, though, some questions came up.

  1. The training function seems to be an unbatched version, meaning the model is fed a single dialogue instance per forward pass. I am not sure about this, though. Am I right?
  2. If training is indeed unbatched, would it be possible to train the model by feeding it a list of sentences, turning it into a batched version of HRED? (See the padding sketch after this list.)
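
For illustration, one way to pack a list of dialogues into a single batch tensor is to pad along both the utterance-count and token dimensions. This is only a sketch: `pad_dialogues`, `PAD_ID`, and the example lengths are assumptions, not code from this repo.

```python
import torch

PAD_ID = 0  # assumed padding index; swap in the vocabulary's real pad id

def pad_dialogues(dialogues, max_context_len, max_seq_len):
    """Pack a list of dialogues (each a list of token-id lists) into one
    LongTensor of shape [batch, max_context_len, max_seq_len]."""
    batch = torch.full((len(dialogues), max_context_len, max_seq_len),
                       PAD_ID, dtype=torch.long)
    for i, dialogue in enumerate(dialogues):
        for j, utterance in enumerate(dialogue[:max_context_len]):
            tokens = utterance[:max_seq_len]
            batch[i, j, :len(tokens)] = torch.tensor(tokens, dtype=torch.long)
    return batch

# Example: two dialogues, padded to 3 utterances of 4 tokens each.
dialogues = [[[5, 6, 7], [8, 9]], [[4, 2]]]
x = pad_dialogues(dialogues, max_context_len=3, max_seq_len=4)
print(x.shape)  # torch.Size([2, 3, 4])
```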

I really need your help, thank you very much!

gmftbyGMFTBY commented 5 years ago

Oh, I just found a solution. Refer to the Code-Mixed-Dialg repo: that reproduction appears to be a batched version of HRED, where the input shape for the model is [batch, max_context_len, max_seq_len] and the output shape is [batch, max_seq_len]. The performance of that model is good.
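
For reference, a minimal batched HRED skeleton that consumes those shapes could look like the sketch below. It is an illustrative assumption, not the code from hred-py or Code-Mixed-Dialg; the class name, layer sizes, and single-layer GRUs are all placeholders.

```python
import torch
import torch.nn as nn

class BatchedHRED(nn.Module):
    """Illustrative batched HRED skeleton (an assumption, not either repo's code).
    src:    [batch, max_context_len, max_seq_len]  context utterances
    tgt:    [batch, max_seq_len]                   response, teacher-forced
    return: [batch, max_seq_len, vocab_size]       decoder logits"""

    def __init__(self, vocab_size, emb_dim=300, hidden_dim=512, pad_id=0):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim, padding_idx=pad_id)
        self.utt_encoder = nn.GRU(emb_dim, hidden_dim, batch_first=True)
        self.ctx_encoder = nn.GRU(hidden_dim, hidden_dim, batch_first=True)
        self.decoder = nn.GRU(emb_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, vocab_size)

    def forward(self, src, tgt):
        b, c, t = src.size()
        # Utterance encoder: fold the context dimension into the batch.
        utt_emb = self.embed(src.reshape(b * c, t))        # [b*c, t, emb]
        _, utt_h = self.utt_encoder(utt_emb)               # [1, b*c, hid]
        utt_h = utt_h[-1].reshape(b, c, -1)                # [b, c, hid]
        # Context encoder: run over the sequence of utterance summaries.
        _, ctx_h = self.ctx_encoder(utt_h)                 # [1, b, hid]
        # Decoder: generate the response conditioned on the context state.
        dec_out, _ = self.decoder(self.embed(tgt), ctx_h)  # [b, t, hid]
        return self.out(dec_out)                           # [b, t, vocab]

# Example forward pass with the shapes mentioned in the comment above.
model = BatchedHRED(vocab_size=10000)
src = torch.randint(1, 10000, (2, 3, 12))   # [batch=2, max_context_len=3, max_seq_len=12]
tgt = torch.randint(1, 10000, (2, 12))      # [batch=2, max_seq_len=12]
print(model(src, tgt).shape)                # torch.Size([2, 12, 10000])
```

The key design point is that the utterance-level encoder folds the context dimension into the batch dimension, so all utterances in all dialogues are encoded in one pass before the context-level GRU runs over the per-utterance summaries.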