yufengm opened 6 years ago
Are we supposing the same length for each sentence in the position encoding?
https://github.com/dandelin/Dynamic-memory-networks-plus-Pytorch/blob/ad49955f907c03aade2f6c8ed13370ce7288d5a7/babi_main.py#L18
As shown above, every sentence encoding is divided by the same number, elen - 1.
I suppose you could pad the variable-length sentences to a fixed length in a preprocessing step before feeding them into the model.
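For context, here is a minimal plain-Python sketch (function names are mine, not the repo's) of this position-encoding scheme, which comes from Sukhbaatar et al.'s End-to-End Memory Networks and is reused in DMN+. The elen - 1 and edim - 1 divisors mirror the linked babi_main.py. Assuming pad tokens embed to all-zero vectors, padded positions contribute nothing to the weighted sum, so using one fixed elen for every sentence is harmless:

```python
def position_encoding(elen, edim):
    # Weights l[j][d] = (1 - j/(elen-1)) - (d/(edim-1)) * (1 - 2j/(elen-1)),
    # matching the elen-1 / edim-1 divisors in the linked code.
    # elen is the (padded) sentence length; assumes elen, edim >= 2.
    return [[(1 - j / (elen - 1)) - (d / (edim - 1)) * (1 - 2 * j / (elen - 1))
             for d in range(edim)]
            for j in range(elen)]

def encode_sentence(word_vectors, edim):
    # Sentence vector f[d] = sum_j l[j][d] * e[j][d]. Pad positions are
    # all-zero embeddings, so they add nothing whatever their weight is.
    elen = len(word_vectors)
    l = position_encoding(elen, edim)
    return [sum(l[j][d] * word_vectors[j][d] for j in range(elen))
            for d in range(edim)]

# Two sentences zero-padded to the same length elen = 4, edim = 2;
# the second has one real word followed by three pad tokens.
pad = [0.0, 0.0]
s1 = [[1.0, 1.0], [1.0, 1.0], [1.0, 1.0], [1.0, 1.0]]
s2 = [[1.0, 1.0], pad, pad, pad]
f1 = encode_sentence(s1, 2)
f2 = encode_sentence(s2, 2)
```

The zero pads mean s2's encoding depends only on its single real word, even though it was encoded with the shared elen = 4 weights.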