torch / cunn


Try to apply softmax to a batch of data with variable length #489

Open hfxunlp opened 6 years ago

hfxunlp commented 6 years ago

Hi, I want to make SoftMax support variable-length input, so that a batch of sequences with different lengths can be fed to this module. This is helpful for Natural Language Processing, especially for the attention model in seq2seq and the Attention-over-Attention model for reading comprehension. This pull request corresponds to https://github.com/torch/nn/pull/1297.
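
The general idea behind a variable-length (masked) softmax is to exclude padded positions from the normalization so they receive (almost) zero probability. The following is only a minimal Lua/Torch sketch of that technique, not the API introduced by this pull request; the function name `maskedSoftmax`, its arguments, and the padding scheme (a 2D `batchSize x maxLen` score tensor plus a 1D tensor of valid lengths) are all assumptions made for illustration.

```lua
require 'torch'
require 'nn'

-- Masked softmax over a padded batch: positions beyond each row's valid
-- length are pushed to a very negative score so they get ~0 probability.
-- input:   batchSize x maxLen tensor of scores (hypothetical layout)
-- lengths: 1D tensor of valid lengths per row (hypothetical argument)
local function maskedSoftmax(input, lengths)
   local batchSize, maxLen = input:size(1), input:size(2)
   local mask = torch.zeros(batchSize, maxLen)
   for i = 1, batchSize do
      mask[{i, {1, lengths[i]}}] = 1   -- 1 marks valid positions
   end
   local masked = input:clone()
   -- large negative value rather than -inf, to stay numerically safe
   masked[mask:eq(0)] = -1e9
   -- nn.SoftMax normalizes over the second dimension of a 2D input,
   -- so the padded positions end up with (almost) zero probability
   return nn.SoftMax():forward(masked)
end

-- usage: rows are padded to the same width, lengths give the true widths
local scores  = torch.randn(2, 4)
local lengths = torch.Tensor{4, 2}
print(maskedSoftmax(scores, lengths))
```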

hfxunlp commented 6 years ago

Sorry, there is still a bug that I have to fix.

hfxunlp commented 6 years ago

I fixed the bug in a rather awkward way in the second commit. It would be wonderful if someone could explain why it happens and show a cleaner solution.