odashi / chainer_examples

Example usages of Chainer for natural language processing.

Attention Mechanism is Very Slow #2

Open prajdabre opened 8 years ago

prajdabre commented 8 years ago

Hi, your attention mechanism is quite slow. Because you recompute the linear projections (`aw` and `bw`) at every decoder step even though they do not change, the runtime is almost quadratic in the sequence length.

I have implemented a faster version of attention that precomputes these projections, and I would like to push it as soon as I am done testing.
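For illustration, here is a minimal sketch of the idea in NumPy. It assumes additive attention where the encoder-side projection does not depend on the decoder step; all names and shapes are hypothetical, not the repo's actual `aw`/`bw` code.

```python
import numpy as np

rng = np.random.default_rng(0)
seq_len, hidden = 50, 256
enc_states = rng.standard_normal((seq_len, hidden))  # encoder outputs
W_enc = rng.standard_normal((hidden, hidden))        # encoder-side projection
W_dec = rng.standard_normal((hidden, hidden))        # decoder-side projection
v = rng.standard_normal(hidden)                      # scoring vector

def attend_slow(dec_state):
    # Recomputes enc_states @ W_enc at every decoder step, so the
    # total work over an output sequence grows roughly quadratically.
    scores = np.tanh(enc_states @ W_enc + dec_state @ W_dec) @ v
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    return weights @ enc_states

# Precompute the encoder-side projection once, outside the decoding loop.
enc_proj = enc_states @ W_enc

def attend_fast(dec_state):
    # Only the decoder-side projection remains per step.
    scores = np.tanh(enc_proj + dec_state @ W_dec) @ v
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    return weights @ enc_states

dec_state = rng.standard_normal(hidden)
assert np.allclose(attend_slow(dec_state), attend_fast(dec_state))
```

Both versions return identical context vectors; the fast one just hoists the constant projection out of the per-step loop.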

Regards.

odashi commented 8 years ago

Thank you, I had overlooked this issue. I think it is no problem to fix that calculation.

Regards,

prajdabre commented 8 years ago

Hi, I have implemented the fast version of attention and have tested it. I will push it tomorrow or so.