Alexander-H-Liu / End-to-end-ASR-Pytorch

This is an open-source project (formerly named Listen, Attend and Spell - PyTorch Implementation) for end-to-end ASR, implemented with PyTorch, the well-known deep learning toolkit.
MIT License

Multihead attention #44

Closed d223302 closed 4 years ago

d223302 commented 5 years ago

I think there is an AttributeError in src/asr.py line 437: self.preprocess_mlp_dim. Should it be self.num_head? Thanks for your great and clear implementation.

Alexander-H-Liu commented 5 years ago

Hi @d223302 ,

You're right, the multi-head attention implementation wasn't correct in the last version. We've updated our code recently, and I believe it's correct now. If you're interested, please have a look at lines 138-235 in src/module.py
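For readers landing here, the following is a minimal NumPy sketch of the standard multi-head attention computation (split the feature dimension into heads, score each head with scaled dot-product, then concatenate). It is an illustrative assumption about the general technique, not the actual code in src/module.py; the function and variable names are hypothetical:

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_attention(q, k, v, num_head):
    """Scaled dot-product attention with num_head heads.

    q: (T_q, d) queries; k, v: (T_k, d) keys/values.
    d must be divisible by num_head.
    Returns (output of shape (T_q, d), weights of shape (num_head, T_q, T_k)).
    """
    t_q, d = q.shape
    t_k = k.shape[0]
    d_h = d // num_head

    # Split the feature dim into heads: (num_head, T, d_h).
    qh = q.reshape(t_q, num_head, d_h).transpose(1, 0, 2)
    kh = k.reshape(t_k, num_head, d_h).transpose(1, 0, 2)
    vh = v.reshape(t_k, num_head, d_h).transpose(1, 0, 2)

    # Per-head scaled dot-product scores: (num_head, T_q, T_k).
    scores = qh @ kh.transpose(0, 2, 1) / np.sqrt(d_h)
    attn = softmax(scores, axis=-1)

    # Weighted sum of values, then merge heads back to (T_q, d).
    out = (attn @ vh).transpose(1, 0, 2).reshape(t_q, d)
    return out, attn
```

The original bug report suggests the head count should come from a `num_head` attribute rather than an unrelated MLP-dimension attribute; in a sketch like the one above, `num_head` is exactly that per-layer hyperparameter.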

Thanks a lot for appreciating our work!