lsdefine / attention-is-all-you-need-keras

The Transformer model from "Attention Is All You Need": a Keras implementation.

A Keras+TensorFlow implementation of the Transformer: "Attention Is All You Need" (Ashish Vaswani, Noam Shazeer, Niki Parmar, Jakob Uszkoreit, Llion Jones, Aidan N. Gomez, Lukasz Kaiser, Illia Polosukhin, arXiv, 2017).

Usage

Please refer to en2de_main.py and pinyin_main.py for usage examples.
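
The core building block of the Transformer is scaled dot-product attention. For illustration only, here is a minimal, self-contained sketch of that computation in Keras/TensorFlow; the layer name, shapes, and mask convention are assumptions made for this example and are not taken from this repository's code (see the scripts above for the actual implementation).

```python
# Illustrative sketch only: scaled dot-product attention in Keras/TensorFlow.
# Names, shapes, and the mask convention are assumptions for demonstration,
# not this repository's API.
import tensorflow as tf
from tensorflow.keras import layers


class ScaledDotProductAttention(layers.Layer):
    """Computes softmax(Q K^T / sqrt(d_k)) V as in Vaswani et al. (2017)."""

    def call(self, query, key, value, mask=None):
        d_k = tf.cast(tf.shape(key)[-1], tf.float32)
        # Attention scores, shape (batch, len_q, len_k).
        scores = tf.matmul(query, key, transpose_b=True) / tf.sqrt(d_k)
        if mask is not None:
            # Positions with mask == 0 get a large negative score
            # so they vanish after the softmax.
            scores += (1.0 - tf.cast(mask, tf.float32)) * -1e9
        weights = tf.nn.softmax(scores, axis=-1)
        return tf.matmul(weights, value)


if __name__ == "__main__":
    # Toy batch: 2 sequences of length 5 with model dimension 8.
    q = tf.random.normal((2, 5, 8))
    attn = ScaledDotProductAttention()
    out = attn(q, q, q)   # self-attention: Q = K = V
    print(out.shape)      # (2, 5, 8)
```

In the full model, several such attention computations run in parallel as multi-head attention and are combined with position-wise feed-forward layers, residual connections, and layer normalization, as described in the paper.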

en2de_main.py

Upgrades

Acknowledgement