wszlong / sb-nmt

Code for Synchronous Bidirectional Neural Machine Translation (SB-NMT)

Synchronous Bidirectional Neural Machine Translation

This is the official codebase for the following paper, implemented in TensorFlow:

Long Zhou, Jiajun Zhang, Chengqing Zong. Synchronous Bidirectional Neural Machine Translation. Transactions of the Association for Computational Linguistics (TACL), 2019. [PDF]

Requirements

  1. Python 2.7
  2. tensorflow-gpu >= 1.4
  3. CUDA >= 8.0

Usage

  1. Preprocessing: construct the pseudo training data using a Transformer model, as introduced in the paper, then run ./datagen.sh.
  2. Training: run ./train.sh.
  3. Inference: run ./test.sh.
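The three steps above can be chained into a single driver script. A minimal sketch is below; the script names ./datagen.sh, ./train.sh, and ./test.sh come from this README, and the existence guard (an assumption, not part of the original scripts) simply skips any script that is not present, so the sketch fails gracefully outside the repo root:

```shell
#!/bin/sh
# Run the SB-NMT pipeline in order: preprocessing, training, inference.
# Skips any stage whose script is missing or not executable.
for step in datagen train test; do
  if [ -x "./${step}.sh" ]; then
    "./${step}.sh"
  else
    echo "skip ${step}.sh (not found)"
  fi
done
```

Run it from the repository root after constructing the pseudo training data, so that all three scripts are found and executed in order.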

Citation

If you find this code useful in your research, please cite:

@article{Zhou:2019:TACL,
  author    = {Zhou, Long and Zhang, Jiajun and Zong, Chengqing},
  title     = {Synchronous Bidirectional Neural Machine Translation},
  journal   = {Transactions of the Association for Computational Linguistics},
  year      = {2019},
}

Contact

If you have questions, suggestions, or bug reports, please email wszlong@gmail.com or long.zhou@nlpr.ia.ac.cn.