thu-coai / DA-Transformer

Official Implementation for the ICML2022 paper "Directed Acyclic Transformer for Non-Autoregressive Machine Translation"

Implement Viterbi decoding #5

Closed shaochenze closed 1 year ago

shaochenze commented 1 year ago

Great research, and thanks for open-sourcing the code! This pull request implements a Viterbi decoding algorithm that finds the output maximizing P(A, Y|X) / |Y|^beta. Viterbi decoding is slightly slower than Lookahead, and its translation quality falls between that of Lookahead and beam search.
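For illustration, below is a minimal, self-contained sketch of length-normalized Viterbi decoding over a DA-Transformer graph. The function name viterbi_decode, the tensor layout, and the unbatched loop are assumptions for exposition, not the actual implementation in this pull request; the sketch also omits details such as merging repeated tokens on consecutive vertices.

import torch

# A sketch under assumed tensor shapes; not this PR's actual implementation.
def viterbi_decode(transition_logp, emission_logp, beta=1.0):
    """Length-normalized Viterbi decoding over a DAG of L vertices.

    transition_logp: (L, L) log transition probabilities; entries with
                     j <= i should be -inf so paths only move forward.
    emission_logp:   (L, V) log token probabilities at each vertex.
    beta:            exponent of the length penalty |Y|^beta.
    """
    L = transition_logp.size(0)
    # Best token (and its log-probability) emitted at each vertex.
    tok_logp, tok_idx = emission_logp.max(dim=-1)

    # delta[t, i]: best log P(A, Y|X) over paths of t+1 vertices ending at i.
    delta = torch.full((L, L), float("-inf"))
    backptr = torch.zeros((L, L), dtype=torch.long)
    delta[0, 0] = tok_logp[0]  # every path starts at vertex 0

    for t in range(1, L):
        # scores[j, i]: extend the best length-t path ending at j to vertex i.
        scores = delta[t - 1].unsqueeze(1) + transition_logp + tok_logp.unsqueeze(0)
        delta[t], backptr[t] = scores.max(dim=0)

    # Pick the path length maximizing log P(A, Y|X) - beta * log |Y|,
    # over paths that end at the final vertex L-1.
    lengths = torch.arange(1, L + 1, dtype=delta.dtype)
    best_t = (delta[:, L - 1] - beta * lengths.log()).argmax().item()

    # Backtrack the winning path and read off its tokens.
    path = [L - 1]
    for t in range(best_t, 0, -1):
        path.append(backptr[t, path[-1]].item())
    path.reverse()
    return [tok_idx[i].item() for i in path]

Tracking the path length t explicitly in the dynamic program is what allows the |Y|^beta normalization to be applied exactly: the penalty is added once per candidate length after the table is filled, something a greedy step-by-step decoder like Lookahead cannot do.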

@inproceedings{shao2022viterbi,
  author = {Chenze Shao and Zhengrui Ma and Yang Feng},
  title = {Viterbi Decoding of Directed Acyclic Transformer for Non-Autoregressive Machine Translation},
  booktitle = {Findings of EMNLP 2022},
  year = {2022}
}