ictnlp / Wait-info

Source code for our EMNLP 2022 paper "Wait-info Policy: Balancing Source and Target at Information Level for Simultaneous Machine Translation"
MIT License

arch is transformer ? #1

Open evi-Genius opened 1 year ago

evi-Genius commented 1 year ago

Hi, this is very impressive work. But I have a question: is the training script correct, given that the `arch` parameter is set to 'transformer'? If so, where is the core model structure code?

vividfree commented 1 year ago

This parameter can be set according to the authors' paper "Wait-info Policy: Balancing Source and Target at Information Level for Simultaneous Machine Translation". For example, use "transformer_wmt_en_de" for the de2en Transformer-base, and "transformer_vaswani_wmt_en_de_big" for the de2en Transformer-big.
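In fairseq-based repos like this one, the architecture is chosen via the `--arch` flag of `fairseq-train`. A minimal sketch of how the reply's suggestion would be applied; the data path and all other flags here are placeholders, not taken from the repo's actual scripts:

```shell
# Transformer-base for de2en (the --arch value is from the reply above;
# data-bin path and remaining flags are illustrative placeholders)
fairseq-train data-bin/wmt15_de_en \
  --arch transformer_wmt_en_de \
  --task translation \
  --optimizer adam --lr 0.0005 \
  --max-tokens 4096

# Transformer-big for de2en would instead use:
#   --arch transformer_vaswani_wmt_en_de_big
```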