-
Hi,
I succeeded in reproducing the result in the diverse_ted8_m2o setting. However, I failed to reproduce the result in diverse_ted8_o2m: the score for each language is 4 BLEU lower than the resul…
-
## 📚 Documentation
I'm trying to reproduce the results from your paper *Monotonic Multihead Attention*.
In the appendix of your paper, the hyperparameters for WMT'15 de-en are `encoder embed dim …
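For reference, a rough sketch of how such hyperparameters are usually passed to `fairseq-train` on the command line. The flag names are standard fairseq options, but the data path, architecture, and all numeric values below are placeholders rather than the settings from the paper's appendix:
```
# Hypothetical values -- substitute the numbers from the paper's appendix.
fairseq-train data-bin/wmt15_de_en \
    --arch transformer \
    --encoder-embed-dim 512 --encoder-ffn-embed-dim 2048 \
    --encoder-layers 6 --encoder-attention-heads 8 \
    --decoder-embed-dim 512 --decoder-ffn-embed-dim 2048 \
    --decoder-layers 6 --decoder-attention-heads 8 \
    --optimizer adam --lr 0.0005 --lr-scheduler inverse_sqrt \
    --warmup-updates 4000 --dropout 0.3 \
    --criterion label_smoothed_cross_entropy --label-smoothing 0.1 \
    --max-tokens 4096
```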
-
When running the following command:
```
python preprocess.py --source-lang de --target-lang en \
    --trainpref examples/translation/iwslt14.tokenized.de-en/train --validpref examples/translation/iwslt14.tokenized…
```
-
## ❓ Questions and Help
#### I am trying to run a simultaneous translation task using the following training command:
#### Code
```
fairseq-train \
    data-bin/ \
    --simul-type hard_alig…
```
-
Hi, very nice paper, I like it! I have a question regarding your results on IWSLT'14. Are these values tokenized BLEU on the `dev` or the `test` set?
Dataset: IWSLT’14 De-En
Enc #–Dec #: 6L–6L (small)
Po…
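For context, a minimal sketch of how tokenized BLEU is typically obtained with fairseq on a given split; `--gen-subset valid` vs. `--gen-subset test` selects the dev or test set. The checkpoint and data paths are placeholders, not from the paper:
```
# Placeholder paths -- adjust to your own checkpoint and binarized data.
fairseq-generate data-bin/iwslt14.tokenized.de-en \
    --path checkpoints/checkpoint_best.pt \
    --gen-subset valid \
    --beam 5 --remove-bpe \
    | tee gen.valid.out
# fairseq-generate prints a tokenized BLEU score at the end of its output.
```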
-
When I use the training script `train.sh`, the following error is thrown:
```
+ nvidia-smi
NVIDIA-SMI has failed because it couldn't communicate with the NVIDIA driver. Make sure that the latest …
-
#### What is your question?
How do I use trained models on a different machine?
#### What have you tried?
1) On a remote machine, I have **trained a model and generated translations successful…
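Not part of the original issue, but a rough sketch of what moving a trained model to another machine typically involves in the standard fairseq workflow: copy the checkpoint together with the binarized data (or at least the dictionaries produced by preprocessing) and point `fairseq-generate` at them on the new machine. All paths and hostnames below are placeholders:
```
# On the remote (training) machine -- placeholder paths and host.
scp checkpoints/checkpoint_best.pt user@local-machine:~/model/
scp -r data-bin/ user@local-machine:~/model/data-bin/

# On the local machine: the checkpoint alone is not enough; generation also
# needs the dictionaries from preprocessing (dict.*.txt inside data-bin/).
fairseq-generate ~/model/data-bin \
    --path ~/model/checkpoint_best.pt \
    --beam 5 --remove-bpe
```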
-
Hi,
In [this line](https://github.com/LiyuanLucasLiu/Transformer-Clinic/blob/master/fairseq/fairseq/modules/transformer_layer.py#L178), the variable `tmp_weight` is not defined. How should it be s…
-
## ❓ Questions and Help
#### What is your question?
I am running two versions of fairseq for neural machine translation, one is 0.6 and the other is 0.9, and I found that the data preprocessing results of …
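Not from the issue, but one quick way to check whether two preprocessing runs actually differ is to diff the dictionaries they produce; the output directory names below are placeholders for the two versions' results:
```
# Placeholder directory names for the outputs of the two fairseq versions.
diff data-bin-0.6/dict.de.txt data-bin-0.9/dict.de.txt | head
diff data-bin-0.6/dict.en.txt data-bin-0.9/dict.en.txt | head
```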
-
Hi!
If we share decoder parameters in the multilingual transformer, we need to tell the shared decoder which language to decode into.
This might be done by (embedding and) passing the target language ID di…
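For illustration, a hedged sketch of one common way to do this in fairseq's `multilingual_translation` task: prepend a target-language token so the shared decoder knows which language to produce (`--encoder-langtok tgt` adds it on the source side, `--decoder-langtok` on the decoder input, and `--share-decoders` shares the decoder across pairs). The data path and language pairs are placeholders:
```
# Placeholder data path and language pairs.
fairseq-train data-bin/multilingual \
    --task multilingual_translation \
    --arch multilingual_transformer \
    --lang-pairs en-de,en-fr \
    --share-decoders \
    --encoder-langtok tgt \
    --decoder-langtok \
    --optimizer adam --lr 0.0005 --lr-scheduler inverse_sqrt \
    --criterion label_smoothed_cross_entropy --label-smoothing 0.1 \
    --max-tokens 4096
```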