nashid opened this issue 4 years ago
Perhaps this tutorial: https://www.tensorflow.org/tutorials/text/nmt_with_attention
This implementation is too old. You can implement NMT with tf 2.0 easily. As @mommi84 mentioned, nmt_with_attention is an excellent tutorial.
If you want a Transformer model, you can find one in my repo: transformers-keras
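For anyone landing here, the core of the nmt_with_attention tutorial is additive (Bahdanau) attention. Below is a minimal NumPy sketch of that scoring step, purely for illustration; the variable names (`W1`, `W2`, `v`) and shapes are assumptions chosen to mirror the tutorial's notation, not code from either repo.

```python
import numpy as np

def bahdanau_attention(query, values, W1, W2, v):
    """Additive (Bahdanau) attention, sketched in NumPy.

    query:  decoder hidden state, shape (units,)
    values: encoder outputs, shape (seq_len, units)
    W1, W2: learned projections, shape (units, att_units) (assumed shapes)
    v:      learned scoring vector, shape (att_units,)
    """
    # Score each encoder position against the decoder state.
    score = np.tanh(values @ W1 + query @ W2) @ v      # (seq_len,)
    # Softmax over the sequence gives the attention weights.
    weights = np.exp(score - score.max())
    weights /= weights.sum()
    # The context vector is the weighted sum of encoder outputs.
    context = weights @ values                         # (units,)
    return context, weights

# Tiny usage example with random (untrained) parameters.
rng = np.random.default_rng(0)
seq_len, units, att_units = 5, 8, 4
values = rng.normal(size=(seq_len, units))
query = rng.normal(size=(units,))
W1 = rng.normal(size=(units, att_units))
W2 = rng.normal(size=(units, att_units))
v = rng.normal(size=(att_units,))
context, weights = bahdanau_attention(query, values, W1, W2, v)
```

In the tutorial this is wrapped in a Keras layer and the weights are trained end to end; the sketch only shows the forward computation.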
Thanks @luozhouyang.
I will definitely have a look at the nmt_with_attention tutorial.
Actually, at first I was thinking of using the model from that tutorial. However, I was not sure how it would compare, since this repo seems to be authoritative and many scientific publications have cited it.
If you have any pointer related to this, do let me know.
I will look into the tutorial and do a comparison.
You're welcome, @nashid. 😅
I think the reason this repo seems authoritative is simply that it was a pioneer of the sequence-to-sequence era. It may be that the accuracy you obtain with it today is higher than with the newer models, but that is unlikely to hold in the long term once tf 2.0 reaches its plateau of productivity.
There is no activity in this repo.
What would be a better resource for building NMT models?