hfxunlp/transformer

Neutron: A pytorch based implementation of Transformer and its variants.
https://github.com/hfxunlp/transformer
GNU General Public License v3.0

more empirical results are expected #2

Closed: alphadl closed this issue 4 years ago

alphadl commented 4 years ago

Hi Hongfei and Qiuhui, thanks for your efforts~

From the description in your paper, this implementation achieves 28.07 BLEU on WMT'14 EN-DE. It's great~ Have you tried other datasets to validate its effectiveness? Also, have you reproduced the following variants with your code, i.e., Hier*, TA, SC, and DOC?
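
For context, when I check a reported score myself I would use something like the minimal sacrebleu sketch below (the file names are placeholders, not files from this repository; it assumes detokenized hypothesis and reference files, one sentence per line):

```python
# Hypothetical verification sketch; file names are placeholders, not from this repo.
# Scores a detokenized system output against a detokenized reference with sacrebleu.
import sacrebleu

with open("newstest2014.de.hyp", encoding="utf-8") as f:
    hyps = [line.strip() for line in f]
with open("newstest2014.de.ref", encoding="utf-8") as f:
    refs = [line.strip() for line in f]

# corpus_bleu takes the hypotheses and a list of reference streams
bleu = sacrebleu.corpus_bleu(hyps, [refs])
print(bleu.score)  # corpus-level BLEU, comparable to the 28.07 reported for WMT'14 EN-DE
```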

hfxunlp commented 4 years ago

@alphadl Hi, thanks a lot for your attention. We got comparable BLEU scores after implementing them, and we also have results on other datasets that are not reported in the draft paper. Unfortunately, we do not have results for these models at the current commit: we merged many features into the project and tuned many detailed configurations after implementing these approaches, and we have neither the time to carefully maintain all of these features nor the GPU resources to re-run all of them after every commit. We also cannot access the original datasets of some papers.

alphadl commented 4 years ago

@anoidgit After the EMNLP deadline, I would like to reproduce the results of your implemented methods (Hier / TA / SC / DOC ...) on the EN-DE dataset and report the results here, so that this repository draws more NLPers' attention.

hfxunlp commented 4 years ago

@alphadl That is really kind of you, but I would suggest treating them as examples that show others how to extend this project or implement their own ideas on top of it, rather than as reproductions of these approaches (see the sketch below for what I mean). The baseline is already strong for production use, and I do not see much additional value unless you want to use these approaches as baselines in your own research. Good luck with your EMNLP submission :)
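
For instance, here is a minimal plain-PyTorch sketch of that kind of extension. The class and layer names are illustrative only and not this project's actual API; the point is that a variant is usually a small subclass that swaps one sub-module:

```python
# Illustrative only: the names below are hypothetical, not Neutron's actual API.
import torch
import torch.nn as nn

class GatedFFNEncoderLayer(nn.Module):
    """A toy encoder-layer variant: standard self-attention,
    but with a gated feed-forward block (a small, local change)."""

    def __init__(self, d_model=512, nhead=8, d_ff=2048, dropout=0.1):
        super().__init__()
        self.self_attn = nn.MultiheadAttention(d_model, nhead,
                                               dropout=dropout, batch_first=True)
        self.ff_in = nn.Linear(d_model, d_ff)
        self.gate = nn.Linear(d_model, d_ff)
        self.ff_out = nn.Linear(d_ff, d_model)
        self.norm1 = nn.LayerNorm(d_model)
        self.norm2 = nn.LayerNorm(d_model)
        self.dropout = nn.Dropout(dropout)

    def forward(self, x, mask=None):
        # pre-norm residual self-attention (unchanged from a standard layer)
        h = self.norm1(x)
        h, _ = self.self_attn(h, h, h, key_padding_mask=mask, need_weights=False)
        x = x + self.dropout(h)
        # gated feed-forward: the "variant" lives entirely in these two lines
        h = self.norm2(x)
        h = self.ff_out(torch.sigmoid(self.gate(h)) * torch.relu(self.ff_in(h)))
        return x + self.dropout(h)

x = torch.randn(2, 7, 512)               # (batch, seq, d_model)
print(GatedFFNEncoderLayer()(x).shape)   # torch.Size([2, 7, 512])
```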

alphadl commented 4 years ago

I see ~ I agree with you! I will close this issue.