facebookresearch / fairseq

Facebook AI Research Sequence-to-Sequence Toolkit written in Python.

How to use transformer LM for rescoring nbest lists #3774

Open RuABraun opened 3 years ago

RuABraun commented 3 years ago

What is your question?

In this issue, https://github.com/pytorch/fairseq/issues/3080, alex mentions rescoring n-best lists with a transformer LM. How can one do this with fairseq?

When I do fused decoding, the results are worse for me with normal-sized beams, and when I try very large beams like 1000 I run out of memory (I have 64 GB).

I have looked around in the fairseq and flashlight repositories and have not been able to find anything for rescoring.
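
(For reference, one way to obtain sentence-level log-probabilities from a fairseq transformer LM is through the hub interface, as in the sketch below. The checkpoint directory, tokenizer, and BPE settings are placeholders and need to match how the LM was trained.)

```python
# Minimal sketch: score a sentence with a fairseq transformer LM via the hub interface.
# The checkpoint path, tokenizer, and bpe arguments below are placeholders.
from fairseq.models.transformer_lm import TransformerLanguageModel

lm = TransformerLanguageModel.from_pretrained(
    '/path/to/lm_checkpoints',   # hypothetical directory containing the LM checkpoint
    'checkpoint_best.pt',
    tokenizer='moses',
    bpe='fastbpe',
)
lm.eval()  # disable dropout for scoring

out = lm.score('the quick brown fox jumps over the lazy dog')
# 'positional_scores' holds per-token log-probabilities; their sum is the sentence log-probability
sentence_logprob = out['positional_scores'].sum().item()
```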

firdota commented 2 years ago

hello, did you make any progress?

RuABraun commented 2 years ago

Yes, I did my own implementation, but only got a small improvement. I'm not sure whether that's because of a mistake on my part or a mismatch between my training data and the evaluation data.
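
(A rough sketch of what such a rescoring implementation might look like, assuming the LM is loaded as above and each n-best entry is a (decoder_score, hypothesis) pair. The interpolation weight is a placeholder that would need tuning on held-out data.)

```python
def rescore_nbest(lm, nbest, lm_weight=0.5):
    """Re-rank one n-best list.

    nbest: list of (decoder_score, hypothesis_text) pairs for one utterance.
    lm_weight: hypothetical interpolation weight; tune it on a dev set.
    """
    rescored = []
    for decoder_score, hyp in nbest:
        # sentence log-probability under the transformer LM
        lm_score = lm.score(hyp)['positional_scores'].sum().item()
        rescored.append((decoder_score + lm_weight * lm_score, hyp))
    # return hypotheses sorted best-first by the combined score
    return [hyp for _, hyp in sorted(rescored, key=lambda x: x[0], reverse=True)]
```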