lucidrains / x-transformers

A concise but complete full-attention transformer with a set of promising experimental features from various papers
MIT License

Feature request: generate top k sequences #133

Open yzhang-github-pub opened 1 year ago

yzhang-github-pub commented 1 year ago

Instead of generating only the single best (top-1) sequence from the autoregressive generate function, it would be useful to generate the top k sequences. For example, in language translation, human users could then choose among several candidate translations.

Thanks for your consideration.

lucidrains commented 1 year ago

@yzhang-github-pub do you mean beam search?

yzhang-github-pub commented 1 year ago

@lucidrains Not exactly. Using language translation as an example: the model currently generates a single best target sentence, whether greedily or by beam search. What I am asking for is to generate the top k target sentences.
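
As an illustration of the request (not code from the repo), one way to obtain k ranked candidates without beam search is to sample several sequences independently and then rank them by total log-probability under the model. The sketch below only assumes a generic autoregressive `model` that maps token ids of shape `(batch, seq)` to logits of shape `(batch, seq, vocab)`; `sample_top_k_candidates` and its parameters are hypothetical names, not part of the library.

```python
import torch
import torch.nn.functional as F

@torch.no_grad()
def sample_top_k_candidates(model, prompt, k=5, gen_len=32, temperature=1.0):
    # repeat the prompt k times so each row becomes an independent sample
    seqs = prompt.unsqueeze(0).repeat(k, 1)            # (k, prompt_len)

    for _ in range(gen_len):
        logits = model(seqs)[:, -1]                    # next-token logits, (k, vocab)
        probs = F.softmax(logits / temperature, dim=-1)
        next_tokens = torch.multinomial(probs, 1)      # (k, 1)
        seqs = torch.cat((seqs, next_tokens), dim=-1)

    # score each full sequence by the sum of log-probs of its generated tokens
    logits = model(seqs)[:, :-1]                       # predictions for positions 1..end
    log_probs = F.log_softmax(logits, dim=-1)
    token_lp = log_probs.gather(-1, seqs[:, 1:].unsqueeze(-1)).squeeze(-1)
    scores = token_lp[:, prompt.shape[-1] - 1:].sum(dim=-1)   # generated part only

    order = scores.argsort(descending=True)
    return seqs[order], scores[order]                  # candidates ranked best-first
```

Since the k samples are drawn independently, duplicates are possible; deduplicating or raising the temperature are simple ways to increase diversity in such a scheme.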

maryszmary commented 2 weeks ago

For me, it would actually be great if generate had a beam search option, since I need multiple outputs ranked by probability for character-level generation of phonetic transcriptions.

lucidrains commented 2 weeks ago

@maryszmary yea i should add classic beam search at some point for sure

maybe roll up any interesting advancements in that area of research with its release? lemme know
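
For reference, a minimal sketch of classic beam search over a generic autoregressive model, returning the top `beam_width` sequences ranked by cumulative log-probability, might look like the following. This is not the library's implementation: the `beam_search` name, its parameters, and the assumption that `model` maps token ids `(batch, seq)` to logits `(batch, seq, vocab)` are all illustrative, and EOS handling plus length normalization are omitted for brevity.

```python
import torch
import torch.nn.functional as F

@torch.no_grad()
def beam_search(model, prompt, beam_width=4, gen_len=32):
    beams = prompt.unsqueeze(0)                             # (1, prompt_len)
    scores = torch.zeros(1, device=prompt.device)           # cumulative log-prob per beam

    for _ in range(gen_len):
        logits = model(beams)[:, -1]                        # (num_beams, vocab)
        log_probs = F.log_softmax(logits, dim=-1)
        vocab = log_probs.shape[-1]

        # every (beam, next token) pair is a candidate; its score is the
        # beam's running score plus the token's log-probability
        cand_scores = scores.unsqueeze(-1) + log_probs      # (num_beams, vocab)
        top_scores, flat_idx = cand_scores.view(-1).topk(beam_width)

        beam_idx = torch.div(flat_idx, vocab, rounding_mode="floor")  # beam being extended
        token_idx = (flat_idx % vocab).unsqueeze(-1)                  # token being appended

        beams = torch.cat((beams[beam_idx], token_idx), dim=-1)
        scores = top_scores

    # topk keeps scores sorted descending, so beams are already ordered best-first
    return beams, scores
```

In practice, classic beam search usually also stops beams at an end-of-sequence token and applies length normalization (dividing the cumulative log-probability by sequence length) so shorter beams are not unfairly favored.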