huggingface / text-generation-inference

Large Language Model Text Generation Inference
http://hf.co/docs/text-generation-inference
Apache License 2.0

[Feature] Suppress tokens during generation #176

Open lvwerra opened 1 year ago

lvwerra commented 1 year ago

Similar to `bad_words_ids` in transformers, it would be useful to be able to pass a set of tokens that are never sampled during generation.
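
For reference, here is a minimal sketch of the `transformers` behaviour this issue asks TGI to mirror (the model name, prompt, and banned words below are placeholders): `bad_words_ids` takes a list of token-ID sequences whose logits are masked so they can never be sampled.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "gpt2"  # placeholder; any causal LM works
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

# Encode the words to suppress. The leading space matters for BPE tokenizers
# such as GPT-2's, where " ugly" and "ugly" map to different token IDs.
banned = tokenizer([" ugly", " terrible"], add_special_tokens=False).input_ids

inputs = tokenizer("The weather today is", return_tensors="pt")
output = model.generate(
    **inputs,
    max_new_tokens=20,
    do_sample=True,
    pad_token_id=tokenizer.eos_token_id,
    bad_words_ids=banned,  # each inner list is a token sequence that must never be generated
)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```

The request here is for an equivalent parameter on the TGI generation endpoint, so that the listed token IDs are excluded from sampling server-side.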

alex-jw-brooks commented 1 year ago

Hello @lvwerra @OlivierDehaene, is anyone working on this? I'm interested in contributing this enhancement.

soulseen commented 9 months ago

Hi @alex-jw-brooks, have you made any progress on this?

soulseen commented 9 months ago

@lvwerra @OlivierDehaene please take a look. Thanks!

sapountzis commented 5 months ago

+1

andrewscrub commented 4 months ago

+1

meitalbensinai commented 1 month ago

+1