Open codertimo opened 2 years ago
Thanks for your suggestion, we'll consider it.
On Thu, Mar 10, 2022, Junseong Kim wrote:
Suggestion
It would be great to support various sampling configurations like the Hugging Face generate() function, for example:

```json
{
  "top_p": 0.3,
  "min_generated_tokens": 3,
  "max_generated_tokens": 20,
  "num_return_sequences": 1,
  "repetition_penalty": 1.2,
  "length_penalty": 1.0,
  "no_repeat_ngram_size": 3
}
```
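For context, here is a minimal pure-Python sketch of what two of the requested options, `top_p` and `repetition_penalty`, do during sampling. This is an illustration of the standard techniques (nucleus sampling and the CTRL-style repetition penalty), not lightseq or transformers code:

```python
def top_p_filter(probs, top_p):
    """Nucleus sampling: keep the smallest set of highest-probability
    tokens whose cumulative probability reaches top_p, zero out the
    rest, and renormalize."""
    order = sorted(range(len(probs)), key=lambda i: probs[i], reverse=True)
    kept, cum = set(), 0.0
    for i in order:
        kept.add(i)
        cum += probs[i]
        if cum >= top_p:
            break
    filtered = [probs[i] if i in kept else 0.0 for i in range(len(probs))]
    total = sum(filtered)
    return [p / total for p in filtered]


def apply_repetition_penalty(logits, generated_ids, penalty):
    """Penalize tokens that were already generated: divide positive
    logits by `penalty` (>1), multiply negative ones, so repeats
    become less likely either way."""
    out = list(logits)
    for i in set(generated_ids):
        out[i] = out[i] / penalty if out[i] > 0 else out[i] * penalty
    return out
```

With `top_p=0.3` and the distribution `[0.5, 0.3, 0.1, 0.1]`, only the single most likely token survives, since 0.5 already exceeds 0.3.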
— https://github.com/bytedance/lightseq/issues/283
It would also be very helpful to support num_return_sequences etc.; ideally the API would be exactly the same as the Hugging Face generate() function.
You can try multiple_output=True in infer() when using the Transformer model.
(Replying to hanguangmic's comment of Mon, Jul 4, 2022.)