google / langfun

OO for LLMs
Apache License 2.0

Langfun LLMs to set `max_tokens` to None by default #168

Closed by copybara-service[bot] 5 months ago

copybara-service[bot] commented 5 months ago

Langfun LLMs to set max_tokens to None by default

For historical reasons, we set `max_tokens` to 1024 for all LLMs by default. This often caused unexpected failures when benchmarking datasets whose responses require more than 1K tokens. With this CL, we opt into the default behavior defined by each LLM, which usually allows the longest generation the model supports.
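The idea can be illustrated with a minimal sketch (the class and function names here are hypothetical, not langfun's actual API): when `max_tokens` is `None`, the field is simply omitted from the request, so the provider's own default cap applies instead of a hard 1024-token limit.

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class SamplingOptions:
    # None means "defer to the model provider's own default", which is
    # typically the longest generation the model supports.
    max_tokens: Optional[int] = None


def request_payload(options: SamplingOptions) -> dict:
    """Builds a request dict, omitting max_tokens when unset so the
    provider-side default applies."""
    payload = {}
    if options.max_tokens is not None:
        payload["max_tokens"] = options.max_tokens
    return payload


# Old default: every request carried max_tokens=1024, silently
# truncating long responses. New default: no cap is sent unless the
# caller sets one explicitly.
print(request_payload(SamplingOptions()))                 # {}
print(request_payload(SamplingOptions(max_tokens=1024)))  # {'max_tokens': 1024}
```

Callers that relied on the old 1024-token cap can restore it by passing `max_tokens=1024` explicitly.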

Side effect: