quarkiverse / quarkus-langchain4j

Quarkus Langchain4j extension
https://docs.quarkiverse.io/quarkus-langchain4j/dev/index.html
Apache License 2.0

OpenAI max_tokens is deprecated #893

Status: Open · cescoffier opened this issue 2 months ago

cescoffier commented 2 months ago
> `max_tokens` (Deprecated): The maximum number of [tokens](https://platform.openai.com/tokenizer) that can be generated in the chat completion. This value can be used to control [costs](https://openai.com/api/pricing/) for text generated via API.
>
> This value is now deprecated in favor of `max_completion_tokens`, and is not compatible with [o1 series models](https://platform.openai.com/docs/guides/reasoning).

We would need to switch to the new parameter, but only when the provider is OpenAI.
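
A minimal sketch of the kind of provider-specific switch being described; the class and method names below are hypothetical illustrations, not code from quarkus-langchain4j:

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch only: none of these names come from quarkus-langchain4j.
final class TokenLimitMapper {

    static Map<String, Object> withTokenLimit(String provider, int limit) {
        Map<String, Object> body = new HashMap<>();
        if ("openai".equalsIgnoreCase(provider)) {
            // OpenAI deprecates max_tokens; o1-series models only accept max_completion_tokens.
            body.put("max_completion_tokens", limit);
        } else {
            // Other OpenAI-compatible providers may still expect the older max_tokens field.
            body.put("max_tokens", limit);
        }
        return body;
    }
}
```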

geoand commented 2 months ago

cc @langchain4j

langchain4j commented 2 months ago

Just released openai4j 0.21.0, which includes `max_completion_tokens`.
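
A rough usage sketch, assuming openai4j 0.21.0 exposes a `maxCompletionTokens(...)` setter on its `ChatCompletionRequest` builder alongside the older `maxTokens(...)` (the builder method names here are assumptions based on the comment above, not verified API):

```java
import dev.ai4j.openai4j.chat.ChatCompletionRequest;

class MaxCompletionTokensExample {

    // Builder method names are assumptions drawn from the comment above.
    static ChatCompletionRequest buildRequest() {
        return ChatCompletionRequest.builder()
                .model("gpt-4o-mini")
                .addUserMessage("Summarize the release notes")
                .maxCompletionTokens(256) // replaces the deprecated max_tokens for OpenAI
                .build();
    }
}
```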

geoand commented 2 months ago

💪🏽