deepakjois opened 1 week ago
This may have broken the Groq integration: https://github.com/tmc/langchaingo/pull/1013
Would it be possible to continue supporting `max_tokens`, and special-case `max_completion_tokens` for the o1 series of models? A minimal sketch of the kind of special-casing I mean is below.
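This is just an illustration, not langchaingo's actual types: the `chatRequest` struct and `newChatRequest` helper are hypothetical, but the JSON field names match the OpenAI wire format. The idea is to send `max_completion_tokens` only for o1 models and keep `max_tokens` for everything else, including OpenAI-compatible providers like Groq:

```go
package main

import (
	"encoding/json"
	"fmt"
	"strings"
)

// chatRequest mirrors the subset of the chat completion payload relevant
// here. The struct is hypothetical; the json tags follow the wire format.
type chatRequest struct {
	Model               string `json:"model"`
	MaxTokens           int    `json:"max_tokens,omitempty"`
	MaxCompletionTokens int    `json:"max_completion_tokens,omitempty"`
}

// newChatRequest special-cases the o1 family, which rejects max_tokens
// and requires max_completion_tokens; other models (and OpenAI-compatible
// providers such as Groq) still get max_tokens.
func newChatRequest(model string, limit int) chatRequest {
	req := chatRequest{Model: model}
	if strings.HasPrefix(model, "o1") {
		req.MaxCompletionTokens = limit
	} else {
		req.MaxTokens = limit
	}
	return req
}

func main() {
	for _, m := range []string{"o1-mini", "llama-3.1-70b-versatile"} {
		b, _ := json.Marshal(newChatRequest(m, 256))
		fmt.Println(string(b))
	}
}
```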
For context, here is OpenAI's explanation for the change: https://community.openai.com/t/why-was-max-tokens-changed-to-max-completion-tokens/938077
I encountered this error in my own Go code as well, so this isn't an issue with the example, but probably with the underlying API call?