Closed · 5eroo closed this issue 1 year ago
The AI's output seems to cut off at about 256 tokens according to the tokenizer (even though the default parameters state 500).
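For reference, this is roughly how I'm counting tokens (a minimal sketch using the tiktoken library; `output` stands in for the truncated completion returned by the client):

```python
# Count tokens in the returned completion (pip install tiktoken).
import tiktoken

enc = tiktoken.encoding_for_model("gpt-3.5-turbo")
output = "..."  # placeholder: paste the truncated completion here
print(len(enc.encode(output)))  # lands around 256, matching the observed cutoff
```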
I've already tried changing the parameters, with no success. Something like this doesn't work:
```python
client.models["openai:gpt-3.5-turbo"]["parameters"]["maximumLength"]["value"] = 10000
client.model_defaults["openai:gpt-3.5-turbo"]["maximumLength"] = 10000
```
Neither does this:
```python
defaults['maximumLength'] = 5000  # modifying the source
```
So, is the token limit fixed by Vercel?
Vercel AI uses `maxTokens` instead of `maximumLength` in its request payloads. Perhaps that's what's causing the issue?
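If that's the case, the mismatch would look something like this (a minimal sketch assuming the client builds its JSON request body from the parameter dict; `build_payload` and the payload keys other than `maxTokens` are hypothetical, not the library's actual API):

```python
# Hypothetical illustration of the suspected key mismatch: the endpoint
# reads a camelCase "maxTokens" key, so a "maximumLength" entry set on
# the client side would be silently ignored.
def build_payload(model: str, prompt: str, params: dict) -> dict:
    payload = {"model": model, "prompt": prompt}
    # Translate the playground-style parameter name to the key the
    # Vercel AI endpoint actually expects in the request body.
    if "maximumLength" in params:
        payload["maxTokens"] = params["maximumLength"]
    return payload


payload = build_payload(
    "openai:gpt-3.5-turbo",
    "Hello!",
    {"maximumLength": 10000},
)
assert payload["maxTokens"] == 10000
```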