ading2210 / vercel-llm-api

A reverse engineered Python API wrapper for the Vercel AI Playground, which provides free access to many large language models without needing an account.
https://pypi.org/project/vercel-llm-api
GNU General Public License v3.0
159 stars 13 forks

Token limit? #15

Closed 5eroo closed 1 year ago

5eroo commented 1 year ago

According to a tokenizer, the AI's output seems to cut off at about 256 tokens, even though the default parameters state 500.

I've already tried changing the parameters, with no success.

Something like this doesn't work:

client.models["openai:gpt-3.5-turbo"]["parameters"]["maximumLength"]["value"] = 10000
client.model_defaults["openai:gpt-3.5-turbo"]["maximumLength"] = 10000

neither does this:

defaults['maximumLength'] = 5000 # modifying the source

So, is the token limit fixed by Vercel?

0x6a69616e commented 1 year ago

Vercel AI uses maxTokens instead of maximumLength in its request payloads. Perhaps that is causing the issue?
