Open danaimone opened 2 months ago
Correction: it appears the exception is occurring due to the max-token setting. It would be nice to support more than 2048 tokens.
Have you tried https://github.com/simonw/llm by any chance? It has a chat mode too and is generally much better than ata if you ask me
I haven't, thank you for the rec!
Currently if you try to use the gpt-4o model with >2048 token length, the following exception occurs upon prompting:
I am requesting that we add support for a greater token length, as gpt-4o's pricing and performance are much better.
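The 2048 cap looks like a hard-coded default rather than a model limitation (gpt-4o supports a much larger output window). A minimal sketch of per-model validation that ata could adopt — the limit table, values, and function name here are hypothetical, not ata's actual code:

```python
# Hypothetical per-model output-token limits; the numbers are
# illustrative assumptions, not values taken from ata's source.
MODEL_OUTPUT_LIMITS = {
    "gpt-3.5-turbo": 4096,
    "gpt-4o": 16384,  # well above the current 2048 cap
}

def validate_max_tokens(model: str, max_tokens: int) -> int:
    """Fail early with a clear message instead of raising at prompt time."""
    limit = MODEL_OUTPUT_LIMITS.get(model, 2048)
    if max_tokens > limit:
        raise ValueError(
            f"max_tokens={max_tokens} exceeds the {limit}-token limit for {model}"
        )
    return max_tokens

print(validate_max_tokens("gpt-4o", 8000))  # accepted under the assumed limit
```

Validating against a per-model table up front would also give users a clearer error than the exception described above.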
Reproduction: