cogentapps / chat-with-gpt

An open-source ChatGPT app with a voice
https://www.chatwithgpt.ai
MIT License

Hard-coded `maxTokens`? #83

Open · Equim-chan opened this issue 1 year ago

Equim-chan commented 1 year ago

https://github.com/cogentapps/chat-with-gpt/blob/1998a661f805669e6c81ea6c31eadfe9b7cbf75b/app/src/openai.ts#L90

I think it should be 4096 for gpt-3.5-turbo and 8192 for gpt-4?
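
For illustration, a minimal sketch of what a per-model limit lookup could look like (the helper name and fallback value here are my own assumptions, not the project's actual code):

```typescript
// Hypothetical helper: map model IDs to their context window sizes
// instead of using a single hard-coded value.
const modelMaxTokens: Record<string, number> = {
    'gpt-3.5-turbo': 4096,
    'gpt-4': 8192,
    'gpt-4-32k': 32768,
};

function getMaxTokens(model: string): number {
    // Fall back to a conservative limit for unrecognized models.
    return modelMaxTokens[model] ?? 2048;
}
```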

cogentapps commented 1 year ago

My plan is to make the context length a setting the user can customize. With GPT-4 in particular, maxing out the 8K (or 32K) context for every message is probably not needed most of the time.
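
As a rough sketch of how such a setting could be applied (illustrative only; the function name, parameters, and numbers are placeholders, not the actual implementation):

```typescript
// Clamp a user-configured prompt budget against the model's context
// window, leaving room for the completion.
function effectivePromptBudget(
    userMaxTokens: number,      // value from the planned setting
    modelContextWindow: number, // e.g. 4096 or 8192
    reservedForReply: number,   // tokens kept free for the response
): number {
    return Math.min(userMaxTokens, modelContextWindow - reservedForReply);
}

// Example: the user allows 2048 prompt tokens on gpt-4 and reserves
// 1024 tokens for the reply.
const budget = effectivePromptBudget(2048, 8192, 1024); // => 2048
```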

ludzeller commented 1 year ago

Yes, having it as a setting would be great.

That would make it possible to use prompts close to the maximum token limit, e.g. when you don't expect a long conversation or long responses, such as when generating summaries.

cogentapps commented 1 year ago

Max prompt tokens is now configurable under Settings -> Chat.