bramses/chatgpt-md: A (nearly) seamless integration of ChatGPT into Obsidian. MIT License · 824 stars · 61 forks
Plugin limits max tokens
#96
Open
FalloutCrew opened this issue 7 months ago
FalloutCrew commented 7 months ago
I am trying to use the gpt-4-1106-preview model, but the plugin seems to limit the max_tokens input to 4097. At least, I get that error message when the following is specified:
```
system_commands: ['I am a helpful assistant.']
temperature: 0
top_p: 1
max_tokens: 10000
presence_penalty: 1
frequency_penalty: 1
stream: true
stop: null
n: 1
model: gpt-4-1106-preview
```
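For what it's worth, the ceiling here may be a per-model completion-token cap (the API rejects `max_tokens` values above what the model can emit in a single response) rather than something the plugin enforces. If the plugin wanted to guard against this, it could clamp the requested value before sending the request. A minimal sketch of that idea, with an illustrative cap table that is an assumption on my part, not taken from the plugin source or the API docs:

```python
# Hypothetical completion-token caps per model. These values are
# illustrative assumptions, not authoritative limits.
COMPLETION_CAPS = {
    "gpt-4-1106-preview": 4096,
    "gpt-4": 8192,
}

def clamp_max_tokens(model: str, requested: int, default_cap: int = 4096) -> int:
    """Return a max_tokens value no larger than the model's assumed cap.

    Falls back to default_cap for models not listed in the table.
    """
    cap = COMPLETION_CAPS.get(model, default_cap)
    return min(requested, cap)
```

With a table like this, a `max_tokens: 10000` setting for `gpt-4-1106-preview` would be silently reduced to 4096 instead of triggering an API error.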