jackschedel / KoalaClient

The best LLM API Playground Interface (for me)
https://client.koaladev.io/
Creative Commons Zero v1.0 Universal
26 stars 8 forks

max token pop alert #105

Closed bb33bb closed 5 months ago

bb33bb commented 5 months ago

[screenshot: "max token" alert dialog]

The other software, "Better ChatGPT", works fine and does not show this error.

I built the latest version of KoalaClient, and it pops this alert.

ghost commented 5 months ago

@bb33bb "Max tokens" specifies how many tokens a model can generate at most; the maximum value for gpt-4-1106-preview is 4096 tokens. You should change it to 4096. Your Max Context is already set up properly.
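A minimal sketch of the distinction: the "Max tokens" setting should never exceed the model's output cap, or the API rejects the request. The `MODEL_MAX_OUTPUT` table and `clamp_max_tokens` helper below are hypothetical illustrations, not KoalaClient code; the 4096 limit is the one mentioned above.

```python
# Hypothetical sketch: clamp the user's "Max tokens" setting to the
# model's output cap before sending a request.
MODEL_MAX_OUTPUT = {"gpt-4-1106-preview": 4096}  # assumed lookup table

def clamp_max_tokens(model: str, requested: int) -> int:
    """Return the requested generation length, capped at the model limit."""
    cap = MODEL_MAX_OUTPUT.get(model)
    if cap is None:
        return requested  # unknown model: leave the setting unchanged
    return min(requested, cap)

print(clamp_max_tokens("gpt-4-1106-preview", 32000))  # → 4096
print(clamp_max_tokens("gpt-4-1106-preview", 1000))   # → 1000
```

With a setting like 32000 the helper would silently fall back to 4096 instead of triggering the alert shown in the screenshot.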

bb33bb commented 5 months ago

> @bb33bb "Max tokens" is for specifying how much tokens a model can generate at max, the max value for gpt-4-1106-preview is 4096 tokens. You should change it to 4096, and you already have Max Context set up properly.

Thanks for the tip. I am going to try it.

bb33bb commented 5 months ago

> @bb33bb "Max tokens" is for specifying how much tokens a model can generate at max, the max value for gpt-4-1106-preview is 4096 tokens. You should change it to 4096, and you already have Max Context set up properly.

Totally right. Sorry to bother you.