Closed bb33bb closed 5 months ago
@bb33bb "Max tokens" specifies how many tokens a model can generate at most; the maximum for gpt-4-1106-preview is 4096 tokens. You should change it to 4096, and you already have Max Context set up properly.
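A minimal sketch of the distinction, in case it helps: the output cap (4096 for gpt-4-1106-preview, as noted above) is separate from the context window. The helper and dict names below are hypothetical, just for illustration:

```python
# Hypothetical per-model output caps; 4096 for gpt-4-1106-preview is the
# value discussed above. This is NOT the context window ("Max Context"),
# which is a separate, much larger limit.
MODEL_OUTPUT_LIMITS = {"gpt-4-1106-preview": 4096}

def clamp_max_tokens(model: str, requested: int) -> int:
    """Clamp a requested 'Max tokens' value to the model's output limit."""
    limit = MODEL_OUTPUT_LIMITS.get(model)
    if limit is None:
        return requested  # unknown model: pass the value through unchanged
    return min(requested, limit)

print(clamp_max_tokens("gpt-4-1106-preview", 999999))  # prints 4096
print(clamp_max_tokens("gpt-4-1106-preview", 100))     # prints 100
```

Clamping the setting like this client-side would avoid the API rejecting the request outright.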
Thanks for the tips. I am going to try it.
Totally right. Sorry to bother you.
The other app, "better chatgpt", works fine and does not show this error.
I built the latest version of KoalaClient, which pops up this alert.