debanjandhar12 / logseq-chatgpt-plugin

A tightly integrated ChatGPT plugin for Logseq.
GNU Affero General Public License v3.0

many tokens in completion #27

Closed yangguang760 closed 1 year ago

yangguang760 commented 1 year ago

Why are there so many tokens in the completion?

[screenshot attached]

debanjandhar12 commented 1 year ago

What is the current value set for Max Tokens in the plugin settings? Also, what model are you using?

[screenshot attached]
The plugin works by deducting the current message length from the model's context limit and setting the remainder as the maximum length the model is allowed to return. [This means the model does not necessarily use the full max length even when it is allowed to do so. OpenAI, however, disallows specifying a max-length value above the model's limit.]

yangguang760 commented 1 year ago

[screenshots attached]

debanjandhar12 commented 1 year ago

Setting the ChatGPT Max tokens to 4000 should fix the issue.

yangguang760 commented 1 year ago

@debanjandhar12 Thanks a lot!