RainEggplant / chatgpt-telegram-bot

A ChatGPT bot for Telegram based on Node.js. Supports both browserless and browser-based APIs.
MIT License
322 stars 97 forks

Usage of promptPrefix in latest version? #31

Closed xobosox closed 1 year ago

xobosox commented 1 year ago

Hi, great bot, well done

I'm testing out promptPrefix, promptSuffix, etc., and they don't seem to be doing anything. Below are the contents of my local.json file. The rest of the fields seem fine, except for the prompt. I'm trying to make it work like the `prompt: ""` field does in the Node.js code that you can download from the OpenAI playground.

Here's the contents of my local.json file:

{
        "debug": 1,
        "bot": {
                "token": "hidden",
                "groupIds": [],
                "userIds": [],
                "chatCmd": "/chat"
        },
        "api": {
                "type": "official",
                "official": {
                        "promptPrefix": "I am a highly intelligent question answering bot. If you ask me a question that is rooted in truth, I will give you the answer. If you ask me a question that is nonsense, trickery, or has no clear answer, I will respond with \"Unknown\".\n\nQ: What is human life expectancy in the United States?\nA: Human life expectancy in the United States is 78 years.\n\nQ: Who was president of the United States in 1955?\nA: Dwight D. Eisenhower was president of the United States in 1955.\n\nQ: Which party did he belong to?\nA: He belonged to the Republican Party.\n\nQ: What is the square root of banana?\nA: Unknown\n\nQ: How does a telescope work?\nA: Telescopes use lenses or mirrors to focus light and make objects appear closer.\n\nQ: Where were the 1992 Olympics held?\nA: The 1992 Olympics were held in Barcelona, Spain.\n\nQ: How many squigs are in a bonk?\nA: Unknown\n\nQ: What's the square root of 9?\nA: The square root of 9 is 3.\n\nQ: Why? Unknown",
                        "promptSuffix": "\n\nAI:\n",
                        "assistantLabel": "AI",
                        "userLabel": "You",
                        "apiKey": "hidden",
                        "apiBaseUrl": "",
                        "completionParams": {},
                        "systemMessage": "",
                        "maxModelTokens": 0,
                        "maxResponseTokens": 0
                }
        },
        "proxy": ""
}

xobosox commented 1 year ago

Further to this, I turned verbose logging on, and it doesn't look like the prompt from the config file is sent:

sendMessage (45 tokens) {
  max_tokens: 1000,
  model: 'gpt-3.5-turbo',
  temperature: 0.8,
  top_p: 1,
  presence_penalty: 1,
  messages: [
    {
      role: 'system',
      content: 'You are ChatGPT, a large language model trained by OpenAI. Answer as concisely as possible.\n' +
        'Current date: 2023-03-03\n'
    },
    { role: 'user', content: 'Hello', name: undefined }
  ],
  stream: true
}
xobosox commented 1 year ago

I just compared the latest version with v2.1.1, and it looks like the promptPrefix changes were removed from the latest. Is there an issue with their functionality?

RainEggplant commented 1 year ago

I just compared the latest version with v2.1.1, and it looks like the promptPrefix changes were removed from the latest. Is there an issue with their functionality?

The API for the ChatGPT models (such as gpt-3.5-turbo) differs slightly from the previous text-completion models (such as text-davinci-003). The upstream API migrated to the ChatGPT models, so the previous models are no longer supported. Maybe you can create an issue in the API repo about bringing it back.
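To illustrate the difference: the old completion API took one flat prompt string (promptPrefix + user input + promptSuffix), while the chat API takes a structured messages array, so a promptPrefix roughly corresponds to a system message. A minimal Node.js sketch (the helper name is hypothetical, not part of the bot):

```javascript
// Hypothetical helper: adapt an old-style completion prompt to the chat format.
// text-davinci-003 received a single concatenated string; gpt-3.5-turbo
// receives role-tagged messages instead, so the former promptPrefix becomes
// the system message. promptSuffix has no direct equivalent.
function toChatMessages(promptPrefix, userInput) {
  return [
    { role: 'system', content: promptPrefix },
    { role: 'user', content: userInput }
  ];
}

const messages = toChatMessages(
  'I am a highly intelligent question answering bot.',
  'What is the square root of 9?'
);
console.log(messages[0].role); // 'system'
```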

As for now, the api.official.systemMessage option works like the previous promptPrefix option, but there seem to be no alternatives for the other options.

xobosox commented 1 year ago

Got it, thanks. I edited /chatgpt/build/index.js, updated this._systemMessage (line 131), and managed to get a result similar to the 'prompt' method.
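For context, the manual tweak described above amounts to something like the following (a hypothetical sketch, not the library's actual code; the object here is a stand-in for the API instance):

```javascript
// Stand-in for the ChatGPTAPI instance from chatgpt/build/index.js,
// where the constructor assigns this._systemMessage (around line 131).
const api = { _systemMessage: '' };

// The default resembles "You are ChatGPT, a large language model...";
// the hand edit replaces it with the desired QA preamble instead.
api._systemMessage =
  'I am a highly intelligent question answering bot. ' +
  'If a question is nonsense, I answer "Unknown".';

console.log(api._systemMessage.includes('question answering bot')); // true
```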

RainEggplant commented 1 year ago

Got it, thanks. I edited /chatgpt/build/index.js, updated this._systemMessage (line 131), and managed to get a result similar to the 'prompt' method.

You can actually use the api.official.systemMessage option in the config file without modifying the code :)
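
A minimal sketch of that suggestion, reusing the structure of the local.json shown earlier (only the relevant keys; the preamble abbreviated):

```json
{
        "api": {
                "type": "official",
                "official": {
                        "systemMessage": "I am a highly intelligent question answering bot. If you ask me a question that is nonsense, trickery, or has no clear answer, I will respond with \"Unknown\"."
                }
        }
}
```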