kognetiks / kognetiks-chatbot

Kognetiks Chatbot for WordPress is a plugin that allows you to effortlessly integrate OpenAI's ChatGPT API into your website, providing a powerful, AI-driven chatbot for enhanced user experience and personalized support.
https://kognetiks.com/wordpress-plugins/
GNU General Public License v3.0

No system message - more difficult to get bot to follow finetuning #59

Open BAM-Software-Development opened 3 months ago

BAM-Software-Development commented 3 months ago

The current chat completion API call combines the "system message" and the chat history into one message and sends it as the first user message. This leads to responses that don't follow the fine-tuning very well.
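For reference, this is roughly the shape of the payload that behavior produces, with the admin-panel instructions and the history collapsed into a single user message. The variable names and values below are made up for illustration, not the plugin's actual identifiers:

```php
// Illustrative only - not the plugin's actual code or variable names.
$context      = 'You are a helpful assistant for example.com.'; // admin-panel instructions
$chat_history = "User: Hi.\nAssistant: Hello!";
$user_message = 'What are your shipping rates?';

// Current behavior: everything is collapsed into a single "user" message,
// so the model never receives a real system message.
$request_body = array(
    'model'    => 'gpt-3.5-turbo',
    'messages' => array(
        array(
            'role'    => 'user',
            'content' => $context . "\n" . $chat_history . "\n" . $user_message,
        ),
    ),
);
```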

I don't have the code handy, but I am sure you can find it if you feel like this is something worth working on.

I took the current instructions you put in the admin panel, at the point where they are set as $context before being concatenated with the message history and the kflow stuff (which I haven't even started to dive into), and moved them into a new variable. I added a new entry to the message list before sending the API call, with role `system` and content `$sys_message`. Then, where the context and prior messages were concatenated, I prepended "we previously have been talking about those following things:" before concatenating the previous messages and the kflow content.

With a fine-tuned model, this produces output much closer to the expected behavior of the system prompt and the fine-tuning. It also seems to follow the instructions better even without the fine-tuned model, though the effect is not as dramatic.
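To make that concrete, here's a rough sketch of what the request ends up looking like after my change. The variable names and values are made up for illustration, not the plugin's actual identifiers, and the WordPress call is just one way to send it:

```php
// Rough sketch of the change described above; variable names are mine,
// not the plugin's actual identifiers.
$api_key       = 'sk-...'; // placeholder
$sys_message   = 'You are a helpful assistant for example.com.'; // admin-panel instructions, unchanged
$prior_context = "User: Hi.\nAssistant: Hello!"; // message history plus kflow/RAG content
$user_message  = 'What are your shipping rates?';

$messages = array(
    // Instructions sent as a real system message so fine-tuning is honored.
    array( 'role' => 'system', 'content' => $sys_message ),
    // History and kflow content prefixed with the preamble string.
    array(
        'role'    => 'user',
        'content' => 'we previously have been talking about those following things: '
                     . $prior_context . "\n" . $user_message,
    ),
);

$response = wp_remote_post(
    'https://api.openai.com/v1/chat/completions',
    array(
        'headers' => array(
            'Authorization' => 'Bearer ' . $api_key,
            'Content-Type'  => 'application/json',
        ),
        'body'    => wp_json_encode(
            array(
                'model'    => 'gpt-3.5-turbo',
                'messages' => $messages,
            )
        ),
    )
);
```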

BAM-Software-Development commented 3 months ago

In order to test this using the current version of the plugin that doesn't poll for available models, I set the model name directly in the database, and now it seems to stay in the list constantly, even after I update to your new code. So that's kinda cool but unintended. Should the plugin be removing old entries from the database when it fetches the lists from the API?

kognetiks commented 3 months ago

The implementation in version 1.9.5 has been expanded and groups the models by gpt, dall, and tts (with whisper coming soon). The model list is pulled directly from OpenAI using an API call to their server. If you add a model to the plugin DB, I'm not sure how it's surviving past one iteration of retrieval, since the retrieved list will only include it once. I'm thinking that if you save with one of the OpenAI models selected, the custom model will vanish from the list.

As for "we previously have been talking about those following things:" I think I know where to add that into version 1.9.5. I've added a line of code where I think it needs to go in the chatbot-call-gpt-api.php line 89-90 in version 1.9.5. I've push the working code to the repo. I'll be testing this later tonight and tomorrow for release to production shortly.

BAM-Software-Development commented 3 months ago

It might be worth testing different strings there, or adding a menu entry to set it (see the sketch further below). I just had a fine-tuned model where I tuned it for different behavior based on specific system prompts, so being able to send a specific string as the system role was required for my fine-tune to respond properly.

"We previously have been talking about the following things" just felt natural, for the types of input my model receives, as a prefix to the previous message history and the "RAG" input.
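If that string did become a setting, a minimal sketch of pulling it from an option instead of hard-coding it might look like this. The option name is hypothetical, and the variables are the ones from the earlier sketch:

```php
// Hypothetical setting - the option name here is made up for illustration
// and would need to be registered on the plugin's settings page.
$preamble = get_option(
    'chatbot_chatgpt_context_preamble',
    'We previously have been talking about the following things: '
);

// Reuses $prior_context and $user_message from the sketch above.
$messages[] = array(
    'role'    => 'user',
    'content' => $preamble . $prior_context . "\n" . $user_message,
);
```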

I haven't dug into the kflow code yet… and am not using it yet. It appears to have the same intent as the RAG apps that pull semantically related vectors from a vector DB… is that accurate?
