GraesonB / ChatGPT-Wrapper-For-Unity

A ChatGPT API wrapper for Unity
MIT License

Extending ChatGPTConversation #7

Closed · p-v-z closed this 1 year ago

p-v-z commented 1 year ago

Hi there πŸ‘‹

Firstly, thanks for the neat wrapper :) I've forked it and added it as a submodule to my project DialogueDreamland and it works nicely. ❀️

I don't want to modify your code, and I'd like to keep syncing the latest changes, so I'd like to ask if you would consider a change.

I plan on using different instances of ChatGPTConversation, so I want to be able to set them up at runtime (instead of defining them in the Editor). I've created my own custom class that extends ChatGPTConversation, which allows me to set it up and check whether it is ready. It looks like this:

```csharp
public class CustomChatGPT : ChatGPTConversation
{
    public bool Ready => _selectedModel != null;

    public void SetupGPT(string apiKey, string chatBotName, string initialPrompt, int maxTokens, float temperature)
    {
        _apiKey = apiKey;
        _chatbotName = chatBotName;
        _initialPrompt = initialPrompt;
        _maxTokens = maxTokens;
        _temperature = temperature;
    }
}
```

CustomChatGPT.cs

This approach requires changing those variables to protected, though. I'm not sure if this is the best approach, but any variation that allows me to do this would be great.
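For reference, here is a minimal sketch of what that change could look like in your base class; the field names are assumed from my snippet above (I haven't copied them from your repo), and the rest of the class would stay untouched:

```csharp
using UnityEngine;

// Sketch only: field names are assumed from the subclass snippet above, not taken from the repo.
// Marking the serialized fields protected instead of private would let a subclass assign them at runtime.
public class ChatGPTConversation : MonoBehaviour
{
    [SerializeField] protected string _apiKey;
    [SerializeField] protected string _chatbotName;
    [SerializeField, TextArea] protected string _initialPrompt;
    [SerializeField] protected int _maxTokens;
    [SerializeField] protected float _temperature;

    // ...rest of the wrapper unchanged
}
```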

Please let me know what you think πŸ™

GraesonB commented 1 year ago

Hi there :-) Absolutely a must-have feature for this wrapper, I'll work on it tonight.

GraesonB commented 1 year ago

Alright, so a couple of things:

- The new ChatGPT API does not take the temperature or max_tokens properties; those only exist in the wrapper to support the older GPT completion models, so if you're using ChatGPT you don't need to worry about either of them.
- Even if you are using the older GPT models, I don't think it would be useful to change either of those at runtime. They should probably just be set-and-forget on your prefabs.
- The _chatbotName variable also only applies to the older models. As far as I can tell, I have to use 'role':'assistant' in my requests for ChatGPT's response messages, so there's no point in adding a name (you'll want to tell the model what its name is in the _initialPrompt).
- The API key will not change at runtime, so you don't need to worry about that either.

All of this to say, _initialPrompt is the only thing that needs to be changed at runtime. The initial prompt sets the entire context for the conversation, and also determines what character ChatGPT will be playing and what personality it has.

I just added a ResetChat method to the ChatGPTConversation script. It takes a string argument, starts a new blank conversation, and sets _initialPrompt to that string.

```csharp
public void ResetChat(string initialPrompt)
{
    switch (_model)
    {
        case Model.ChatGPT:
            _chat = new Chat(initialPrompt);
            break;
        default:
            _prompt = new Prompt(_chatbotName, initialPrompt);
            break;
    }
}
```

Essentially, just subscribe that method to whatever event is setting up your conversation at runtime and it should be good to go. I added demo buttons to the UI as well.
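For example, here's a rough sketch of that wiring, assuming a hypothetical bootstrap component with a UI Button and a persona prompt of your choosing (the component and field names are illustrative, not part of the wrapper):

```csharp
using UnityEngine;
using UnityEngine.UI;

// Illustrative only: grabs a ChatGPTConversation reference and resets it with a persona
// prompt when the player presses a "start conversation" button.
public class ConversationBootstrap : MonoBehaviour
{
    [SerializeField] private ChatGPTConversation conversation;
    [SerializeField] private Button startButton;
    [SerializeField, TextArea] private string personaPrompt =
        "You are Mira, a cheerful innkeeper. Stay in character and keep replies short.";

    private void Awake()
    {
        // Subscribe ResetChat to the UI event that kicks off the conversation at runtime.
        startButton.onClick.AddListener(() => conversation.ResetChat(personaPrompt));
    }
}
```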

Let me know if this is sufficient for what you had in mind, or if you require something else :-)

p-v-z commented 1 year ago

Thank you. I've updated mine to use ResetChat and got rid of max_tokens and temperature, and it works well.

This is probably a bit of an edge case, but I'm going to have players enter their own API keys before they can start playing (so that everything isn't tied to my account). I've remedied it with a one-liner that I don't mind maintaining in my forked repo: `protected void SetApiKey(string apiKey) => _apiKey = apiKey;`
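In case it helps anyone else, here's a rough sketch of how I'm wiring that up in my fork. The class and method names (PlayerKeyedChatGPT, BeginWithPlayerKey) are purely illustrative, and it assumes _apiKey is protected in my fork and ResetChat exists as described above:

```csharp
using UnityEngine;

// Sketch only: names are illustrative, not from the repo or the fork.
// Assumes _apiKey is protected so the subclass can assign it.
public class PlayerKeyedChatGPT : ChatGPTConversation
{
    // The one-liner maintained in the fork.
    protected void SetApiKey(string apiKey) => _apiKey = apiKey;

    // Called from a UI "Start" button once the player has pasted their key into an input field.
    public void BeginWithPlayerKey(string apiKey, string initialPrompt)
    {
        SetApiKey(apiKey);
        ResetChat(initialPrompt);
    }
}
```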

Thanks for responding so quickly πŸ™Œ I'll let you know if I encounter more struggles. Cheers 🍻