
Continuing conversation with context from AIChat output_schema #79

Open · rasmi opened this issue 1 year ago

rasmi commented 1 year ago

In OpenAI's official docs on function calling, they give a pattern of:

  1. Call the model
  2. Get function parameters as structured data
  3. Call function
  4. Call the model again, this time including context from the function call, to get a natural language response.
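
For reference, steps 1-3 look roughly like this with the same pre-v1 openai package as the snippet quoted further down (get_current_weather is just a stand-in function, not copied from the docs):

    import json
    import openai

    # Stand-in local function the model can request.
    def get_current_weather(location):
        return json.dumps({"location": location, "temperature": "72F"})

    messages = [{"role": "user", "content": "What's the weather in Boston?"}]

    # Step 1: call the model, advertising the available function
    response = openai.ChatCompletion.create(
        model="gpt-3.5-turbo-0613",
        messages=messages,
        functions=[{
            "name": "get_current_weather",
            "description": "Get the current weather for a location",
            "parameters": {
                "type": "object",
                "properties": {"location": {"type": "string"}},
                "required": ["location"],
            },
        }],
    )
    response_message = response["choices"][0]["message"]

    # Step 2: the function parameters come back as structured JSON
    arguments = json.loads(response_message["function_call"]["arguments"])

    # Step 3: call the function with those parameters
    function_response = get_current_weather(arguments["location"])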

With simpleaichat's AIChat class, I am doing something that looks like:

  1. Call the model
  2. Get structured data out using output_schema

At this point, I would like to call the model again as in Step 4 above, except with the structured data from AIChat's output_schema as context, in order to produce a natural language response.
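
Concretely, that part looks something like this (EventMetadata is a stand-in for my actual schema):

    from pydantic import BaseModel, Field
    from simpleaichat import AIChat

    # Stand-in pydantic schema for the structured output
    class EventMetadata(BaseModel):
        """Metadata about an event mentioned by the user."""
        name: str = Field(description="Name of the event")
        city: str = Field(description="City where the event takes place")

    ai = AIChat(console=False, model="gpt-3.5-turbo-0613")
    # Steps 1-2: one call returns the structured data as a dict
    structured = ai("Tell me about the next PyCon", output_schema=EventMetadata)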

So:

  1. Call the model
  2. Get structured data out using output_schema
  3. ???
  4. Call the model again, this time including structured data from Step 2 as context, to get a natural language response.

Is there a preferred/recommended way to do this using the AIChat class? Ideally, I could do this in the very same conversation/instance rather than creating a new conversation/instance to handle context.

In the OpenAI example, they do the following:

        # Step 4: send the info on the function call and function response to GPT
        messages.append(response_message)  # extend conversation with assistant's reply
        messages.append(
            {
                "role": "function",
                "name": function_name,
                "content": function_response,
            }
        )  # extend conversation with function response
        second_response = openai.ChatCompletion.create(
            model="gpt-3.5-turbo-0613",
            messages=messages,
        )  # get a new response from GPT where it can see the function response
        return second_response

Is it recommended to (for example) send a message to the model using the function role, even if no such functions were defined? In this case, I'm effectively using the AIChat instance itself with output_schema as a function. I would just like to get a natural language response in addition to the structured output_schema response.
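
For example, something like the following, assuming ChatMessage accepts the name field used by the function role and that the session's message list can be appended to directly (I haven't verified either):

    from simpleaichat.models import ChatMessage

    session = ai.get_session()
    # Assumption: ChatMessage supports the optional "name" field and
    # session.messages can be mutated directly.
    session.messages.append(
        ChatMessage(
            role="function",
            name="extract_metadata",  # hypothetical name for the output_schema "function"
            content=str(structured),
        )
    )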

pyrotank41 commented 1 year ago

How about using AIChat without the output_schema as the input to the function call?

rasmi commented 1 year ago

I think the underlying goal here is to have one chat session whose history consists exclusively of natural language inputs and outputs, but that can still produce structured input/output "under the hood" for any given message.

This is reminiscent of what gen_with_tools does, except that rather than having the library manage the use of tools, I would like it to run a specific set of functions on every user message and return structured context outputs, then produce unstructured assistant outputs using the structured context from those functions (like a "hook").

The approach used in gen_with_tools is to make one call to extract the structured output (and call the tool), update the system prompt to force use of the new context, then make a second call that includes the context alongside the original prompt to produce a response (with save_messages=False). This seems like a valid approach -- just adding "Context: <context>, User: <original prompt>" as the user prompt in a second call to the model. I suppose I could do this manually as in gen_with_tools, but a simpler way to do this directly in AIChat would be helpful.

Maybe something like this would suffice for now (non-functional code, just sketching it out):

    from simpleaichat import AIChat
    from simpleaichat.models import ChatMessage

    assistant = AIChat(...)
    prompt = <user input>

    # Initial model call to get structured output/context,
    # without saving the structured exchange to history
    context = assistant(prompt, output_schema=..., save_messages=False)
    # <call other functions to act on context>

    # Create a natural language response that includes the context
    prompt_with_context = f"Context: {context}\n\nUser: {prompt}"
    response = assistant(prompt_with_context, save_messages=False)

    # Save only the original message and final response to history
    user_message = ChatMessage(role="user", content=prompt)
    assistant_message = ChatMessage(role="assistant", content=response)
    assistant.get_session().add_messages(user_message, assistant_message)
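
Wrapped up, the whole flow could live in a small helper along these lines (generate_with_context is hypothetical, not an existing simpleaichat API):

    def generate_with_context(assistant, prompt, output_schema):
        """Hypothetical helper: structured call, then natural language call,
        saving only the original prompt and final response to history."""
        context = assistant(prompt, output_schema=output_schema, save_messages=False)
        prompt_with_context = f"Context: {context}\n\nUser: {prompt}"
        response = assistant(prompt_with_context, save_messages=False)
        assistant.get_session().add_messages(
            ChatMessage(role="user", content=prompt),
            ChatMessage(role="assistant", content=response),
        )
        return response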