rasmi opened this issue 1 year ago
How about you use AIChat without the `output_schema` as the input to the function call?
I think the underlying goal here is to have one chat session that exclusively consists of natural language inputs and outputs in its history, but is still able to produce structured input/output "under the hood" for any given message.
As an example, for each user message the session would, under the hood:

- <produce structured output of `Fruit`, which is then used to look up prices>
- <update a `Cart` object with fruit, quantity, and total cost>
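For concreteness, here is a hypothetical sketch of those structures (the `Fruit` and `Cart` models and their fields are illustrative assumptions, not part of simpleaichat):

```python
from typing import List

from pydantic import BaseModel, Field

# Illustrative schemas only; names and fields are assumptions.
class Fruit(BaseModel):
    """Structured output extracted from a natural language message."""
    name: str = Field(description="Name of the fruit the user wants")
    quantity: int = Field(description="How many the user wants")

class Cart(BaseModel):
    """Running state updated 'under the hood' after each message."""
    items: List[Fruit] = []
    total_cost: float = 0.0
```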
This is reminiscent of what `gen_with_tools` does, except that rather than having the library manage the use of tools, I would like it to run a specific set of functions on every user message and return structured context outputs, then produce unstructured Assistant outputs using the structured context from those functions (like a "hook").
The approach used in `gen_with_tools` is to make one call to extract the structured output (and call the tool), then update the system prompt to force use of the new context, then make a second call that includes the context alongside the original prompt to produce a response (with `save_messages=False`). It seems this is a valid approach -- just adding "Context: <context>, User: <original prompt>" as the user prompt in a second call to the model. I suppose I could do this manually as in `gen_with_tools`, but having a simpler way to do this directly in `AIChat` would be helpful.
Maybe something like this would suffice for now (sketch code, not tested; the imports and the `Fruit` schema from above are illustrative):
```python
import simpleaichat
from simpleaichat.models import ChatMessage  # assuming this import path

assistant = simpleaichat.AIChat(...)
prompt = "<user input>"

# Initial model call to get structured output/context
# (using the illustrative Fruit schema from above)
context = assistant(prompt, output_schema=Fruit, save_messages=False)
# <call other functions to act on context>

# Create natural language response including context
prompt_with_context = f"Context: {context}\n\nUser: {prompt}"
response = assistant(prompt_with_context, save_messages=False)

# Save only the original message and final response to the history
user_message = ChatMessage(role="user", content=prompt)
assistant_message = ChatMessage(role="assistant", content=response)
assistant.get_session().add_messages(user_message, assistant_message)
```
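With `save_messages=False` on both calls, neither the structured extraction nor the context-augmented prompt enters the saved history; the session ends up containing only the natural language exchange, which matches the goal described above.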
In OpenAI's official docs on function calling, they give a pattern of:

1. Call the model with the user query and a set of functions defined in the `functions` parameter.
2. The model can choose to call a function; if so, it responds with a stringified JSON object giving the function name and arguments.
3. Parse the JSON in your code and call the function with the provided arguments.
4. Call the model again, appending the function response as a new message, and let the model summarize the results back to the user.
With simpleaichat's `AIChat` class, I am doing something like the following (a rough sketch, reusing the illustrative `Fruit` schema from above):
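```python
# One-off structured extraction; Fruit is the illustrative schema above.
structured = assistant("I'd like two apples.", output_schema=Fruit)
# structured is a dict matching the Fruit schema,
# e.g. {"name": "apple", "quantity": 2}
```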
At this point, I would like to call the model again as in Step 4 above, except with the structured data from `AIChat`'s `output_schema` as context, in order to produce a natural language response. So:
Is the approach sketched above the preferred/recommended way to do this using the `AIChat` class? Ideally I could do this in the very same conversation/instance rather than creating a new conversation/instance to handle context.

In the OpenAI example, they do the following:
Is it recommended to (for example) send a message to the model using the `function` role, even if no such functions were defined? In this case, I'm effectively using the `AIChat` instance itself with `output_schema` as a function. I would just like to get a natural language response in addition to the structured `output_schema` response.