minimaxir / simpleaichat

Python package for easily interfacing with chat apps, with robust features and minimal code complexity.
MIT License

`AIChat` doesn't save messages if provided with an `output_schema` (despite setting `save_messages=True`) #30

Open keyboardAnt opened 1 year ago

keyboardAnt commented 1 year ago

I'm able to reproduce the issue. Here's an example:

```python
from typing import Literal

from pydantic import BaseModel, Field
from simpleaichat import AIChat

api_key = "sk-..."  # your OpenAI API key

params = {
    "temperature": 0,
    "max_tokens": 100,
    "top_p": 1,
    "frequency_penalty": 0,
    "presence_penalty": 0,
}
ai = AIChat(api_key=api_key, console=False, model="gpt-3.5-turbo-0613", params=params, save_messages=True)

class get_weather(BaseModel):
    """
    Get the weather for a city.
    """
    city: str = Field(default="New York", title="City", description="City to get weather for")
    units: Literal["imperial", "metric"] = Field(default="metric", title="Units", description="Units of measurement")

response_structured = ai("What's the weather like in New York?", output_schema=get_weather)
```

Then, `ai.get_session().messages == []`. 🥲

(Note: This issue is with the latest commit, d8f04a5c0414356337beb0c392236ee48c38b865)

minimaxir commented 1 year ago

This is intentional. From the release notes: https://github.com/minimaxir/simpleaichat/releases/tag/v0.2.0

> In all cases, no messages are saved when using schema to prevent unintended behavior. You will have to manage the intermediate output yourself for the time being, if you want to chain inputs.

I need to do a lot more testing to see if schema output can be used in future requests, because OpenAI's documentation doesn't cover it.
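For what it's worth, "managing the intermediate output yourself" can be as small as keeping your own message list and re-inserting the structured reply before the next call. A minimal sketch, assuming plain OpenAI-style message dicts (`record_structured_exchange` and `history` are hypothetical names, not part of simpleaichat's API):

```python
import json

def record_structured_exchange(messages, user_prompt, structured_output):
    """Append a user prompt and the model's structured reply (serialized
    as JSON) to a plain list of chat messages, so later calls can be
    given the full context manually."""
    messages.append({"role": "user", "content": user_prompt})
    messages.append({"role": "assistant", "content": json.dumps(structured_output)})
    return messages

history = []
record_structured_exchange(
    history,
    "What's the weather like in New York?",
    {"city": "New York", "units": "metric"},  # e.g. response_structured from above
)
```

You would then pass `history` (or feed it back into a session) yourself when chaining inputs.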

keyboardAnt commented 1 year ago

> This is intentional. From the release notes: https://github.com/minimaxir/simpleaichat/releases/tag/v0.2.0
>
> In all cases, no messages are saved when using schema to prevent unintended behavior. You will have to manage the intermediate output yourself for the time being, if you want to chain inputs.
>
> I need to do a lot more testing to see if schema output can be used in future requests, because OpenAI's documentation doesn't cover it.

What do you think about cell `In [8]` in their How_to_call_functions_with_chat_models.ipynb?
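For readers without the notebook open: the pattern in that cookbook is that the assistant's function call and the executed function's result are both appended to the conversation before the follow-up request. A sketch of the message shapes involved (the weather values are made up for illustration):

```python
import json

messages = [
    {"role": "user", "content": "What's the weather like in New York?"},
]

# The assistant replies with a function_call instead of content.
messages.append({
    "role": "assistant",
    "content": None,
    "function_call": {
        "name": "get_weather",
        "arguments": json.dumps({"city": "New York", "units": "metric"}),
    },
})

# The application executes the function and feeds the result back as a
# "function"-role message; the next API call then sees the whole exchange.
messages.append({
    "role": "function",
    "name": "get_weather",
    "content": json.dumps({"temperature": 21, "units": "metric"}),
})
```

This is what suggests schema/function output *can* participate in later requests, which is the implication discussed below.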

minimaxir commented 1 year ago

Hmm, fair counterpoint; I missed that implication from the example.

The concern is that you would then have to pass the same function/schema with every ChatGPT API call for the rest of the session.

Maybe it's worth adding a `functions` field to `ChatGPTSession` to log the functions used as `input_schema`, although this potentially creates a new problem with session saving/loading...
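One way to sidestep the saving/loading concern would be to log the functions as plain JSON-schema dicts rather than Pydantic classes, so the session stays serializable. A hypothetical sketch (`ChatGPTSessionSketch` is a stand-in, not the real `ChatGPTSession`):

```python
import json
from dataclasses import dataclass, field
from typing import Any, Dict, List

@dataclass
class ChatGPTSessionSketch:
    """Stand-in for a session object with a hypothetical `functions`
    field that logs every schema used as input_schema/output_schema."""
    messages: List[Dict[str, Any]] = field(default_factory=list)
    functions: List[Dict[str, Any]] = field(default_factory=list)

# A function schema in the OpenAI function-calling format.
weather_schema = {
    "name": "get_weather",
    "description": "Get the weather for a city.",
    "parameters": {
        "type": "object",
        "properties": {
            "city": {"type": "string"},
            "units": {"type": "string", "enum": ["imperial", "metric"]},
        },
    },
}

session = ChatGPTSessionSketch()
session.functions.append(weather_schema)

# Storing plain dicts means the whole session round-trips through JSON,
# which is exactly what session saving/loading needs.
saved = json.dumps({"messages": session.messages, "functions": session.functions})
```

Subsequent API calls in the session would then re-send `session.functions` alongside the messages.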