simonw opened 1 year ago
Here's an experimental `.chain()` method on `Model` which I'm about to remove - I think this should happen on `Conversation` from #85 instead.

https://github.com/simonw/llm/blob/b38b8314b980cbf4b1d2034809faffc2437ed608/llm/models.py#L217-L243
My latest thinking on this is that it can be part of the conversations mechanism. I'm thinking something like this:
```python
def search_wikipedia(q: str) -> str:
    "Search for wikipedia articles matching q and return their summary"
    ...

model = llm.get_model("4")
conversation = model.conversation(functions=[search_wikipedia])
chain_response = conversation.chain("Tell me about Agatha Christie")
print(chain_response.last.text())
```
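For `conversation(functions=[search_wikipedia])` to work, the library would need to turn a plain Python function into the JSON schema shape the OpenAI API expects. Here's a minimal sketch of that conversion using `inspect` and type hints - the helper name `function_to_schema` and the type-to-JSON-schema mapping are my assumptions for illustration, not part of llm:

```python
import inspect
from typing import get_type_hints

def function_to_schema(fn):
    # Hypothetical helper: derive an OpenAI-style function schema
    # from a Python function's signature and docstring.
    hints = get_type_hints(fn)
    hints.pop("return", None)
    # Assumed mapping from Python annotations to JSON schema types
    type_map = {str: "string", int: "integer", float: "number", bool: "boolean"}
    properties = {
        name: {"type": type_map.get(hint, "string")}
        for name, hint in hints.items()
    }
    return {
        "name": fn.__name__,
        "description": inspect.getdoc(fn) or "",
        "parameters": {
            "type": "object",
            "properties": properties,
            "required": list(properties),
        },
    }

def search_wikipedia(q: str) -> str:
    "Search for wikipedia articles matching q and return their summary"
    ...

schema = function_to_schema(search_wikipedia)
```

That would produce `{"name": "search_wikipedia", "description": "...", "parameters": {...}}`, matching the `functions=` payload shape shown in the API example below.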
So `.conversation()` on the OpenAI Chat model class takes an extra `functions=` argument. And `.chain()` takes a prompt and executes multiple prompts as part of handling those functions.

I'm very undecided about the `.chain()` mechanism so far though.
This is a partial example of what a `functions` call looks like, from https://til.simonwillison.net/gpt3/openai-python-functions-data-extraction
```python
import json

import openai

completion = openai.ChatCompletion.create(
    model="gpt-3.5-turbo-0613",
    messages=[
        {"role": "user", "content": "I went to London and then stopped in Istanbul and Utrecht."}
    ],
    functions=[
        {
            "name": "extract_locations",
            "description": "Extract all locations mentioned in the text",
            "parameters": {
                "type": "object",
                "properties": {
                    "locations": {
                        "type": "array",
                        "items": {
                            "type": "object",
                            "properties": {
                                "name": {
                                    "type": "string",
                                    "description": "The name of the location"
                                },
                                "country_iso_alpha2": {
                                    "type": "string",
                                    "description": "The ISO alpha-2 code of the country where the location is situated"
                                }
                            },
                            "required": ["name", "country_iso_alpha2"]
                        }
                    }
                },
                "required": ["locations"],
            },
        },
    ],
    function_call={"name": "extract_locations"}
)
choice = completion.choices[0]
encoded_data = choice.message.function_call.arguments
print(json.dumps(json.loads(encoded_data), indent=4))
```
Here's the code where that would need to happen, currently buried deep in the `.execute()` method of the OpenAI default plugin: https://github.com/simonw/llm/blob/cb41409e2b499b76ec37db178c3b0fc042d59129/llm/default_plugins/openai_models.py#L205-L227

I think `functions=` and `function_call=` become optional parameters on the `Chat.execute(...)` method there. Then they get called by a custom `Conversation.chain()` method.
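Under that split, `Conversation.chain()` could boil down to a loop: prompt the model, and if the response asks for a function call, run the function, feed its result back as the next prompt, and repeat until the model returns plain text. Here's a rough self-contained sketch against a stubbed model - the `Response` shape (`function_call` attribute, `text()` method) and the stub's behavior are assumptions for illustration, not llm's actual API:

```python
class Response:
    # Assumed response shape: either a function call request or final text
    def __init__(self, text=None, function_call=None):
        self._text = text
        self.function_call = function_call  # e.g. {"name": ..., "arguments": {...}}

    def text(self):
        return self._text


class StubModel:
    """Fake model: first requests search_wikipedia, then answers."""
    def __init__(self):
        self.calls = 0

    def execute(self, prompt, functions=None):
        self.calls += 1
        if self.calls == 1 and functions:
            return Response(function_call={
                "name": "search_wikipedia",
                "arguments": {"q": prompt},
            })
        return Response(text=f"Answer based on: {prompt}")


def chain(model, prompt, functions):
    # Loop: execute prompts, resolving requested function calls as we go
    registry = {fn.__name__: fn for fn in functions}
    responses = []
    while True:
        response = model.execute(prompt, functions=functions)
        responses.append(response)
        if response.function_call is None:
            return responses  # model produced a final text answer
        call = response.function_call
        # Run the requested function; its result becomes the next prompt
        prompt = registry[call["name"]](**call["arguments"])


def search_wikipedia(q: str) -> str:
    "Search for wikipedia articles matching q and return their summary"
    return f"Wikipedia summary for {q}"


responses = chain(StubModel(), "Agatha Christie", [search_wikipedia])
print(responses[-1].text())
```

The returned list of responses maps naturally onto the `chain_response.last` accessor in the earlier example: the final entry is the text answer, the earlier ones record the function-call round-trips.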
Originally posted by @simonw in https://github.com/simonw/llm/issues/85#issuecomment-1630178102