MadcowD / ell

A language model programming library.
http://docs.ell.so/
MIT License

Tool calls break along with response format #232

Open isacarnekvist opened 1 month ago

isacarnekvist commented 1 month ago

If I provide both tools and a pydantic model as response_format, the program breaks with

Traceback (most recent call last):
  File "/Users/isacarnekvist/workspace/ell/test.py", line 34, in <module>
    message = f(messages)
              ^^^^^^^^^^^
  File "/Users/isacarnekvist/workspace/ell/src/ell/lmp/_track.py", line 118, in tracked_func
    else func_to_track(*fn_args, _invocation_origin=invocation_id, **fn_kwargs, )
         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/isacarnekvist/workspace/ell/src/ell/lmp/complex.py", line 72, in model_call
    (result, final_api_params, metadata) = provider.call(ell_call, origin_id=_invocation_origin, logger=_logger if should_log else None)
                                           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/isacarnekvist/workspace/ell/src/ell/provider.py", line 127, in call
    provider_resp = call(**final_api_call_params)
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/isacarnekvist/Library/Caches/pypoetry/virtualenvs/ell-ai-obRGDTd7-py3.12/lib/python3.12/site-packages/openai/resources/beta/chat/completions.py", line 109, in parse
    _validate_input_tools(tools)
  File "/Users/isacarnekvist/Library/Caches/pypoetry/virtualenvs/ell-ai-obRGDTd7-py3.12/lib/python3.12/site-packages/openai/lib/_parsing/_completions.py", line 53, in validate_input_tools
    raise ValueError(
ValueError: `get_user_name` is not strict. Only `strict` function tools can be auto-parsed

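For reference, the error appears to be raised client-side by the OpenAI Python SDK before any request is sent: the traceback ends inside client.beta.chat.completions.parse, whose _validate_input_tools check requires every function tool to be marked "strict". A minimal sketch of what that parse helper accepts on its own, assuming the SDK exposes openai.pydantic_function_tool for building strict tool schemas as in recent versions (plain OpenAI SDK code, not ell):

from openai import OpenAI, pydantic_function_tool
from pydantic import BaseModel

class GetUserName(BaseModel):
    """Arguments for a user-name lookup tool (none needed)."""

class User(BaseModel):
    name: str
    favorite_ice_cream_flavor: str

client = OpenAI()

# parse() accepts tools and response_format together, but only when each
# function tool carries "strict": true; pydantic_function_tool emits such a schema.
completion = client.beta.chat.completions.parse(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Greet the user by name."}],
    tools=[pydantic_function_tool(GetUserName)],
    response_format=User,
)
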
Code to reproduce the issue:

import ell

from pydantic import BaseModel

# Desired schema for the final LLM response.
class User(BaseModel):
    name: str
    favorite_ice_cream_flavor: str

@ell.tool()
def get_user_name():
    return "Isac"

@ell.tool()
def get_ice_cream_flavors():
    return ["Vanilla", "Strawberry", "Coconut"]

# Passing both tools and response_format here is what triggers the error above.
@ell.complex(model="gpt-4o", tools=[get_user_name, get_ice_cream_flavors], response_format=User)
def f(message_history: list[ell.Message]) -> list[ell.Message]:
    return [
        ell.system(
            "You are a helpful assistant that greets the user and asks them what ice cream flavor they want amongst those available. Call one tool at a time"
        )
    ] + message_history

if __name__ == "__main__":
    ell.init("ell-logs/")
    messages = []
    # Drive the conversation until the model stops calling tools.
    while True:
        message = f(messages)
        messages.append(message)

        print("message.tool_calls:", message.tool_calls)
        if message.tool_calls:
            tool_call_response = message.call_tools_and_collect_as_message(
                parallel=True, max_workers=2
            )
            print("tool_call_response:", tool_call_response)
            messages.append(tool_call_response)
        else:
            break

    for message in messages:
        print()
        print(message)
MadcowD commented 1 month ago

I wasn't aware tool calling was supported along with this. Let me see what I can do!

isacarnekvist commented 1 month ago

I might of course be misunderstanding how to use ell, but maybe it's clear what I'm trying to achieve?

I don't want to enforce the schema on the responses from the tools, or on the calls to the tools, but only on the final output from the LLM. You might have intended another usage pattern here?
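
In case it clarifies the intent, here is a rough, untested sketch of the split I have in mind, reusing only the pieces from the repro above (assuming one @ell.complex LMP can take the tools alone and a second one the response_format alone):

@ell.complex(model="gpt-4o", tools=[get_user_name, get_ice_cream_flavors])
def converse(message_history: list[ell.Message]) -> list[ell.Message]:
    # Tool-calling turns: no response_format here.
    return [ell.system("Greet the user and ask which ice cream flavor they want.")] + message_history

@ell.complex(model="gpt-4o", response_format=User)
def finalize(message_history: list[ell.Message]) -> list[ell.Message]:
    # Only this last call enforces the User schema.
    return [ell.system("Summarize the conversation as a User.")] + message_history

messages = []
while True:
    message = converse(messages)
    messages.append(message)
    if not message.tool_calls:
        break
    messages.append(
        message.call_tools_and_collect_as_message(parallel=True, max_workers=2)
    )

final = finalize(messages)  # structured User output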

RoyNijhuis commented 1 month ago

I am facing the exact same issue. I want the final response from the LLM to be formatted, but the intermediate tool calls do not need to adhere to this format.

MadcowD commented 1 month ago

Oh I see, one sec. Will push a fix.
