MadcowD / ell

A language model programming library.
http://docs.ell.so/
MIT License

api_params and "tool_choice" #227

Closed: JTCorrin closed this issue 1 month ago

JTCorrin commented 1 month ago

Hey,

Just wanted to raise this issue I'm facing with the additional api_param (tool_choice) I'm sending through:

@ell.complex(model="gpt-4o-mini", client=OpenAI(api_key=os.environ.get("OPENAI_API_KEY")), tools=[write_file], tool_choice="required")
def execute_step(instructions: str):
    ...[rest of code]

I'd expect this to force the tool call, but that doesn't happen. Python's not my strong suit, so I might be passing the parameter wrong? I've also tried passing it via lm_params(dict(tool_choice="required")) with no luck.

Thanks for any assistance

MadcowD commented 1 month ago

it works!

import ell
import requests
from bs4 import BeautifulSoup
from pydantic import Field

ell.init(verbose=True, store="./logdir", autocommit=True)

@ell.tool()
def get_html_content(
    url: str = Field(description="The URL to get the HTML content of. Never include the protocol (like http:// or https://)"),
):
    """Get the HTML content of a URL."""
    response = requests.get("https://" + url)
    soup = BeautifulSoup(response.text, 'html.parser')
    # Return only the first 100 characters of the page text.
    return soup.get_text()[:100]

@ell.complex(model="gpt-4o-mini", tools=[get_html_content], tool_choice="required")
def summarize_website(website: str) -> str:
    """Under no circumstances use the tool. Just guess."""
    return f"Tell me what's on {website}"

if __name__ == "__main__":
    output = summarize_website("langchain's website")
    print(output)
    if output.tool_calls:
        tool_results = output.call_tools_and_collect_as_message()
        print(tool_results)

is working for me.

I think what you're not doing is calling the tool! Once the model says it wants to call a tool, you get a tool call object:

>>> output
Message(content=[ContentBlock(tool_call=ToolCall(...))])

You have to actually call it, either by doing:

>>> call = output.tool_calls[0]
>>> call()
'<html> ..... '

Or you can package it up in a message to pass back to OpenAI!

>>> response_message = output.call_tools_and_collect_as_message()
>>> print(response_message)
Message(content=[ContentBlock(tool_result=ToolResult(content=json.dumps(<whatever the tool call returned>)))])
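
If you want the model to actually use the tool result, the round trip looks roughly like the sketch below. This is illustrative only: the chat-style LMP website_chat and the history plumbing are my own example, not part of the snippet above, and I've left tool_choice="required" off since forcing a tool call on every turn would also force one after the results come back.

import ell

# Sketch: an LMP that takes the running message history, so tool results
# can be fed back to the model on the next turn.
@ell.complex(model="gpt-4o-mini", tools=[get_html_content])
def website_chat(message_history):
    # Prepend a system prompt and replay the conversation so far.
    return [ell.system("You summarize websites for the user.")] + message_history

history = [ell.user("Tell me what's on langchain's website")]
output = website_chat(history)
if output.tool_calls:
    # Run the tool(s), append both the assistant turn and the tool results,
    # then call the LMP again so the model can answer using them.
    tool_results = output.call_tools_and_collect_as_message()
    history += [output, tool_results]
    output = website_chat(history)
print(output)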

JTCorrin commented 3 weeks ago

Thanks for the response. This is still not working for me:

@ell.complex(model="gpt-4o-mini", client=OpenAI(api_key=os.environ.get("OPENAI_API_KEY")), tools=[write_file], tool_choice="required")
def execute_step(name: str, instructions: str, step: str, task: str):
  ...[code]
  return [
        ell.system(instructions + f"\n\nClarifications: {formatted_clarifications}"),
        ell.user(step)
    ]

Where this is called:

output = execute_step(agent.name, agent.instructions, step, task)

Printing the output variable gives:

Message(role='assistant', content=[ContentBlock(text='After evaluating both genres...', image=None, audio=None, tool_call=None, parsed=None, tool_result=None)])

My expectation is that there would be a tool call here for me to execute in the code that follows directly after:

if output.tool_calls:
    tool_message = output.call_tools_and_collect_as_message()
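
For completeness, this is how I'm checking what came back (just iterating the content blocks from the repr above; text is populated and tool_call is always None):

for block in output.content:
    print(block.text, block.tool_call)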

JTCorrin commented 3 weeks ago

[screenshot of the ell source where tool_choice was set]

tool_choice was hardcoded to "auto". Fixed in PR #283.
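
For anyone hitting this before the fix: the failure mode is the kind sketched below, where the value given to the decorator never reaches the request. This is only an illustration of the bug class, not ell's actual source; call_model is a hypothetical name.

from openai import OpenAI

client = OpenAI()

def call_model(model, messages, tools, api_params):
    return client.chat.completions.create(
        model=model,
        messages=messages,
        tools=tools,
        # Bug: tool_choice was effectively hardcoded to "auto" here.
        # Fix (PR #283): forward the caller's value, defaulting to "auto".
        tool_choice=api_params.get("tool_choice", "auto"),
    )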