pydantic / logfire

Uncomplicated Observability for Python and beyond! 🪵🔥
https://logfire.pydantic.dev/docs/
MIT License

LLM tool call not shown for streamed response #542

Open jackmpcollins opened 7 hours ago

Description

The Logfire UI nicely shows the tool call made by an LLM for non-streamed responses.

[screenshot: non-streamed response with the tool call shown in the Assistant box]

But for streamed responses the Assistant box is empty.

[screenshot: streamed response with an empty Assistant box]

Code to reproduce

# Test logfire streamed responses

from openai import Client

import logfire

logfire.configure()
logfire.instrument_openai()

client = Client()

response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Create a Superhero named Monkey Boy."}],
    stream=True,
    stream_options={"include_usage": True},
    tool_choice={"type": "function", "function": {"name": "return_superhero"}},
    tools=[
        {
            "type": "function",
            "function": {
                "name": "return_superhero",
                "parameters": {
                    "properties": {
                        "name": {"title": "Name", "type": "string"},
                        "age": {"title": "Age", "type": "integer"},
                        "power": {"title": "Power", "type": "string"},
                        "enemies": {
                            "items": {"type": "string"},
                            "title": "Enemies",
                            "type": "array",
                        },
                    },
                    "required": ["name", "age", "power", "enemies"],
                    "type": "object",
                },
            },
        },
    ],
)
for chunk in response:
    print(chunk)
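For context on what the instrumentation would need to reconstruct: with streaming, the tool-call name and its JSON arguments arrive as fragments spread across chunks, so they have to be accumulated before anything can be displayed. Below is a minimal client-side sketch of that accumulation, following the shape of OpenAI streaming chunks; the function name is mine, and this is illustrative, not Logfire's actual implementation.

```python
# Sketch: merge streamed tool-call deltas into complete tool calls.
# Assumes OpenAI-style chunks: chunk.choices[0].delta.tool_calls is a
# list of deltas, each with an index and function.name/function.arguments
# fragments. (accumulate_tool_calls is a hypothetical helper.)
import json


def accumulate_tool_calls(chunks):
    """Return a list of {"name": ..., "arguments": ...} tool calls."""
    calls = {}  # tool-call index -> {"name": str, "arguments": str}
    for chunk in chunks:
        if not chunk.choices:
            continue  # e.g. the final usage-only chunk when include_usage is set
        delta = chunk.choices[0].delta
        for tc in delta.tool_calls or []:
            entry = calls.setdefault(tc.index, {"name": "", "arguments": ""})
            if tc.function.name:
                entry["name"] = tc.function.name  # name arrives once, up front
            if tc.function.arguments:
                entry["arguments"] += tc.function.arguments  # JSON arrives in pieces
    return [
        {"name": c["name"], "arguments": json.loads(c["arguments"])}
        for c in calls.values()
    ]
```

Something along these lines is presumably what the non-streamed code path already gets for free from the complete response object, and what the streamed path would need to do before rendering the Assistant box.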

Related (closed) issue: https://github.com/pydantic/logfire/issues/54

Python, Logfire & OS Versions, related packages (not required)

logfire="0.50.1"
platform="macOS-15.0.1-arm64-arm-64bit"
python="3.10.12 (main, Jul 15 2023, 09:54:16) [Clang 14.0.3 (clang-1403.0.22.14.1)]"
[related_packages]
requests="2.32.3"
pydantic="2.8.2"
openai="1.52.0"
protobuf="4.25.3"
rich="13.7.1"
tomli="2.0.1"
executing="2.0.1"
opentelemetry-api="1.25.0"
opentelemetry-exporter-otlp-proto-common="1.25.0"
opentelemetry-exporter-otlp-proto-http="1.25.0"
opentelemetry-instrumentation="0.46b0"
opentelemetry-proto="1.25.0"
opentelemetry-sdk="1.25.0"
opentelemetry-semantic-conventions="0.46b0"
alexmojaki commented 5 hours ago

I'm AFK but this sounds familiar and I see an old logfire version, can you please check if this still happens with the latest versions of logfire and openai?