Closed barapa closed 4 months ago
Hi @barapa! We have seen OpenAI (and other provider) calls populate in the UI (there is a + icon for nested spans, and the OpenAI call lives inside that nested structure).
I just tested this with the following FastAPI example:
import os
from typing import Type

import logfire
from fastapi import FastAPI
from pydantic import BaseModel

from mirascope.logfire import with_logfire
from mirascope.openai import OpenAIExtractor

os.environ["OPENAI_API_KEY"] = "YOUR_OPENAI_API_KEY"

app = FastAPI()
logfire.configure()
logfire.instrument_fastapi(app)


class Book(BaseModel):
    title: str
    author: str


@with_logfire
class BookRecommender(OpenAIExtractor[Book]):
    extract_schema: Type[Book] = Book
    prompt_template = "Please recommend a {genre} book."

    genre: str


@app.post("/")
async def root(book_recommender: BookRecommender) -> Book:
    """Generates a book based on provided `genre`."""
    return await book_recommender.extract_async()
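For reference, here is how I exercised the endpoint. This is a usage sketch with assumptions: the file name (main.py), the uvicorn invocation, and the JSON body shape are mine, not taken from the thread.

```shell
# Assuming the example above is saved as main.py, start the server:
uvicorn main:app --reload

# Then POST a genre to the root endpoint (body shape assumed from the
# BookRecommender attributes):
curl -X POST http://localhost:8000/ \
  -H "Content-Type: application/json" \
  -d '{"genre": "fantasy"}'
```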
Ha yes, I didn't see the nested traces! Thank you.
When running an extraction, I don't see the assistant box filled out:
Is that your experience when using an extractor as well?
I think this may be because the response is a function call.
OK, this is not a bug. Closing.
Oh yeah you're totally right, it's due to function calling.
I will look into how we might be able to better support this (although will likely require additional handling with logfire).
I believe they are releasing a customizable UI kit that we could use here, so will circle back when that is released.
For now, the function calls should be present in the response data (just not rendered nicely like the messages UI).
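To illustrate why the assistant box renders empty, here is a minimal sketch of what a function-call response looks like. The dict below is hand-built to mirror the public OpenAI chat completion shape, not a captured response, and the `Book` function name just echoes the extractor above:

```python
import json

# Hypothetical chat completion whose reply is a function call rather than
# plain text (structure is illustrative of the OpenAI response shape):
response = {
    "choices": [
        {
            "message": {
                "role": "assistant",
                "content": None,  # empty: nothing for the assistant box to show
                "tool_calls": [
                    {
                        "type": "function",
                        "function": {
                            "name": "Book",
                            "arguments": '{"title": "Dune", "author": "Frank Herbert"}',
                        },
                    }
                ],
            }
        }
    ]
}

message = response["choices"][0]["message"]
assert message["content"] is None  # why the messages UI looks blank

# The structured data is still recoverable from the tool call arguments:
args = json.loads(message["tool_calls"][0]["function"]["arguments"])
print(args["title"])  # → Dune
```

So the extraction result is there in the span's response data; it just lives under `tool_calls` instead of `content`, which is the field the messages UI renders.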
They are officially working on this now :)
In my terminal logs, I see
So it looks like that is coming from Mirascope. But in the logfire UI, I don't see anything re: the OpenAI call in the span:
Have you seen the OpenAI call details populate in the logfire UI? Not sure if this is a logfire issue or Mirascope yet.
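One way to isolate whether this is a logfire or a Mirascope issue would be to instrument a bare OpenAI client and check whether its call details populate. A setup sketch using logfire's OpenAI integration (the model name and prompt are placeholders; this needs a valid API key to actually run):

```python
import logfire
from openai import OpenAI

logfire.configure()

# Instrument the OpenAI client directly so each call is recorded as a span;
# if the call details show up here but not via Mirascope, the gap is on the
# Mirascope side.
client = OpenAI()
logfire.instrument_openai(client)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model
    messages=[{"role": "user", "content": "Please recommend a fantasy book."}],
)
print(response.choices[0].message.content)
```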