BerriAI / litellm

Python SDK, Proxy Server (LLM Gateway) to call 100+ LLM APIs in OpenAI format - [Bedrock, Azure, OpenAI, VertexAI, Cohere, Anthropic, Sagemaker, HuggingFace, Replicate, Groq]
https://docs.litellm.ai/docs/

[Bug]: Langfuse logger doesn't work for tts models #4330

Closed · Manouchehri closed this issue 1 month ago

Manouchehri commented 3 months ago

What happened?

Following the text-to-speech docs (https://litellm.vercel.app/docs/text_to_speech), the request below returns audio, but the Langfuse logging layer throws an error:

curl "$OPENAI_API_BASE/audio/speech" \
  -H "Authorization: Bearer $OPENAI_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "tts-1",
    "input": "The quick brown fox jumped over the lazy dog.",
    "voice": "alloy"
  }' -v | mpv -
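For reference, the same request can be built from Python with only the standard library. This is a sketch, not LiteLLM's own client code; it assumes the same `OPENAI_API_BASE` and `OPENAI_API_KEY` environment variables as the curl call, pointing at the LiteLLM proxy:

```python
import json
import os
import urllib.request

# Same payload as the curl reproduction above.
payload = {
    "model": "tts-1",
    "input": "The quick brown fox jumped over the lazy dog.",
    "voice": "alloy",
}

# Base URL and key are assumptions taken from the curl example's env vars.
base = os.environ.get("OPENAI_API_BASE", "http://localhost:4000")
req = urllib.request.Request(
    url=base + "/audio/speech",
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Authorization": "Bearer " + os.environ.get("OPENAI_API_KEY", ""),
        "Content-Type": "application/json",
    },
    method="POST",
)
# urllib.request.urlopen(req) would then stream raw audio bytes back.
```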

Relevant log output

LiteLLM:ERROR: langfuse.py:232 - Langfuse Layer Error(): Exception occured - cannot access local variable 'output' where it is not associated with a value
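For anyone unfamiliar with this Python error class, here is a minimal illustration of the pattern behind it (not LiteLLM's actual logger code): `output` is only assigned on branches that handle known response types, so a TTS response that matches no branch leaves the variable unbound when it is read later.

```python
def log_event(response_type, response):
    # `output` is bound only on the branches below; a hypothetical
    # "speech" response type falls through without assigning it.
    if response_type == "completion":
        output = response["choices"][0]["message"]
    elif response_type == "embedding":
        output = response["data"]
    # Reading `output` here raises UnboundLocalError for unhandled types,
    # producing a message like the one in the log above.
    return output

try:
    log_event("speech", b"\x00\x01 raw audio bytes")
except UnboundLocalError as e:
    print(e)
```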

Twitter / LinkedIn details

https://www.linkedin.com/in/davidmanouchehri/

krrishdholakia commented 2 months ago

how exactly would the output of tts be logged? @Manouchehri

it's speech right? so like a set of bytes?
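One hypothetical answer to the question above (this is an illustrative sketch, not what LiteLLM ended up implementing): rather than logging the raw audio, log metadata about it, plus a short base64 preview, so the trace payload stays small and JSON-serializable.

```python
import base64

def summarize_tts_output(audio: bytes, max_preview: int = 16) -> dict:
    """Hypothetical helper: describe binary TTS output for a trace log."""
    return {
        "type": "audio",
        "size_bytes": len(audio),
        # Only a short base64-encoded prefix, not the whole file.
        "preview_b64": base64.b64encode(audio[:max_preview]).decode("ascii"),
    }

print(summarize_tts_output(b"\x00\x01\x02\x03"))
```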

krrishdholakia commented 2 months ago

bump on this? @Manouchehri

ishaan-jaff commented 1 month ago

this is fixed on latest - feel free to reopen if not @Manouchehri