BerriAI / litellm

Python SDK, Proxy Server (LLM Gateway) to call 100+ LLM APIs in OpenAI format - [Bedrock, Azure, OpenAI, VertexAI, Cohere, Anthropic, Sagemaker, HuggingFace, Replicate, Groq]
https://docs.litellm.ai/docs/

[Bug]: Embedding output missing from langfuse #4319

Closed Manouchehri closed 2 months ago

Manouchehri commented 3 months ago

What happened?

The embedding `output` field logged to Langfuse always seems to be null.

[screenshot: Langfuse trace showing a null output field]
curl -v "$OPENAI_API_BASE/embeddings" \
  -H "Authorization: Bearer $OPENAI_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "input": "LiteLLM is cool.",
    "model": "text-embedding-3-small"
  }'
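The same reproduction in Python, as a hedged sketch using only the standard library: the payload mirrors the curl command above, and the actual POST (which needs a running LiteLLM proxy plus the `OPENAI_API_BASE`/`OPENAI_API_KEY` environment variables) is left commented out so the snippet runs offline.

```python
import json

# Same request body as the curl reproduction above.
payload = {
    "input": "LiteLLM is cool.",
    "model": "text-embedding-3-small",
}
body = json.dumps(payload)
print(body)

# To actually send it against a running LiteLLM proxy (assumption:
# OPENAI_API_BASE and OPENAI_API_KEY are set as in the curl example):
#
# import os, urllib.request
# req = urllib.request.Request(
#     os.environ["OPENAI_API_BASE"] + "/embeddings",
#     data=body.encode(),
#     headers={
#         "Authorization": "Bearer " + os.environ["OPENAI_API_KEY"],
#         "Content-Type": "application/json",
#     },
# )
# print(urllib.request.urlopen(req).read().decode())
```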

Relevant log output

model_list:
  - model_name: text-embedding-3-small
    litellm_params:
      model: azure/text-embedding-3-small
      api_version: "2024-05-01-preview"
      azure_ad_token: "oidc/google/https://example.com"
      user: litellm
      api_base: "https://gateway.ai.cloudflare.com/v1/account_id_removed/gateway_name_removed/azure-openai/example"
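For context on how this config is used: the proxy maps the public `model_name` (what the curl request sends) to the backing deployment in `litellm_params` (here an Azure deployment behind a Cloudflare AI Gateway). A minimal offline sketch of that alias lookup, mirroring the YAML above; the `resolve` helper is hypothetical and not LiteLLM's actual router code.

```python
# Mirrors the YAML config above as a plain dict (secrets omitted).
model_list = [
    {
        "model_name": "text-embedding-3-small",
        "litellm_params": {
            "model": "azure/text-embedding-3-small",
            "api_version": "2024-05-01-preview",
        },
    },
]


def resolve(model_name):
    """Return litellm_params for the first matching alias, or None."""
    for entry in model_list:
        if entry["model_name"] == model_name:
            return entry["litellm_params"]
    return None


params = resolve("text-embedding-3-small")
print(params["model"])  # azure/text-embedding-3-small
```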

Twitter / LinkedIn details

https://www.linkedin.com/in/davidmanouchehri/

krrishdholakia commented 3 months ago

this isn't a bug. i believe this was a request from a user

relevant pr: https://github.com/BerriAI/litellm/pull/4226

given the context, what would be helpful to see here? @Manouchehri