BerriAI / litellm

Call all LLM APIs using the OpenAI format. Use Bedrock, Azure, OpenAI, Cohere, Anthropic, Ollama, Sagemaker, HuggingFace, Replicate (100+ LLMs)
https://docs.litellm.ai/docs/

[Feature]: Allow users to set `httpx` client when initializing LangfuseLogger #3567

Open ishaan-jaff opened 2 months ago

ishaan-jaff commented 2 months ago

The Feature

Support passing the Langfuse SDK's `httpx_client` parameter through to the Langfuse client that litellm's LangfuseLogger creates, so users can supply their own `httpx` client. The Langfuse constructor already exposes this parameter:

class Langfuse(
    public_key: str | None = None,
    secret_key: str | None = None,
    host: str | None = None,
    release: str | None = None,
    debug: bool = False,
    threads: int = 1,
    flush_at: int = 15,
    flush_interval: float = 0.5,
    max_retries: int = 3,
    timeout: int = 10,
    sdk_integration: str | None = "default",
    httpx_client: Client | None = None
)
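
For context, a minimal sketch of how a caller-supplied httpx client can be handed to that constructor directly (this is what the requested litellm option would forward). The timeout, header, and placeholder credentials below are illustrative assumptions, not part of this issue:

import httpx
from langfuse import Langfuse

# Build an httpx client with caller-controlled transport settings;
# the specific values here are only examples.
custom_httpx_client = httpx.Client(
    timeout=httpx.Timeout(10.0, connect=5.0),
    headers={"x-request-source": "litellm"},  # illustrative extra header
)

# Hand the client to Langfuse via the httpx_client parameter shown above.
langfuse = Langfuse(
    public_key="pk-lf-...",  # placeholder keys
    secret_key="sk-lf-...",
    host="https://cloud.langfuse.com",
    httpx_client=custom_httpx_client,
)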

Motivation, pitch

-

Twitter / LinkedIn details

No response

Manouchehri commented 1 month ago

https://langfuse.com/docs/sdk/python/decorators#configure-the-langfuse-client
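
The linked page covers configuring the decorator-based Langfuse client. A short sketch of that route, under the assumption that langfuse_context.configure accepts an httpx_client argument as the page describes:

import httpx
from langfuse.decorators import langfuse_context, observe

# Configure the decorator-based client before any traced function runs;
# httpx_client is assumed to be supported here per the linked docs.
langfuse_context.configure(
    httpx_client=httpx.Client(timeout=30.0),
)

@observe()
def my_llm_call():
    ...  # traced function body (placeholder)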