BerriAI / litellm

Call all LLM APIs using the OpenAI format. Use Bedrock, Azure, OpenAI, Cohere, Anthropic, Ollama, Sagemaker, HuggingFace, Replicate (100+ LLMs)
https://docs.litellm.ai/docs/

[Feature]: Integrate Azure Monitor into LiteLLM #4453

Closed: RyoYang closed this issue 2 days ago

RyoYang commented 3 days ago

The Feature

As described in this document: https://learn.microsoft.com/en-us/azure/azure-monitor/app/opentelemetry-enable?tabs=python

Can we configure Azure Monitor in LiteLLM, e.g. add a connection string to the configuration, so that all trace data from the Python logging module is collected and ingested into Azure Monitor / Application Insights?

Motivation, pitch

Support Azure Monitor in LiteLLM.

Twitter / LinkedIn details

No response

ishaan-jaff commented 2 days ago

Azure Monitor is OTEL-compatible, and we already have an OTEL integration: https://docs.litellm.ai/docs/proxy/logging#logging-proxy-inputoutput-in-opentelemetry-format
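For Azure Monitor specifically, one option is to point LiteLLM's OTEL logger at an OpenTelemetry Collector that forwards to Application Insights. A minimal sketch of a proxy `config.yaml` (the model entry below is a placeholder assumption, not a value from this thread):

```yaml
# config.yaml for the LiteLLM proxy -- enables the OTEL logging callback.
# The model entry is a placeholder; replace it with your actual deployment.
model_list:
  - model_name: gpt-3.5-turbo
    litellm_params:
      model: azure/gpt-3.5-turbo
litellm_settings:
  callbacks: ["otel"]   # emit request/response traces in OpenTelemetry format
```

The exporter target is then set via environment variables (e.g. `OTEL_EXPORTER` and `OTEL_ENDPOINT`, per the logging docs linked above), with the collector at that endpoint configured to export to Azure Monitor using your Application Insights connection string.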

@RyoYang let me know if this does not work for you

ishaan-jaff commented 2 days ago

feel free to re-open this issue too