BerriAI / litellm

Python SDK, Proxy Server to call 100+ LLM APIs using the OpenAI format - [Bedrock, Azure, OpenAI, VertexAI, Cohere, Anthropic, Sagemaker, HuggingFace, Replicate, Groq]
https://docs.litellm.ai/docs/

[Feature]: It seems we can't rely on security groups configured on the Azure Web App via Enterprise App authorization #4768

Closed: fransbe closed this 2 weeks ago

fransbe commented 1 month ago

The Feature

Fully rely on Entra ID security

Motivation, pitch

Currently there is not enough granularity beyond the Entra ID token itself.

Twitter / LinkedIn details

No response

krrishdholakia commented 1 month ago

hey @fransbe can you share what the gap is / how you'd expect this to work?

Happy to do a call to understand your setup, if that helps

https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Manouchehri commented 1 month ago

This is doable already with OIDC. My team has been using this for several weeks in prod.

See https://github.com/BerriAI/litellm/pull/4836.

ishaan-jaff commented 2 weeks ago

@fransbe you can also use Entra ID now: https://docs.litellm.ai/docs/providers/azure#usage---litellm-proxy-server
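
For reference, a minimal sketch of what Entra ID token auth looks like with the litellm Python SDK (the linked docs cover the equivalent proxy `config.yaml` setup). The deployment name, endpoint, API version, and environment variable names below are placeholders, and the token acquisition via `azure-identity` is just one way to get an Entra ID access token:

```python
# Sketch: call an Azure OpenAI deployment with an Entra ID token instead of an API key.
# Assumes an Entra ID app registration with access to the Azure OpenAI resource.
import os

from azure.identity import ClientSecretCredential  # pip install azure-identity
import litellm

# Acquire an Entra ID access token for the Cognitive Services scope.
credential = ClientSecretCredential(
    tenant_id=os.environ["AZURE_TENANT_ID"],
    client_id=os.environ["AZURE_CLIENT_ID"],
    client_secret=os.environ["AZURE_CLIENT_SECRET"],
)
token = credential.get_token("https://cognitiveservices.azure.com/.default").token

# Pass the token to litellm via azure_ad_token; no api_key needed.
response = litellm.completion(
    model="azure/my-gpt-4o-deployment",                  # placeholder deployment name
    api_base="https://my-resource.openai.azure.com/",    # placeholder endpoint
    api_version="2024-02-15-preview",                    # placeholder API version
    azure_ad_token=token,
    messages=[{"role": "user", "content": "hello"}],
)
print(response.choices[0].message.content)
```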

ishaan-jaff commented 2 weeks ago

Closing since we support Entra ID for Azure OpenAI.