BerriAI / litellm

Python SDK, Proxy Server (LLM Gateway) to call 100+ LLM APIs in OpenAI format - [Bedrock, Azure, OpenAI, VertexAI, Cohere, Anthropic, Sagemaker, HuggingFace, Replicate, Groq]
https://docs.litellm.ai/docs/

(fix) passthrough - allow internal users to access /anthropic #6843

Closed. ishaan-jaff closed this 13 hours ago

ishaan-jaff commented 1 day ago

The Anthropic pass-through was added without registering it in the list of LLM API routes, so internal users could not access it.
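For context, here is a minimal sketch of the kind of prefix-based route check involved, assuming a hypothetical prefix list and that `is_llm_api_route` matches request paths by prefix (the real litellm implementation may differ):

```python
# Hypothetical sketch of a prefix-based route check. The name
# LLM_API_ROUTE_PREFIXES and the structure are illustrative only and are
# not litellm's actual implementation.
LLM_API_ROUTE_PREFIXES = [
    "/chat/completions",
    "/completions",
    "/embeddings",
    # "/anthropic" missing here means the pass-through endpoints are not
    # treated as LLM API routes, so internal users get rejected.
]


def is_llm_api_route(route: str) -> bool:
    """Return True if the request path matches a known LLM API route prefix."""
    return any(route.startswith(prefix) for prefix in LLM_API_ROUTE_PREFIXES)


print(is_llm_api_route("/anthropic/v1/messages"))  # False -> internal user blocked
```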

Relevant issues

Type

🆕 New Feature 🐛 Bug Fix 🧹 Refactoring 📖 Documentation 🚄 Infrastructure ✅ Test

Changes

[REQUIRED] Testing - Attach a screenshot of any new tests passing locally

If UI changes, send a screenshot/GIF of working UI fixes

Description by Korbit AI

What change is being made?

Modify the is_llm_api_route function so that internal users can access the /anthropic and /azure endpoints.

Why are these changes being made?

Internal users could not reach the /anthropic and /azure pass-through endpoints because those routes were not recognized as LLM API routes. Registering them restores access so internal operations can use these endpoints.
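A rough idea of how this behavior could be regression-tested; the import path and exact signature of is_llm_api_route below are assumptions, not taken from this PR:

```python
# Hypothetical regression test; the import path and signature are
# assumptions and may not match litellm's actual module layout.
from litellm.proxy.auth.auth_utils import is_llm_api_route


def test_internal_user_can_access_passthrough_routes():
    # After this fix, the pass-through prefixes should be recognized as
    # LLM API routes, so internal users are permitted to call them.
    assert is_llm_api_route("/anthropic/v1/messages")
    assert is_llm_api_route("/azure/chat/completions")
```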


vercel[bot] commented 1 day ago

The latest updates on your projects. Learn more about Vercel for Git ↗︎

| Name | Status | Preview | Comments | Updated (UTC) |
| --- | --- | --- | --- | --- |
| litellm | ✅ Ready (Inspect) | Visit Preview | 💬 Add feedback | Nov 21, 2024 6:10am |