BerriAI / litellm

Python SDK, Proxy Server (LLM Gateway) to call 100+ LLM APIs in OpenAI format - [Bedrock, Azure, OpenAI, VertexAI, Cohere, Anthropic, Sagemaker, HuggingFace, Replicate, Groq]
https://docs.litellm.ai/docs/

(Feat) Add Vertex Model Garden llama 3.1 models #6763

Closed · ishaan-jaff closed 1 week ago

ishaan-jaff commented 1 week ago

Using Model Garden

Almost all Vertex Model Garden models are OpenAI compatible.

| Property | Details |
|----------|---------|
| Provider Route | `vertex_ai/openai/{MODEL_ID}` |
| Vertex Documentation | [Vertex Model Garden - OpenAI Chat Completions](https://github.com/GoogleCloudPlatform/vertex-ai-samples/blob/main/notebooks/community/model_garden/model_garden_gradio_streaming_chat_completions.ipynb), [Vertex Model Garden](https://cloud.google.com/model-garden?hl=en) |
| Supported Operations | `/chat/completions`, `/embeddings` |

```python
from litellm import completion
import os

## set ENV variables
os.environ["VERTEXAI_PROJECT"] = "hardy-device-38811"
os.environ["VERTEXAI_LOCATION"] = "us-central1"

response = completion(
    model="vertex_ai/openai/{MODEL_ID}",  # replace {MODEL_ID} with your Model Garden endpoint ID
    messages=[{"content": "Hello, how are you?", "role": "user"}]
)
```

**1. Add to config**

```yaml
model_list:
  - model_name: llama3-1-8b-instruct
    litellm_params:
      model: vertex_ai/openai/5464397967697903616
      vertex_ai_project: "my-test-project"
      vertex_ai_location: "us-east-1"
```

**2. Start proxy**

```bash
litellm --config /path/to/config.yaml

# RUNNING at http://0.0.0.0:4000
```

**3. Test it!**

The `model` in the request body is the `model_name` from the config; an equivalent test with the OpenAI Python SDK is sketched at the end of this description.

```bash
curl --location 'http://0.0.0.0:4000/chat/completions' \
--header 'Authorization: Bearer sk-1234' \
--header 'Content-Type: application/json' \
--data '{
    "model": "llama3-1-8b-instruct",
    "messages": [
        {
            "role": "user",
            "content": "what llm are you"
        }
    ]
}'
```

## Relevant issues

## Type

🆕 New Feature
🐛 Bug Fix
🧹 Refactoring
📖 Documentation
🚄 Infrastructure
✅ Test

## Changes

## [REQUIRED] Testing - Attach a screenshot of any new tests passing locally

If UI changes, send a screenshot/GIF of working UI fixes
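Because the proxy exposes an OpenAI-compatible `/chat/completions` endpoint, the curl test above can also be run from the OpenAI Python SDK. A minimal sketch, assuming the proxy from step 2 is running locally on port 4000 with the example key `sk-1234` (illustration only, not part of this PR's changes):

```python
# Sketch: call the litellm proxy through the OpenAI Python SDK instead of curl.
# Assumes the proxy started in step 2 is reachable at http://0.0.0.0:4000
# and accepts the example key "sk-1234" from the curl test above.
from openai import OpenAI

client = OpenAI(
    api_key="sk-1234",               # proxy key used in the curl example
    base_url="http://0.0.0.0:4000",  # litellm proxy address
)

response = client.chat.completions.create(
    model="llama3-1-8b-instruct",    # the `model_name` from the proxy config
    messages=[{"role": "user", "content": "what llm are you"}],
)
print(response.choices[0].message.content)
```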
vercel[bot] commented 1 week ago

The latest updates on your projects. Learn more about Vercel for Git ↗︎

| Name | Status | Preview | Comments | Updated (UTC) |
|------|--------|---------|----------|---------------|
| litellm | ✅ Ready (Inspect) | Visit Preview | 💬 Add feedback | Nov 15, 2024 8:55pm |