Closed: ishaan-jaff closed this PR 1 week ago
Usage with the litellm SDK:

```python
# pass your fine-tuned Vertex AI endpoint id as the model
litellm.embedding(model="vertex_ai/<your-model-id>", input=["hello world"])
```
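The call above also needs the Vertex project and location (or a service-account credentials file) alongside the endpoint id. A minimal sketch of assembling those keyword arguments; `build_embedding_kwargs` and its parameters are hypothetical illustrations, not part of litellm:

```python
def build_embedding_kwargs(endpoint_id, texts, project, location):
    # Hypothetical helper: collect the kwargs that would be passed to
    # litellm.embedding() for a fine-tuned Vertex AI embedding endpoint.
    return {
        "model": f"vertex_ai/{endpoint_id}",  # litellm routes on this prefix
        "input": list(texts),
        "vertex_project": project,
        "vertex_location": location,
    }

kwargs = build_embedding_kwargs(
    "1234567890", ["good morning from litellm"], "adroit-crow-413218", "us-central1"
)
# response = litellm.embedding(**kwargs)  # requires valid GCP credentials
```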
Proxy Usage

1. Setup config.yaml

```yaml
model_list:
  - model_name: snowflake-arctic-embed-m-long-1731622468876
    litellm_params:
      model: vertex_ai/<your-model-id>
      vertex_project: "adroit-crow-413218"
      vertex_location: "us-central1"
      vertex_credentials: adroit-crow-413218-a956eef1a2a8.json

litellm_settings:
  drop_params: True
```
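The proxy resolves the client-facing `model_name` to the matching `litellm_params` entry in `model_list`. A rough sketch of that lookup over the config above, with plain dicts standing in for the parsed YAML; `resolve_deployment` is a hypothetical illustration, not the proxy's actual router code:

```python
model_list = [
    {
        "model_name": "snowflake-arctic-embed-m-long-1731622468876",
        "litellm_params": {
            "model": "vertex_ai/<your-model-id>",
            "vertex_project": "adroit-crow-413218",
            "vertex_location": "us-central1",
        },
    }
]

def resolve_deployment(requested_name, model_list):
    # Hypothetical: return the litellm_params of the first matching model_name.
    for entry in model_list:
        if entry["model_name"] == requested_name:
            return entry["litellm_params"]
    raise KeyError(f"no deployment named {requested_name!r}")

params = resolve_deployment("snowflake-arctic-embed-m-long-1731622468876", model_list)
print(params["model"])  # the vertex_ai/<endpoint-id> target the proxy calls
```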
2. Start Proxy
```shell
$ litellm --config /path/to/config.yaml
```
3. Make Request using OpenAI Python SDK, Langchain Python SDK

```python
import openai

client = openai.OpenAI(api_key="sk-1234", base_url="http://0.0.0.0:4000")

response = client.embeddings.create(
    model="snowflake-arctic-embed-m-long-1731622468876",
    input=["good morning from litellm", "this is another item"],
)
print(response)
```
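The proxy returns an OpenAI-format embeddings response: a `data` list with one vector per input item. A sketch of pulling the vectors out of a response-shaped dict; the literal values below are made up for illustration, while a real response comes from the `client.embeddings.create` call above:

```python
# Made-up response in the OpenAI embeddings format, for illustration only.
fake_response = {
    "object": "list",
    "data": [
        {"object": "embedding", "index": 0, "embedding": [0.1, 0.2, 0.3]},
        {"object": "embedding", "index": 1, "embedding": [0.4, 0.5, 0.6]},
    ],
    "model": "snowflake-arctic-embed-m-long-1731622468876",
}

def extract_vectors(response):
    # One vector per input item, ordered by the "index" field.
    items = sorted(response["data"], key=lambda d: d["index"])
    return [item["embedding"] for item in items]

vectors = extract_vectors(fake_response)
print(len(vectors), len(vectors[0]))  # prints: 2 3
```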
Add support for fine tuned embedding models
Relevant issues
Type
- New Feature
- Bug Fix
- Refactoring
- Documentation
- Infrastructure
- Test
Changes
[REQUIRED] Testing - Attach a screenshot of any new tests passing locally
If UI changes, send a screenshot/GIF of working UI fixes