Closed: @kresimirfijacko closed this issue 2 months ago
On a potentially related note, setting a seed parameter for vertex/llama 3.1 405 errors out even with drop_params set to true. The same parameter works for azure / gpt-4 calls.
Missed this @kresimirfijacko, I believe drop_params support for embeddings was added in a recent version. Can you confirm this still persists on the latest release?
Hi @kresimirfijacko unable to repro, this works for me as expected.
curl -L -X POST 'http://0.0.0.0:4000/embeddings' \
-H 'Authorization: Bearer sk-1234' \
-H 'Content-Type: application/json' \
-d '{"input": ["hello world"], "model": "embedding", "dimensions": 3}'
model_list:
- model_name: embedding
litellm_params:
model: bedrock/amazon.titan-embed-text-v1
drop_params: True
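To illustrate the behavior the repro above is exercising, here is a minimal sketch of what drop_params is meant to do: strip OpenAI-style parameters the target provider does not support before the request is sent. This is a hypothetical helper, not LiteLLM's actual implementation, and the supported-parameter set is an assumption for the example.

```python
# Hypothetical sketch of the drop_params behavior; NOT LiteLLM's real code.

# Assumed supported params for the bedrock titan embedding endpoint
# (illustrative only -- 'dimensions' is treated as unsupported here).
SUPPORTED_EMBEDDING_PARAMS = {"input", "model"}

def filter_params(request_params: dict, supported: set, drop_params: bool) -> dict:
    """Drop unsupported params when drop_params is True; otherwise raise."""
    unsupported = set(request_params) - supported
    if unsupported and not drop_params:
        raise ValueError(f"Unsupported params: {sorted(unsupported)}")
    return {k: v for k, v in request_params.items() if k in supported}

# With drop_params=True, 'dimensions' is silently removed instead of erroring.
cleaned = filter_params(
    {"input": ["hello world"], "model": "embedding", "dimensions": 3},
    SUPPORTED_EMBEDDING_PARAMS,
    drop_params=True,
)
print(cleaned)  # {'input': ['hello world'], 'model': 'embedding'}
```

With drop_params=False the same call would raise instead, which matches the failure mode reported in the issue.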
@srail vertex ai llama3.1 works with seed param (see below)
Vertex Llama3.1 is currently assumed to be completely openai compatible - https://github.com/BerriAI/litellm/blob/4626c5a365c725ed21292376bb1d9ba3a74fdfab/litellm/llms/vertex_ai_and_google_ai_studio/vertex_ai_partner_models/llama3/transformation.py#L52
since I can't find docs listing the complete set of supported Vertex Llama 3 params (e.g. their example on Vertex shows max_tokens, but their playground exposes max_tokens, top_p, top_k, and temperature)
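Given that uncertainty, a hedged sketch of what a stricter transformation could look like, assuming the playground's params (max_tokens, top_p, top_k, temperature) are the supported set. The parameter list and function name here are illustrative assumptions, not LiteLLM's actual API or a confirmed Vertex spec.

```python
# Illustrative only: assumes Vertex Llama 3.1 supports exactly the params
# visible in the Vertex playground. This list is NOT confirmed by any docs.
VERTEX_LLAMA_PARAMS = {"max_tokens", "top_p", "top_k", "temperature"}

def map_openai_to_vertex_llama(openai_params: dict, drop_params: bool) -> dict:
    """Keep assumed-supported params; drop or reject the rest."""
    out = {}
    for key, value in openai_params.items():
        if key in VERTEX_LLAMA_PARAMS:
            out[key] = value
        elif not drop_params:
            # Without drop_params, an unsupported param like 'seed' errors out.
            raise ValueError(f"{key} is not supported by vertex llama3.1")
    return out

print(map_openai_to_vertex_llama({"seed": 42, "temperature": 0.7}, drop_params=True))
# {'temperature': 0.7}
```

Under the current assume-fully-OpenAI-compatible approach, seed would instead be passed straight through to Vertex.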
Hi @kresimirfijacko, curious, do you use LiteLLM today? If so, I'd love to hop on a call and learn how we can improve LiteLLM for you.
What happened?
Version: ghcr.io/berriai/litellm:main-v1.44.14
I have a problem setting 'drop_params' on a specific model, which should be supported per the documentation: https://docs.litellm.ai/docs/completion/drop_params
example proxy_config file:
```yaml
model_list:
litellm_settings:
  drop_params: True # THIS WORKS
```
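The reported asymmetry is that drop_params works when set globally under litellm_settings but not per model. A sketch of the resolution order one would expect, where the per-model litellm_params value takes precedence over the global setting; this is a hypothetical helper for illustration, not LiteLLM's actual resolution code.

```python
# Hypothetical precedence sketch; NOT LiteLLM's actual implementation.
def resolve_drop_params(model_litellm_params: dict, litellm_settings: dict) -> bool:
    """Per-model drop_params wins; otherwise fall back to litellm_settings."""
    if "drop_params" in model_litellm_params:
        return bool(model_litellm_params["drop_params"])
    return bool(litellm_settings.get("drop_params", False))

# Per-model setting (the case reported as failing in this issue):
print(resolve_drop_params({"drop_params": True}, {}))           # True
# Global litellm_settings only (the "THIS WORKS" case):
print(resolve_drop_params({}, {"drop_params": True}))           # True
# Neither set:
print(resolve_drop_params({}, {}))                              # False
```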
Relevant log output
No response