Closed: demux79 closed this issue 6 months ago
Hey @demux79, you can pass provider-specific params straight through: https://docs.litellm.ai/docs/completion/input#provider-specific-params
Are you seeing an error while doing this?
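For example, a minimal sketch of the pass-through (assuming Cohere's own `input_type` parameter is simply forwarded to the provider unchanged, as the linked docs describe):

```python
import litellm

# Minimal pass-through sketch: "input_type" is Cohere's embedding task parameter,
# not an OpenAI param, so the assumption is it gets forwarded to Cohere as-is.
response = litellm.embedding(
    model="cohere/embed-english-v3.0",
    input=["litellm supports provider-specific embedding params"],
    input_type="search_document",  # Cohere task: document-side embedding for search
)
print(response)
```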
Thanks for pointing that out, very helpful. Would the --drop_params proxy setting interfere with this?
No - drop_params just drops invalid OpenAI params; provider-specific params are passed through untouched.
closing for now - please reopen/bump me if it doesn't just work.
The Feature
Some embedding models allow you to pass a task type with the call, e.g. classify or cluster. It would be great if litellm could support such an additional parameter.
Example tasks for Vertex AI Embeddings:
- RETRIEVAL_QUERY: Specifies the given text is a query in a search or retrieval setting.
- RETRIEVAL_DOCUMENT: Specifies the given text is a document in a search or retrieval setting.
- SEMANTIC_SIMILARITY: Specifies the given text is used for Semantic Textual Similarity (STS).
- CLASSIFICATION: Specifies that the embedding is used for classification.
- CLUSTERING: Specifies that the embedding is used for clustering.
- QUESTION_ANSWERING: Specifies that the query embedding is used for answering questions. Use RETRIEVAL_DOCUMENT for the document side.
- FACT_VERIFICATION: Specifies that the query embedding is used for fact verification.

Source: https://cloud.google.com/vertex-ai/generative-ai/docs/embeddings/get-text-embeddings#api_changes_to_models_released_on_or_after_august_2023
Example tasks for Cohere Embeddings:
- "search_document": Used for embeddings stored in a vector database for search use-cases.
- "search_query": Used for embeddings of search queries run against a vector DB to find relevant documents.
- "classification": Used for embeddings passed through a text classifier.
- "clustering": Used for the embeddings run through a clustering algorithm.

Source: https://docs.cohere.com/reference/embed
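For illustration, a hedged sketch of what such a call could look like through litellm, using one of the Vertex AI task types listed above (assuming `task_type`, the parameter name from the Vertex docs, is passed straight through to the provider, and that the model id is available in your project):

```python
import litellm

# Hypothetical usage sketch for the requested feature: "task_type" is the Vertex AI
# parameter name quoted above; the assumption is that litellm forwards it unchanged.
response = litellm.embedding(
    model="vertex_ai/text-embedding-004",  # assumed Vertex embedding model id
    input=["What is the capital of France?"],
    task_type="RETRIEVAL_QUERY",  # query-side embedding for a search/retrieval use-case
)
```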
Motivation, pitch
Defining a task type improves embedding performance on that specific task for supported models.