Open litalhalawi opened 1 day ago
Reproduce: create an Azure client as follows:

```python
self.llm = LLM(
    provider="azure",
    api_key=os.getenv("AZURE_API_KEY"),
    api_version=os.getenv("AZURE_API_VERSION"),
    api_endpoint=os.getenv("AZURE_API_ENDPOINT"),
)
```
`provider_registry` resolves this to `AzureProvider`, which creates the relevant Azure client interface. However, the initialization of the Azure client (`AzureOpenAI` / `OpenAI`) does not expose all of the fields that the client constructor accepts.
Version: 1.0.0
Expected: in order to get my API to work I need to use `AzureOpenAI` with a deployment, so I would like to pass the deployment name as one of the keyword arguments when I create the `LLM` object, i.e. through the `LLMCore` constructor (when the provider class is created).
Actual: the Azure provider accepts no parameters other than `api_version`, `api_endpoint`, and `api_key`:
https://github.com/TensorOpsAI/LLMstudio/blob/3834975bd81851963574fac7acb682a9aec248a8/libs/core/llmstudio_core/providers/azure.py#L45
Since this is the only way the Azure client is created, none of the following `AzureOpenAI` constructor parameters can be overridden:

```python
api_version: str | None = None,
azure_endpoint: str | None = None,
azure_deployment: str | None = None,
api_key: str | None = None,
azure_ad_token: str | None = None,
azure_ad_token_provider: AzureADTokenProvider | None = None,
organization: str | None = None,
project: str | None = None,
base_url: str | None = None,
timeout: float | Timeout | None | NotGiven = NOT_GIVEN,
max_retries: int = DEFAULT_MAX_RETRIES,
default_headers: Mapping[str, str] | None = None,
default_query: Mapping[str, object] | None = None,
http_client: httpx.Client | None = None,
_strict_response_validation: bool = False,
```
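For illustration, here is a minimal, self-contained sketch of the kind of change being requested: the provider forwards any extra keyword arguments down to the client constructor. The class names (`DummyAzureClient`, `AzureProviderSketch`) are hypothetical stand-ins, not LLMstudio's actual classes, and the dummy client only records its kwargs instead of wrapping `openai.AzureOpenAI`.

```python
from typing import Any


class DummyAzureClient:
    """Hypothetical stand-in for openai.AzureOpenAI; just records constructor kwargs."""

    def __init__(self, *, api_key: str, api_version: str, azure_endpoint: str, **extra: Any):
        self.kwargs = {
            "api_key": api_key,
            "api_version": api_version,
            "azure_endpoint": azure_endpoint,
            **extra,
        }


class AzureProviderSketch:
    """Hypothetical provider that forwards unrecognized kwargs to the client constructor."""

    def __init__(self, api_key: str, api_version: str, api_endpoint: str, **client_kwargs: Any):
        # Forward everything beyond the three known fields, e.g. azure_deployment,
        # timeout, max_retries, azure_ad_token_provider, ...
        self.client = DummyAzureClient(
            api_key=api_key,
            api_version=api_version,
            azure_endpoint=api_endpoint,
            **client_kwargs,
        )


provider = AzureProviderSketch(
    api_key="key",
    api_version="2024-02-01",
    api_endpoint="https://example.openai.azure.com",
    azure_deployment="my-deployment",  # now reaches the client
)
print(provider.client.kwargs["azure_deployment"])  # my-deployment
```

With this pattern, the fix would not need to enumerate every `AzureOpenAI` constructor parameter; a `**kwargs` pass-through covers them all.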