TensorOpsAI / LLMstudio

Framework to bring LLM applications to production
https://tensorops.ai
Mozilla Public License 2.0
261 stars 31 forks

Cannot send additional data relevant to the client init cycle #169

Open litalhalawi opened 1 day ago

litalhalawi commented 1 day ago

https://github.com/TensorOpsAI/LLMstudio/blob/3834975bd81851963574fac7acb682a9aec248a8/libs/core/llmstudio_core/providers/azure.py#L45

Since this is the only place the Azure client is created, the following `AzureOpenAI` constructor parameters cannot be overridden:

```python
api_version: str | None = None,
azure_endpoint: str | None = None,
azure_deployment: str | None = None,
api_key: str | None = None,
azure_ad_token: str | None = None,
azure_ad_token_provider: AzureADTokenProvider | None = None,
organization: str | None = None,
project: str | None = None,
base_url: str | None = None,
timeout: float | Timeout | None | NotGiven = NOT_GIVEN,
max_retries: int = DEFAULT_MAX_RETRIES,
default_headers: Mapping[str, str] | None = None,
default_query: Mapping[str, object] | None = None,
http_client: httpx.Client | None = None,
_strict_response_validation: bool = False,
```
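A minimal sketch of the forwarding pattern this would take: the provider accepts arbitrary keyword arguments and passes them through to the client constructor instead of hard-coding a fixed subset. All class and parameter names below are hypothetical (`FakeAzureClient` stands in for `openai.AzureOpenAI`), not LLMstudio's actual API:

```python
# Hypothetical sketch: a provider that forwards extra keyword arguments
# to its underlying client constructor. `FakeAzureClient` stands in for
# openai.AzureOpenAI so the example is self-contained.

class FakeAzureClient:
    def __init__(self, api_key=None, api_version=None,
                 azure_endpoint=None, azure_deployment=None, **extra):
        self.api_key = api_key
        self.api_version = api_version
        self.azure_endpoint = azure_endpoint
        self.azure_deployment = azure_deployment
        self.extra = extra  # any remaining kwargs (e.g. timeout)


class ForwardingAzureProvider:
    def __init__(self, api_key, api_version, api_endpoint, **client_kwargs):
        # Forward everything the caller supplied, not just the fixed trio.
        self._client = FakeAzureClient(
            api_key=api_key,
            api_version=api_version,
            azure_endpoint=api_endpoint,
            **client_kwargs,
        )


provider = ForwardingAzureProvider(
    api_key="key",
    api_version="2024-02-01",
    api_endpoint="https://example.openai.azure.com",
    azure_deployment="gpt-4o-deploy",  # now reaches the client
    timeout=30.0,
)
print(provider._client.azure_deployment)  # -> gpt-4o-deploy
```

With this shape, any of the constructor parameters listed above could be overridden at `LLM(...)` creation time without the provider having to name each one.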

litalhalawi commented 23 hours ago

Reproduce: create an Azure client as follows:

```python
self.llm = LLM(
    provider="azure",
    api_key=os.getenv("AZURE_API_KEY"),
    api_version=os.getenv("AZURE_API_VERSION"),
    api_endpoint=os.getenv("AZURE_API_ENDPOINT"),
)
```

`provider_registry` resolves this to `AzureProvider`, which creates the underlying Azure client interface. The init of that client (`AzureOpenAI` / `OpenAI`) is not open to all of the fields its constructor accepts.

version 1.0.0

Expected: in order to get my API to work I need to use `AzureOpenAI` with a deployment, so I would like to pass the deployment name as one of the keyword arguments when I create the `LLM` object, i.e. through the `LLMCore` constructor (when the provider class is created).
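The desired call site would then look something like this. This is a sketch only: `LLM` here is a stub standing in for llmstudio's class, and `azure_deployment` as an `LLM` keyword is the requested (hypothetical) addition, not an existing parameter:

```python
import os


class LLM:
    """Stub illustrating the desired signature; not llmstudio's real class."""

    def __init__(self, provider, api_key=None, api_version=None,
                 api_endpoint=None, **client_kwargs):
        self.provider = provider
        # Extra kwargs would be forwarded to the Azure client constructor.
        self.client_kwargs = client_kwargs


llm = LLM(
    provider="azure",
    api_key=os.getenv("AZURE_API_KEY"),
    api_version=os.getenv("AZURE_API_VERSION"),
    api_endpoint=os.getenv("AZURE_API_ENDPOINT"),
    azure_deployment="my-deployment",  # hypothetical extra keyword
)
print(llm.client_kwargs["azure_deployment"])  # -> my-deployment
```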

Actual: the Azure provider accepts no parameters other than `api_version`, `api_endpoint`, and `api_key`.