run-llama / llama_index

LlamaIndex is a data framework for your LLM applications
https://docs.llamaindex.ai
MIT License

[Question]: Persistent custom header/default_headers in AzureAICompletionsModel #15073

Open yvrohith opened 1 month ago

yvrohith commented 1 month ago

Question

Is there a way to add default_headers to the AzureAICompletionsModel class? If not, how do I add persistent headers to API requests? Including them in model_kwargs doesn't do the trick.

dosubot[bot] commented 1 month ago

Hey @yvrohith! 👋 I'm here to help you out with any bugs, questions, or contributions you have. I'm a bot designed to assist you while waiting for a human maintainer.

Yes, you can add default_headers to the AzureAICompletionsModel in the same way as the OpenAI class: the AzureOpenAI class accepts a default_headers parameter in its constructor (defined on the base OpenAI class) and uses it in the _get_credential_kwargs method.

Here is the relevant part of the code that shows how default_headers is included:

def __init__(
        self,
        model: str = "gpt-35-turbo",
        engine: Optional[str] = None,
        temperature: float = 0.1,
        max_tokens: Optional[int] = None,
        additional_kwargs: Optional[Dict[str, Any]] = None,
        max_retries: int = 3,
        timeout: float = 60.0,
        reuse_client: bool = True,
        api_key: Optional[str] = None,
        api_version: Optional[str] = None,
        # azure specific
        azure_endpoint: Optional[str] = None,
        azure_deployment: Optional[str] = None,
        azure_ad_token_provider: Optional[AzureADTokenProvider] = None,
        use_azure_ad: bool = False,
        callback_manager: Optional[CallbackManager] = None,
        # aliases for engine
        deployment_name: Optional[str] = None,
        deployment_id: Optional[str] = None,
        deployment: Optional[str] = None,
        # custom httpx client
        http_client: Optional[httpx.Client] = None,
        async_http_client: Optional[httpx.AsyncClient] = None,
        # base class
        system_prompt: Optional[str] = None,
        messages_to_prompt: Optional[Callable[[Sequence[ChatMessage]], str]] = None,
        completion_to_prompt: Optional[Callable[[str], str]] = None,
        pydantic_program_mode: PydanticProgramMode = PydanticProgramMode.DEFAULT,
        output_parser: Optional[BaseOutputParser] = None,
        # `default_headers` is defined on the base OpenAI class; it is accepted here
        # through **kwargs and forwarded to super().__init__ below.
        **kwargs: Any,
    ) -> None:
        engine = resolve_from_aliases(
            engine, deployment_name, deployment_id, deployment, azure_deployment
        )

        if engine is None:
            raise ValueError("You must specify an `engine` parameter.")

        azure_endpoint = get_from_param_or_env(
            "azure_endpoint", azure_endpoint, "AZURE_OPENAI_ENDPOINT", ""
        )
        super().__init__(
            engine=engine,
            model=model,
            temperature=temperature,
            max_tokens=max_tokens,
            additional_kwargs=additional_kwargs,
            max_retries=max_retries,
            timeout=timeout,
            reuse_client=reuse_client,
            api_key=api_key,
            azure_endpoint=azure_endpoint,
            azure_deployment=azure_deployment,
            azure_ad_token_provider=azure_ad_token_provider,
            use_azure_ad=use_azure_ad,
            api_version=api_version,
            callback_manager=callback_manager,
            http_client=http_client,
            async_http_client=async_http_client,
            system_prompt=system_prompt,
            messages_to_prompt=messages_to_prompt,
            completion_to_prompt=completion_to_prompt,
            pydantic_program_mode=pydantic_program_mode,
            output_parser=output_parser,
            **kwargs,
        )

def _get_credential_kwargs(
        self, is_async: bool = False, **kwargs: Any
    ) -> Dict[str, Any]:
        if self.use_azure_ad:
            self._azure_ad_token = refresh_openai_azuread_token(self._azure_ad_token)
            self.api_key = self._azure_ad_token.token
        else:
            import os

            self.api_key = self.api_key or os.getenv("AZURE_OPENAI_API_KEY")

        if self.api_key is None:
            raise ValueError(
                "You must set an `api_key` parameter. "
                "Alternatively, you can set the AZURE_OPENAI_API_KEY env var OR set `use_azure_ad=True`."
            )

        return {
            "api_key": self.api_key,
            "max_retries": self.max_retries,
            "timeout": self.timeout,
            "azure_endpoint": self.azure_endpoint,
            "azure_deployment": self.azure_deployment,
            "azure_ad_token_provider": self.azure_ad_token_provider,
            "api_version": self.api_version,
            "default_headers": self.default_headers,
            "http_client": self._async_http_client if is_async else self._http_client,
            **kwargs,
        }

This shows that you can pass default_headers when initializing the AzureOpenAI class, and these headers will be included in the API requests [1][2][3][4].
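
For reference, a minimal usage sketch could look like the following (this assumes the llama_index AzureOpenAI class, with default_headers handled by the base OpenAI class; the deployment name, endpoint, and api_version are placeholders):

from llama_index.llms.azure_openai import AzureOpenAI

llm = AzureOpenAI(
    model="gpt-35-turbo",
    deployment_name="your-deployment-name",  # alias for `engine`
    api_key="your-api-key",
    azure_endpoint="https://your-resource-name.openai.azure.com/",
    api_version="2023-07-01-preview",
    default_headers={"Custom-Header": "CustomValue"},  # attached to every request
)

response = llm.complete("Hello!")
print(response.text)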

To continue talking to Dosu, mention @dosu.

yvrohith commented 1 month ago

I was referring to adding default_headers to the AzureAICompletionsModel class @dosu

dosubot[bot] commented 1 month ago

To add default_headers to the AzureAICompletionsModel class, you can follow a similar approach to the AzureOpenAI class. Here’s how you can do it:

  1. Add a default_headers parameter to the __init__ method.
  2. Set the default_headers attribute in the __init__ method.
  3. Ensure that the default_headers are included in the _get_credential_kwargs method.

Here is the modified code:

from typing import Any, Dict, Optional

import httpx


class AzureAICompletionsModel:
    def __init__(
        self,
        model: str = "gpt-4",
        engine: Optional[str] = None,
        temperature: float = 0.7,
        max_tokens: Optional[int] = 100,
        additional_kwargs: Optional[Dict[str, Any]] = None,
        max_retries: int = 3,
        timeout: float = 60.0,
        api_key: Optional[str] = None,
        api_base: Optional[str] = None,
        api_version: Optional[str] = None,
        azure_endpoint: Optional[str] = None,
        azure_deployment: Optional[str] = None,
        use_azure_ad: bool = False,
        default_headers: Optional[Dict[str, str]] = None,  # Add this line
        http_client: Optional[httpx.Client] = None,
        **kwargs: Any,
    ) -> None:
        self.model = model
        self.engine = engine
        self.temperature = temperature
        self.max_tokens = max_tokens
        self.additional_kwargs = additional_kwargs
        self.max_retries = max_retries
        self.timeout = timeout
        self.api_key = api_key
        self.api_base = api_base
        self.api_version = api_version
        self.azure_endpoint = azure_endpoint
        self.azure_deployment = azure_deployment
        self.use_azure_ad = use_azure_ad
        self.default_headers = default_headers  # Add this line
        self.http_client = http_client
        self._azure_ad_token = None  # cached Azure AD token, refreshed below when use_azure_ad is True

    def _get_credential_kwargs(self, **kwargs: Any) -> Dict[str, Any]:
        if self.use_azure_ad:
            # Same token-refresh helper used by the AzureOpenAI implementation above.
            self._azure_ad_token = refresh_openai_azuread_token(self._azure_ad_token)
            self.api_key = self._azure_ad_token.token

        return {
            "api_key": self.api_key or None,
            "max_retries": self.max_retries,
            "azure_endpoint": self.azure_endpoint,
            "azure_deployment": self.azure_deployment,
            "api_version": self.api_version,
            "default_headers": self.default_headers,  # Ensure this line is present
            "http_client": self.http_client,
            "timeout": self.timeout,
        }

With these changes, you can now pass persistent headers when initializing the AzureAICompletionsModel class:

headers = {
    "Authorization": "Bearer YOUR_TOKEN",
    "Custom-Header": "CustomValue"
}

azure_ai = AzureAICompletionsModel(
    model="gpt-4",
    engine="your-engine",
    api_key="your-api-key",
    azure_endpoint="https://your-resource-name.openai.azure.com/",
    default_headers=headers
)

This will ensure that the specified headers are included in all API requests made by the AzureAICompletionsModel class [1][2][3][4].
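
If you want to verify that the headers actually reach the wire, one option is to attach a request event hook to a custom httpx.Client and pass it as http_client. This is only a sketch: it assumes the modified class above forwards http_client to the underlying SDK client, and log_request_headers is an illustrative helper, not part of the library.

import httpx

def log_request_headers(request: httpx.Request) -> None:
    # Print the headers of each outgoing request so you can confirm Custom-Header is attached.
    print(dict(request.headers))

http_client = httpx.Client(event_hooks={"request": [log_request_headers]})

azure_ai = AzureAICompletionsModel(
    model="gpt-4",
    engine="your-engine",
    api_key="your-api-key",
    azure_endpoint="https://your-resource-name.openai.azure.com/",
    default_headers=headers,
    http_client=http_client,
)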

To continue talking to Dosu, mention @dosu.