Eladlev / AutoPrompt

A framework for prompt tuning using Intent-based Prompt Calibration
Apache License 2.0

fix issues with Azure Openai endpoint #55

Closed vincilee2 closed 3 months ago

vincilee2 commented 3 months ago

I found some issues when working with the Azure OpenAI endpoint.

Issue 1

In llm_env.yml, only a single AZURE_OPENAI_ENDPOINT can be set for the azure LLM type, and there is no way to set openai_api_base.

This causes problems because:
- Azure OpenAI deployments may live behind different Azure endpoints. For example, gpt-4 is only available in some regions, and gpt-35-turbo in others.
- In langchain, vLLM chat is supported via ChatOpenAI, so openai_api_base should be exposed for configuration.

Solution 1

Allow specific attributes to be set per LLM in the config YAML file, giving more control over the LLM endpoints.
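A minimal sketch of what such a config entry could look like (the key names below are illustrative, not the project's actual schema):

```yaml
llm:
  type: Azure
  name: gpt-4
  # Hypothetical per-LLM keys: each deployment can point at its own
  # regional endpoint instead of relying on one global env var.
  azure_openai_endpoint: https://my-gpt4-region.openai.azure.com/
  # Exposing openai_api_base would also cover vLLM servers
  # reached through ChatOpenAI, e.g.:
  # openai_api_base: http://localhost:8000/v1
```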

Issue 2

The openai version pinned in requirements.txt is not compatible with langchain when working against an Azure endpoint.

Solution

Update the openai package to version 1.1.0.
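The corresponding requirements.txt entry would then be (assuming 1.1.0 is the release compatible with the langchain version in use; a range pin may be safer in practice):

```
openai==1.1.0
```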

Issue 3

In the output_schemas of meta_prompts_classification, no parser is available when the LLM type is not OpenAI. As a result, the generated response cannot be parsed when the LLM type is Azure.

Solution

Since Azure supports JSON schema output, simply leverage create_structured_output_runnable when the LLM type is either OpenAI or Azure.
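The fix amounts to a small dispatch on the LLM type; a minimal sketch (the function and set names here are hypothetical, not the project's actual code):

```python
# LLM types whose endpoints accept JSON-schema (function-calling) output
# and can therefore go through langchain's create_structured_output_runnable.
STRUCTURED_OUTPUT_LLM_TYPES = {"OpenAI", "Azure"}

def use_structured_output(llm_type: str) -> bool:
    """Return True when the response can be produced via
    create_structured_output_runnable instead of a text output parser."""
    return llm_type in STRUCTURED_OUTPUT_LLM_TYPES
```

With this check, the Azure case falls through to the same structured-output path as OpenAI, so no separate parser is required.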