simonw / llm

Access large language models from the command-line
https://llm.datasette.io
Apache License 2.0

api_type = azure support for OpenAI custom models #178

Open · simonw opened this issue 1 year ago

simonw commented 1 year ago

https://twitter.com/simonw/status/1693706702519140571

The example code at https://learn.microsoft.com/en-us/azure/ai-services/openai/quickstart?tabs=command-line&pivots=programming-language-python says:

import os
import openai

openai.api_key = os.getenv("AZURE_OPENAI_KEY")
openai.api_base = os.getenv("AZURE_OPENAI_ENDPOINT")  # your endpoint should look like https://YOUR_RESOURCE_NAME.openai.azure.com/
openai.api_type = 'azure'
openai.api_version = '2023-05-15'  # this may change in the future
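For context on how those settings are consumed, here is a minimal sketch against the pre-1.0 openai library, using a hypothetical deployment name: once api_type is set to "azure", the chat completion call identifies the model by its Azure deployment name via engine rather than model.

import os
import openai  # openai < 1.0, the version the quickstart above targets

openai.api_key = os.getenv("AZURE_OPENAI_KEY")
openai.api_base = os.getenv("AZURE_OPENAI_ENDPOINT")
openai.api_type = 'azure'
openai.api_version = '2023-05-15'

# With api_type = 'azure', openai 0.x expects the Azure deployment name via engine=
response = openai.ChatCompletion.create(
    engine="gpt4",  # hypothetical deployment name, not an OpenAI model name
    messages=[{"role": "user", "content": "Say hello"}],
)
print(response["choices"][0]["message"]["content"])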
simonw commented 1 year ago

It would be good to support an llm-openai-azure plugin.

That plugin could work today by setting the global openai.api_type = 'azure' variable, but I worry that would break other plugins that also use the openai library.

Instead, having a way to pass api_type to Chat here would be good: https://github.com/simonw/llm/blob/7744cf9b79121045d163d13075d2ec2261fc3472/llm/default_plugins/openai_models.py#L181-L183
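A rough sketch of that shape (the signature below is an illustrative assumption, not the actual llm code): the extra setting would travel with the model instance instead of being set globally on the openai module.

# Hypothetical sketch; not the real openai_models.Chat implementation.
class Chat:
    def __init__(self, model_id, key=None, model_name=None,
                 api_base=None, headers=None, api_type=None):
        self.model_id = model_id
        self.key = key
        self.model_name = model_name
        self.api_base = api_base
        self.headers = headers
        self.api_type = api_type  # e.g. "azure", applied per request rather than globally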

simonw commented 1 year ago

Worth reviewing that example to see what other options might be useful. api_version looks like one.

simonw commented 1 year ago

Got it working!

git diff | llm -m azure-gpt4 -s 'explain this change'

This change in the Python file 'openai_models.py' expands the 'Chat' Model object and its associated functions to include more attributes: 'api_type', 'api_version', and 'api_engine'.

Initially, the 'Chat' Model only had the attributes 'model_id', 'key', 'model_name', 'api_base', and 'headers'. The update introduces 'api_type', 'api_version', and 'api_engine' to the initialisation function (init) and also makes necessary changes in the register function to accommodate these added attributes.

Also, additional error checks are added in the Class 'Chat'. If the api_type, api_version, and api_engine attributes are not null, it will add them to the keyword arguments.

This change likely allows for more specificity or functionality when using the 'Chat' model, perhaps allowing users to specify the type, version, and engine of the API in use.
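To make that description concrete, here is a sketch of the behaviour it describes (names and structure are illustrative, not the actual diff): the optional settings are only merged into the keyword arguments for the completion call when they have been configured.

import openai  # openai < 1.0

def build_completion_kwargs(model_name, messages, api_base=None, api_type=None,
                            api_version=None, api_engine=None):
    # Mirror the "add them to the keyword arguments if not null" behaviour described above.
    kwargs = {"model": model_name, "messages": messages}
    if api_base:
        kwargs["api_base"] = api_base
    if api_type:
        kwargs["api_type"] = api_type        # e.g. "azure"
    if api_version:
        kwargs["api_version"] = api_version  # e.g. "2023-03-15-preview"
    if api_engine:
        kwargs["engine"] = api_engine        # Azure deployment name in openai 0.x
    return kwargs

# Illustrative usage:
# kwargs = build_completion_kwargs("gpt4", [{"role": "user", "content": "hi"}],
#                                  api_type="azure", api_version="2023-03-15-preview",
#                                  api_engine="gpt4")
# response = openai.ChatCompletion.create(api_key="...", **kwargs)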

That's with this in extra-openai-models.yml:

- model_id: azure-gpt4
  model_name: gpt4
  api_base: https://special-magic-secret-thing.openai.azure.com/
  api_key_name: azure
  api_version: 2023-03-15-preview
  api_type: azure
  api_engine: gpt4

And the branch I'm about to push.
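For anyone copying that configuration: api_key_name: azure presumably refers to a key stored with llm keys set azure, api_base should point at your own Azure OpenAI resource endpoint, and api_engine presumably needs to match the name of your Azure deployment (which happens to be gpt4 in this example).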

kevinddnj commented 1 year ago

Simon, I've implemented this: I just replaced the openai_models.py file in the latest distribution and configured the YAML file from your example. It works great, and thank you for doing the implementation. It's a big deal for us, as we deal with health data and need to run on a protected and firewalled instance like the Azure deployments. Do you intend to merge this soon?

dozsa commented 1 year ago

Any plans to merge this? This seems like a useful tool, but some of us are limited to using Azure OpenAI, so this would be needed. Thanks.

kevinddnj commented 1 year ago

Adrian, I think it has been merged. At any rate, I didn't need to monkey-patch the v0.11.1 and v0.12 releases this week, and I just inspected the current source for 0.12 and the support is there. So as long as you ensure the right entries are in the .yaml configuration file, you should be OK.

I think Simon could close this Issue now! Kevin

dozsa commented 1 year ago

I was just reading the code and noticed the same. Tested it and it works fine. Many thanks.

bnookala commented 1 year ago

Just to bring some extra closure to this, I wrote up some docs on how to use this feature. https://github.com/simonw/llm/pull/337