htadashi / GPT3-AHK

An AutoHotKey script that enables you to use GPT3 in any input field on your computer
MIT License

Support for Azure OpenAI API #12

Closed lchyn closed 11 months ago

lchyn commented 1 year ago

Hello. Currently this AHK script only works with the official OpenAI API. Could you please add support for the OpenAI service on Microsoft Azure, so that the script also works with the Azure OpenAI API (https://learn.microsoft.com/en-us/azure/ai-services/openai/reference)?

Thanks!
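For context, the Azure OpenAI API differs from the official OpenAI API mainly in how requests are addressed: Azure routes by resource endpoint and deployment name in the URL, passes the API version as a query parameter, and authenticates with an `api-key` header rather than a Bearer token. A minimal sketch of the request shape, based on the reference linked above (the endpoint, deployment name, and API version below are placeholder values, not real ones):

```python
import json

# Hypothetical Azure resource values -- substitute your own.
resource_endpoint = "https://my-resource.openai.azure.com"
deployment_id = "my-gpt35-deployment"
api_version = "2023-05-15"

# Azure embeds the deployment in the URL path and takes the API
# version as a query parameter.
url = (f"{resource_endpoint}/openai/deployments/{deployment_id}"
       f"/chat/completions?api-version={api_version}")

# Authentication uses an "api-key" header instead of OpenAI's
# "Authorization: Bearer <key>" header.
headers = {"api-key": "<your-azure-key>", "Content-Type": "application/json"}

# The request body itself follows the familiar chat-completions shape.
body = json.dumps({"messages": [{"role": "user", "content": "Hello"}]})
```

This URL scheme is why a script hard-coded against `api.openai.com` cannot talk to Azure without changes to how it builds the request.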

ishaan-jaff commented 1 year ago

Hi @lchyn @htadashi I believe we can help with this issue. I’m the maintainer of LiteLLM https://github.com/BerriAI/litellm

TLDR: LiteLLM lets you use any LLM as a drop-in replacement for gpt-3.5-turbo. If you don't have direct access to a given LLM, you can use the LiteLLM proxy to make requests to it.

You can use LiteLLM in the following ways:

With your own API KEY:

This calls the provider API directly

from litellm import completion
import os

## set ENV variables
os.environ["OPENAI_API_KEY"] = "your-key"
os.environ["COHERE_API_KEY"] = "your-key"

messages = [{ "content": "Hello, how are you?","role": "user"}]

# openai call
response = completion(model="gpt-3.5-turbo", messages=messages)

# cohere call
response = completion(model="command-nightly", messages=messages)

Using the LiteLLM Proxy with a LiteLLM Key

This is great if you don't have access to Claude but want to use the open-source LiteLLM proxy to access it.

from litellm import completion
import os

## set ENV variables 
os.environ["OPENAI_API_KEY"] = "sk-litellm-5b46387675a944d2" # [OPTIONAL] replace with your openai key
os.environ["COHERE_API_KEY"] = "sk-litellm-5b46387675a944d2" # [OPTIONAL] replace with your cohere key

messages = [{ "content": "Hello, how are you?","role": "user"}]

# openai call
response = completion(model="gpt-3.5-turbo", messages=messages)

# cohere call
response = completion(model="command-nightly", messages=messages)

htadashi commented 11 months ago

Hi @lchyn and @ishaan-jaff.

Thanks for opening the issue. Following @ishaan-jaff's suggestion, I added an option to call custom models compatible with the OpenAI API standard in https://github.com/htadashi/GPT3-AHK/commit/c7de601d7417180d851190080048ba40ad6f74e5

In particular, you can run the litellm server on your computer following the quickstart instructions, then set CUSTOM_MODEL_ENDPOINT to point to your server's address and CUSTOM_MODEL_ID to your desired Azure model name.
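To illustrate the idea (in Python rather than AHK, and with hypothetical placeholder values): once the script targets a configurable OpenAI-compatible endpoint, only the base URL and model id change, while the request body keeps the standard chat-completions shape. The endpoint address, port, and model id below are assumptions for the example, not values from the commit:

```python
import json

# Hypothetical settings mirroring the two new options: a local litellm
# server and an Azure-backed model id it knows how to route.
CUSTOM_MODEL_ENDPOINT = "http://localhost:8000"
CUSTOM_MODEL_ID = "azure/my-gpt35-deployment"

# The script would POST to the endpoint's chat-completions route...
url = f"{CUSTOM_MODEL_ENDPOINT}/chat/completions"

# ...with the usual OpenAI-style body; only the "model" field differs.
payload = json.dumps({
    "model": CUSTOM_MODEL_ID,
    "messages": [{"role": "user", "content": "Hello"}],
})
```

Because the proxy speaks the OpenAI wire format, the rest of the script's request and response handling can stay unchanged.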