BuilderIO / ai-shell

A CLI that converts natural language to shell commands.

Add Azure OpenAI support #77

Open fazedordecodigo opened 1 year ago

fazedordecodigo commented 1 year ago

Support for Azure OpenAI could be added. With Azure OpenAI, request data is not used for AI training, which makes it ideal for us to use at work and avoids information leakage.
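For context, calling Azure OpenAI differs from the standard OpenAI API mainly in the endpoint, the API version, and the use of a deployment name in place of a model name. A minimal sketch with the official openai Python SDK (v1+); the resource endpoint, deployment name, and api-version below are placeholders:

from openai import AzureOpenAI
import os

# placeholders: your Azure OpenAI resource endpoint and API version
client = AzureOpenAI(
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2023-07-01-preview",
    azure_endpoint="https://your-resource.openai.azure.com",
)

response = client.chat.completions.create(
    model="your-gpt-35-deployment",  # the Azure deployment name, not the model name
    messages=[{"content": "Hello, how are you?", "role": "user"}],
)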

shikelong commented 1 year ago

+1.

wwydmanski commented 11 months ago

OpenAI doesn't use API requests for AI training, but Azure OpenAI could nevertheless help bring down latency.

akira-cn commented 10 months ago

I submitted a pull request to add Azure OpenAI support: https://github.com/BuilderIO/ai-shell/pull/96

ishaan-jaff commented 10 months ago

Hi @Delatorrea @shikelong @wwydmanski, I believe we can help with this issue. I'm the maintainer of LiteLLM: https://github.com/BerriAI/litellm

TL;DR: LiteLLM lets you use any LLM as a drop-in replacement for gpt-3.5-turbo. If you don't have direct access to an LLM, you can use the LiteLLM proxy to make requests to it.

You can use LiteLLM in the following ways:

With your own API key:

This calls the provider API directly.

from litellm import completion
import os
## set ENV variables 
os.environ["OPENAI_API_KEY"] = "your-key" # 
os.environ["COHERE_API_KEY"] = "your-key" # 

messages = [{ "content": "Hello, how are you?","role": "user"}]

# openai call
response = completion(model="gpt-3.5-turbo", messages=messages)

# cohere call
response = completion(model="command-nightly", messages=messages)

Using the LiteLLM Proxy with a LiteLLM Key

This is useful if you don't have access to Claude but want to use the open-source LiteLLM proxy to access it.

from litellm import completion
import os

## set ENV variables 
os.environ["OPENAI_API_KEY"] = "sk-litellm-5b46387675a944d2" # [OPTIONAL] replace with your openai key
os.environ["COHERE_API_KEY"] = "sk-litellm-5b46387675a944d2" # [OPTIONAL] replace with your cohere key

messages = [{ "content": "Hello, how are you?","role": "user"}]

# openai call
response = completion(model="gpt-3.5-turbo", messages=messages)

# cohere call
response = completion(model="command-nightly", messages=messages)