wch / chatstream

Example Shiny for Python app which talks to the OpenAI API
https://wch.github.io/chatstream/
MIT License

Add Azure OpenAI support #10

Open wch opened 1 year ago

wch commented 1 year ago

This adds a new endpoint_type parameter to chat_server. To use with Azure:

    chatstream.chat_server("mychat", endpoint_type="azure")

Note that to use Azure, you will probably also need to set a few values at the top of your app, as described in https://learn.microsoft.com/en-us/azure/cognitive-services/openai/how-to/switching-endpoints :

import openai

openai.api_type = "azure"
openai.api_key = "..."
openai.api_base = "https://example-endpoint.openai.azure.com"
openai.api_version = "2023-05-15"  # subject to change
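Rather than hardcoding the key, one common pattern is to pull these values from environment variables. This is not part of chatstream itself, and the variable names below are illustrative, not an official convention; a minimal sketch:

```python
import os

# Hypothetical helper: gather Azure OpenAI settings from environment
# variables so the API key never appears in source code. The variable
# names here are illustrative, not an official convention.
def azure_settings_from_env() -> dict:
    return {
        "api_type": "azure",
        "api_key": os.getenv("AZURE_OPENAI_KEY", ""),
        "api_base": os.getenv("AZURE_OPENAI_ENDPOINT", ""),
        "api_version": os.getenv("AZURE_OPENAI_API_VERSION", "2023-05-15"),
    }

# The returned dict could then be applied to the openai module, e.g.:
#   for key, value in azure_settings_from_env().items():
#       setattr(openai, key, value)
```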

I currently do not have access to Azure OpenAI, so I can't test this myself. It can be installed with:

pip install chatstream@git+https://github.com/wch/chatstream.git@azure
wch commented 1 year ago

I have this working with Azure, using a deployment I created named my-gpt-35-turbo.

Example app:

import os

import openai
from shiny import App, Inputs, Outputs, Session, ui

import chatstream

openai.api_type = "azure"
openai.api_base = "https://winstontest.openai.azure.com/"
openai.api_version = "2023-03-15-preview"

app_ui = ui.page_fixed(
    chatstream.chat_ui("mychat"),
)

def server(input: Inputs, output: Outputs, session: Session):
    chatstream.chat_server(
        "mychat",
        api_key=os.getenv("OPENAI_API_KEY"),
        model="gpt-3.5-turbo",
        azure_deployment_id="my-gpt-35-turbo",
        debug=True,
    )

app = App(app_ui, server)

Note that you must create a deployment with a custom name. In this case it is "my-gpt-35-turbo", but you could call it anything. The creation screen in the Azure portal will look something like this:

[Screenshot: Azure portal screen for creating a model deployment]

Also note that this is created within an Azure resource named winstontest. The URL for that resource is https://winstontest.openai.azure.com/.
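For context on why the deployment name matters: Azure routes chat requests by deployment rather than by model name, so the resource URL, deployment name, and API version combine into the request URL. A rough sketch of that URL shape, based on Azure's documented REST layout (this is an illustration, not code from chatstream):

```python
def azure_chat_url(api_base: str, deployment_id: str, api_version: str) -> str:
    # Azure OpenAI addresses a named deployment, not a model name:
    # {base}/openai/deployments/{deployment}/chat/completions?api-version=...
    return (
        f"{api_base.rstrip('/')}/openai/deployments/{deployment_id}"
        f"/chat/completions?api-version={api_version}"
    )

print(azure_chat_url(
    "https://winstontest.openai.azure.com/",
    "my-gpt-35-turbo",
    "2023-03-15-preview",
))
# → https://winstontest.openai.azure.com/openai/deployments/my-gpt-35-turbo/chat/completions?api-version=2023-03-15-preview
```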