posit-dev / py-shiny

Shiny for Python
https://shiny.posit.co/py/
MIT License

Add `shiny.ui.Chat` #1453

Closed · cpsievert closed this 3 months ago

cpsievert commented 3 months ago

Adds a `shiny.ui.Chat` class, designed to support any LLM provider (i.e., response assistant) of your choosing (e.g., OpenAI, Anthropic, Ollama, LangChain, Google, etc.). To get started, consider this bare-bones Chat example that uses no provider at all. It simply displays a starting message, and then, for each message the user submits, responds with "You said: " followed by that message:

App code:

```python
from shiny.express import ui

ui.page_opts(title="Hello Shiny Chat")

# Create a chat instance, with an initial message
chat = ui.Chat(
    id="chat",
    messages=[
        {"content": "Hello! How can I help you today?", "role": "assistant"},
    ],
)

# Display the chat
chat.ui()

# Define a callback to run when the user submits a message
@chat.on_user_submit
async def _():
    user_msg = chat.get_user_input()
    await chat.append_message(f"You said: {user_msg}")
```
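For reference, chat messages are plain dicts with `"role"` and `"content"` keys, matching the shape most LLM providers expect. A minimal sketch of that shape (note: `validate_messages()` here is a hypothetical helper for illustration, not part of the `shiny.ui.Chat` API):

```python
# Chat messages are plain dicts with "role" and "content" keys,
# the same shape most LLM provider APIs accept.
# NOTE: validate_messages() is a hypothetical helper for illustration,
# not part of the shiny.ui.Chat API.

def validate_messages(messages):
    """Check that each message has a known role and string content."""
    roles = {"system", "assistant", "user"}
    for m in messages:
        if m.get("role") not in roles:
            raise ValueError(f"Unknown role: {m.get('role')!r}")
        if not isinstance(m.get("content"), str):
            raise ValueError("Message content must be a string")
    return messages

messages = validate_messages([
    {"content": "Hello! How can I help you today?", "role": "assistant"},
    {"content": "What can you do?", "role": "user"},
])
```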


An "actual" LLM-powered chat bot might look something more like this (this example requires an OpenAI API key to run). Note also that, by default, responses are interpreted as markdown strings, and code blocks are rendered with code highlighting + copy/paste:

 
App code:

```python
# ------------------------------------------------------------------------------
# A basic Shiny Chat example powered by OpenAI via LangChain.
# To run it, you'll need an OpenAI API key.
# To get one, follow the instructions at https://platform.openai.com/docs/quickstart
# To use other providers/models via LangChain, see
# https://python.langchain.com/v0.1/docs/modules/model_io/chat/quick_start/
# ------------------------------------------------------------------------------
import os

from langchain_openai import ChatOpenAI
from shiny.express import ui

# Provide your API key here (or set the environment variable)
llm = ChatOpenAI(api_key=os.environ.get("OPENAI_API_KEY"))

# Set some Shiny page options
ui.page_opts(
    title="Hello LangChain Chat Models",
    fillable=True,
    fillable_mobile=True,
)

# Create and display an empty chat UI
chat = ui.Chat(id="chat")
chat.ui()

# Define a callback to run when the user submits a message
@chat.on_user_submit
async def _():
    # Get messages currently in the chat
    messages = chat.get_messages()
    # Create a response message stream
    response = llm.astream(messages)
    # Append the response stream into the chat
    await chat.append_message_stream(response)
```
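Since `chat.append_message_stream()` consumes the async stream returned by `llm.astream()`, a provider-free app can also "type out" its reply by handing it an async generator of string chunks. A sketch of just the generator part (assuming `append_message_stream()` accepts an arbitrary async iterable of strings, as it does the LangChain stream; the `collect()` helper is only for demonstration):

```python
import asyncio

# Sketch: an async generator of string chunks, suitable (under the
# assumption above) for passing to chat.append_message_stream().

async def echo_stream(user_msg: str):
    """Yield 'You said: <msg>' one word at a time, like a token stream."""
    for word in f"You said: {user_msg}".split():
        yield word + " "
        await asyncio.sleep(0)  # yield control, as a real token stream would

async def collect(stream) -> str:
    """Gather all chunks into one string (for demonstration only)."""
    return "".join([chunk async for chunk in stream])

print(asyncio.run(collect(echo_stream("hello"))))
```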


A whole collection of other examples is available under the `examples/chat` directory. See the `basic` and `enterprise` sub-directories for getting started with various providers.

Follow-up tasks