posit-dev / py-shiny

Shiny for Python
https://shiny.posit.co/py/
MIT License

Add support for Anthropic prompt caching #1755

Open wch opened 2 weeks ago

wch commented 2 weeks ago

This change adds support for Anthropic's beta prompt caching feature.

cpsievert commented 2 weeks ago

Here's a minimal example derived from https://docs.anthropic.com/en/docs/build-with-claude/prompt-caching

from anthropic import AsyncAnthropic
from app_utils import load_dotenv

from shiny.express import ui

load_dotenv()
llm = AsyncAnthropic()

chat = ui.Chat(id="chat")
chat.ui()
chat.update_user_input(value="Analyze the major themes in 'Pride and Prejudice'")

@chat.on_user_submit
async def _():
    response = await llm.beta.prompt_caching.messages.create(
        model="claude-3-5-sonnet-20241022",
        max_tokens=1024,
        stream=True,
        system=[
            {
                "type": "text",
                "text": "You are an AI assistant tasked with analyzing literary works. Your goal is to provide insightful commentary on themes, characters, and writing style.\n",
            },
            {
                "type": "text",
                "text": "<the entire contents of 'Pride and Prejudice'>",
                # Cache this large block so it is not reprocessed on
                # every subsequent request
                "cache_control": {"type": "ephemeral"},
            },
        ],
        messages=[{"role": "user", "content": chat.user_input()}],
    )
    await chat.append_message_stream(response)
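For reference, a cache-aware system prompt is just a list of text blocks where the block to be cached carries a `cache_control` field; everything up to and including that block becomes the cache prefix. A minimal sketch of a helper that builds such a list (the helper name and its parameters are hypothetical, not part of this PR):

```python
def build_system_blocks(instructions: str, large_context: str) -> list[dict]:
    """Assemble Anthropic-style system blocks, marking the large
    context block as cacheable with an ephemeral cache_control."""
    return [
        # Small, frequently-edited instructions stay uncached
        {"type": "text", "text": instructions},
        {
            "type": "text",
            "text": large_context,
            # The API caches the prompt prefix up to and including
            # this block for reuse on later requests
            "cache_control": {"type": "ephemeral"},
        },
    ]

blocks = build_system_blocks(
    "You are an AI assistant analyzing literary works.",
    "<the entire contents of 'Pride and Prejudice'>",
)
```

The resulting list can be passed directly as the `system=` argument in the example above.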