griptape-ai / griptape

Modular Python framework for AI agents and workflows with chain-of-thought reasoning, tools, and memory.
https://www.griptape.ai
Apache License 2.0

Use rich.print in Chat utility #746

Open shhlife opened 2 months ago

shhlife commented 2 months ago

**Describe the solution you'd like**

I would like to use the Rich library in the Chat utility to make it easier to distinguish the Agent's output from the user's. We currently have the ability to modify the intro_text, prompt_prefix, processing_text, and response_prefix, but if the user includes Rich markup in those strings, it isn't rendered.

For example:

Chat(
    agent,
    intro_text="\n[red]Welcome to Griptape Chat![/red]\n",
    prompt_prefix="\n[steel_blue3]You",
    processing_text="\n[aquamarine3]:thought_balloon: Thinking...",
    response_prefix="\n[pink3]Agent: ",
).start()

does not apply the colors when run; the markup tags are printed literally.

**Describe alternatives you've considered**

I have tried creating my own formatted responses, which works for some of them, but using the Rich library inside the Chat utility would make this much nicer.

Adding this line at the start of chat.py helps:

from rich import print  # shadows the builtin print so Rich markup is rendered

Making these changes would allow the user to have their output look something like:

(screenshot of the formatted chat output)
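For reference, the markup rendering described above can be seen with Rich alone. A minimal sketch (assuming the rich package is installed), using a StringIO-backed Console so the result is easy to inspect:

```python
from io import StringIO

from rich.console import Console

# Render Rich markup to a plain (non-terminal) buffer: the tags are
# consumed by Rich's markup parser rather than printed literally.
buf = StringIO()
console = Console(file=buf, force_terminal=False)
console.print("[red]Welcome to Griptape Chat![/red]")

rendered = buf.getvalue()
# The [red]...[/red] tags are gone; the builtin print would keep them verbatim.
```

The builtin print, by contrast, has no markup parser, which is why the tags show up as literal text in the Chat output.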

michal-repo commented 2 months ago

I guess you can already do this using the output_fn callback.

from rich import print as rich_print

Chat(
    structure=agent,
    intro_text="\n[red]Welcome to Griptape Chat![/red]\n",
    prompt_prefix="\n[steel_blue3]You: ",
    processing_text="\n[aquamarine3]:thought_balloon: Thinking...",
    response_prefix="\n[pink3]Agent: ",
    output_fn=rich_print,
).start()

(screenshot of the colored chat output)

collindutter commented 2 months ago

@shhlife is the solution proposed by @michal-repo sufficient? If we move rich into Chat it may be difficult to land on a good set of defaults. I think I would prefer to keep Chat relatively simple and then users can customize.

Open to discussion though; I haven't used rich enough to know what's possible with minimal configuration.

shhlife commented 1 month ago

@collindutter - sorry, just saw this reply! :)

I've done some work using output_fn, but it doesn't quite work right.

Notice how the prompt_prefix isn't rendered correctly, and the output inserts extra newlines:

(screenshot of the broken prompt prefix and extra newlines)

michal-repo commented 1 month ago

@shhlife the prompt_prefix issue happens because Chat calls input(self.prompt_prefix), and the builtin input writes its prompt directly to stdout, so Rich markup in the prefix is never rendered.

To solve this input issue you could introduce another Callable on the Chat utility, e.g. input_fn, and use rich.console in your custom code like so:

from rich.console import Console

console = Console()

Chat(
    structure=agent,
    intro_text="\n[red]Welcome to Griptape Chat![/red]\n",
    prompt_prefix="\n[steel_blue3]You: ",
    processing_text="\n[aquamarine3]:thought_balloon: Thinking...",
    response_prefix="\n[pink3]Agent: ",
    input_fn=console.input,
).start()
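For anyone following along, the proposed hook is easy to picture. A minimal sketch of the idea, where MiniChat is a hypothetical stand-in for illustration only, not Griptape's actual Chat class (the real change would add an input_fn field alongside the existing output_fn):

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class MiniChat:
    # Hypothetical stand-in for the proposed API; the field names mirror
    # the real Chat utility, but this is a sketch, not Griptape code.
    prompt_prefix: str = "You: "
    response_prefix: str = "Agent: "
    input_fn: Callable[[str], str] = input      # swap in console.input for Rich prompts
    output_fn: Callable[[str], None] = print    # swap in rich.print for Rich output

    def turn(self, respond: Callable[[str], str]) -> str:
        # Read user input via the injected input_fn, so the prompt prefix
        # goes through a renderer that understands Rich markup.
        user_text = self.input_fn(self.prompt_prefix)
        reply = respond(user_text)
        self.output_fn(self.response_prefix + reply)
        return reply

# Example: inject canned I/O instead of the real terminal.
captured: list[str] = []
chat = MiniChat(input_fn=lambda prefix: "hello", output_fn=captured.append)
chat.turn(lambda text: text.upper())
```

With console.input and rich_print injected instead of the test stubs, both the prompt prefix and the responses would be rendered by Rich.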

The extra newlines are, I guess, a result of streaming; the default output function already handles this by printing with end="" and flush=True when the driver streams. So you can do a similar thing with rich.

from rich import print as rich_print

def rich_printer(text: str):
    # When the prompt driver streams, chunks arrive mid-line:
    # suppress the trailing newline and flush immediately.
    if agent.config.global_drivers.prompt_driver.stream:
        rich_print(text, end="", flush=True)
    else:
        rich_print(text)

Chat(
    structure=agent,
    intro_text="\n[red]Welcome to Griptape Chat![/red]\n",
    prompt_prefix="\n[steel_blue3]You: ",
    processing_text="\n[aquamarine3]:thought_balloon: Thinking...",
    response_prefix="\n[pink3]Agent: ",
    output_fn=rich_printer,
).start()
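The end=""/flush pattern above is worth isolating: when the driver streams, each call to output_fn receives a chunk rather than a full line, so the trailing newline has to be suppressed. A sketch of the same logic with the builtin print (rich.print accepts the same end and flush keywords):

```python
import io
from contextlib import redirect_stdout

def make_printer(stream: bool):
    """Return an output function suited to streamed or whole-message output."""
    def printer(text: str) -> None:
        if stream:
            # Chunks arrive mid-line: join them and flush immediately.
            print(text, end="", flush=True)
        else:
            print(text)
    return printer

# Streamed chunks join into a single line instead of one line per chunk.
buf = io.StringIO()
with redirect_stdout(buf):
    streamed = make_printer(stream=True)
    for chunk in ("Hel", "lo", "!"):
        streamed(chunk)
streamed_out = buf.getvalue()
```

Without end="", each chunk would land on its own line, which matches the extra-newline symptom in the screenshot above.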

shhlife commented 1 month ago

Hey @michal-repo - thanks for the response!

Yes, I could do that, but I think once we get to this point, I'd rather have this functionality built into the Chat utility so it's reusable elsewhere. :)