dreadnode / rigging

Lightweight LLM Interaction Framework
https://rigging.dreadnode.io
MIT License

[Feature Request] Support for passing `rg.Tool` objects to parse methods? #4

Open SJCaldwell opened 4 months ago

SJCaldwell commented 4 months ago

This is a matter of taste, but I'm wondering if you'd be interested in a PR to support parsing rg.Tool calls from messages.

In continued chats with multiple tool calls, I usually provide a tool like

class FinalAnswer(rg.Tool):
    @property
    def name(self) -> str:
        return "final_answer"

    @property
    def description(self) -> str:
        return "A tool that ends the session with your final true or false answer to the question: 'blah blah"

I then continue the chat until this tool is called, breaking at that point. So it'd be nice to do something like

while not chat.last.parse(FinalAnswer):
    # keep doing stuff

Though that might require a new property for chat.last_user or chat.last_tool or something.
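For illustration only, the kind of thing I'm picturing (completely hypothetical, neither the Tool-parsing overload above nor chat.last_tool exist today):

# Hypothetical sketch of the requested behavior -- nothing here is real rigging API.
while True:
    chat = ...  # generate the next step of the session however you normally do
    if chat.last_tool is not None and chat.last_tool.name == "final_answer":
        break  # the model has committed to its final true/false answer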

Also willing to believe there is a straightforward way of doing this that I'm just missing. Anyway, lemme know!

monoxgas commented 4 months ago

This is an interesting idea. I'll give some thought to how that would integrate into the library. I'd imagine the best alternative is to define a "FinalAnswer" model and use until_parsed_as() along with using() to let the model cycle through tool calls until it's ready to provide an answer. We did something similar for our small CTF example:

import rigging as rg

class FinalAnswer(rg.Model):
    content: str

...

chat = generator.chat([
    {
        "role": "user",
        "content": f"[do some stuff]. When you are ready, place your final answer between {FinalAnswer.xml_example()} tags."
    }
]) \
.using(Tool) \
.until_parsed_as(FinalAnswer, max_rounds=100) \
.run()  # the LLM will continue calling tools until a FinalAnswer is emitted

answer = chat.last.parse(FinalAnswer)
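
From there, answer is just the parsed FinalAnswer instance, so (assuming the model above) the text the LLM produced is available directly:

print(answer.content)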