rudolfolah / chaincrafter

:hammer: Seamless integration and composability for large language model apps :paperclip:
https://rudolfolah.github.io/chaincrafter/
MIT License

Load chains and prompts from YAML using Catalogs #11

Closed rudolfolah closed 11 months ago

rudolfolah commented 11 months ago

Usage

```python
import os

from chaincrafter import Catalog

# Load the catalog
catalog = Catalog()
path = os.path.dirname(__file__)
catalog.load(os.path.join(path, "catalog.yml"))
print(catalog["hello_world_prompt"])
print(catalog.get_prompt("hello_world_prompt"))
# Run the chain as usual
catalog.get_chain("example_chain").run(chat_model, {"input": "var", ...})

# Example of providing overrides for prompt modifiers and input vars transformers
followup_question_prompt = catalog.get_prompt(
    "followup_question",
    [
        response_style("a pirate who has just returned from the Galapagos Islands"),
        response_length("short", "4-5 sentences"),
    ],
    # Parses and extracts data from the previous response to populate the input variable that is used by the prompt
    facts_list=lambda facts_list: extract_items_from_list(facts_list)[0],
)
```

Catalog YAML Format

YAML file format for Catalogs:

```yaml
prompts:
  helpful_assistant: You are a helpful assistant
  initial_code_review: |
    I just put a lot of work into this code, can you please review it and offer suggestions for improvement? Do not write any code.
    {diff}
  city_population: "{city} sounds like a nice place to visit. What is the population of {city}?"
chains:
  example:
```
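For context, here is a minimal sketch of how a catalog file in this format could be loaded, assuming PyYAML. The class and method names mirror the accessors shown in the Usage section above, but this is only an illustration of the proposed mapping from the YAML keys to the API, not the actual chaincrafter implementation.

```python
# Minimal sketch only: assumes PyYAML is installed and mirrors the accessors
# proposed above; the real Catalog implementation may differ.
import yaml


class CatalogSketch:
    def __init__(self):
        self._prompts = {}
        self._chains = {}

    def load(self, file_path):
        with open(file_path) as f:
            data = yaml.safe_load(f)
        # Top-level "prompts" maps a prompt name to its template string
        self._prompts = data.get("prompts", {})
        # Top-level "chains" maps a chain name to its definition
        self._chains = data.get("chains", {})

    def __getitem__(self, prompt_name):
        # e.g. catalog["helpful_assistant"] returns the raw template string
        return self._prompts[prompt_name]
```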