Prompty makes it easy to create, manage, debug, and evaluate LLM prompts for your AI applications. Prompty is an asset class and format for LLM prompts designed to enhance observability, understandability, and portability for developers.
Guidance allows users to constrain generation (e.g. with regex and CFGs) as well as to interleave control (conditionals, loops, tool use) and generation for more predictable outputs.
Guidance was recently integrated with Phi-3.5-mini in AI Studio, and it is a powerful paradigm for latency and cost optimization that could also be used within Prompty for hardening and testing prompt controls.
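To make the idea concrete, here is a minimal, self-contained sketch of regex-constrained generation. All names (`fill_constrained`, `reg1`, the sample completions) are hypothetical illustrations, not part of Guidance or Prompty; a real engine like Guidance constrains decoding token by token rather than filtering whole samples after the fact.

```python
import re

def fill_constrained(template, slot, candidates, pattern):
    """Fill a {{slot}} in the template with the first candidate
    completion that fully matches the regex constraint."""
    regex = re.compile(pattern)
    for cand in candidates:
        if regex.fullmatch(cand):
            return template.replace("{{" + slot + "}}", cand)
    raise ValueError("no candidate satisfied the constraint")

# Stand-ins for raw model samples of varying verbosity.
samples = ["Sydney, probably?", "Canberra", "I think it's Canberra."]

out = fill_constrained(
    "The capital of Australia is {{reg1}}.",
    "reg1",
    samples,
    r"[A-Z][a-z]+",  # constraint: a single capitalized word
)
print(out)  # → The capital of Australia is Canberra.
```

The point of the constraint is predictability: however chatty the model's raw samples are, the text placed into the slot is guaranteed to match the declared pattern.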
Based on this example, the prompty file could have a `guidance` section:

```
---
guidance: ${file: guidance.json}
---
user:
"What is the capital of Australia?"
"The capital of Australia is {{reg1}}"
```
The referenced `guidance.json` file would then contain the corresponding guidance program for the constrained slots.
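As a purely illustrative sketch (this schema is hypothetical, not an existing Guidance or Prompty format), such a file might map each template slot to its generation constraint:

```
{
  "reg1": {
    "type": "gen",
    "regex": "[A-Z][a-z]+",
    "max_tokens": 5
  }
}
```

Keeping the constraints in a separate file would let the same prompty prompt be hardened or relaxed without editing the prompt text itself.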