Prompty is an asset class and format for LLM prompts designed to enhance observability, understandability, and portability for developers. The primary goal is to accelerate the developer inner loop.
This repo contains the Prompty Visual Studio Code extension, which offers an intuitive prompt playground inside VS Code to streamline the prompt engineering process. You can find the Prompty extension in the Visual Studio Code Marketplace.
Prompty standardizes prompts and their execution into a single asset.
Quickly create a basic prompty by right-clicking in the VS Code explorer and selecting "New Prompty."
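A newly created prompty combines metadata, model configuration, sample inputs, and the prompt template in one file. The sketch below shows the general shape; the exact scaffold the extension generates may differ, and the deployment name and sample values here are placeholders:

```yaml
---
name: Basic Prompt
description: A basic prompt that answers a user question
model:
  api: chat
  configuration:
    type: azure_openai
    azure_deployment: gpt-35-turbo   # placeholder deployment name
inputs:
  question:
    type: string
sample:
  question: What can you do for me?
---
system:
You are a helpful assistant.

user:
{{question}}
```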
Preview a prompty much like a markdown file, with dynamic template rendering as you type, so you can see the exact prompt that will be sent to the model.
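The preview substitutes sample input values into the template's placeholders. A minimal sketch of that idea, assuming Jinja-style `{{name}}` syntax (this regex-based renderer is only an illustration, not the extension's actual templating engine):

```python
import re


def render(template: str, inputs: dict) -> str:
    # Replace each {{name}} placeholder with the matching sample input
    # value; placeholders with no matching input are left untouched.
    return re.sub(
        r"\{\{\s*(\w+)\s*\}\}",
        lambda m: str(inputs.get(m.group(1), m.group(0))),
        template,
    )


prompt = "user:\n{{question}}"
print(render(prompt, {"question": "What can you do for me?"}))
```

This mirrors what the preview pane shows: the template with sample values filled in, i.e. the prompt as the model would receive it.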
Define your model configurations directly in VS Code.
Quickly switch between different model configurations.
Use VS Code settings to define model configurations at the user level or the workspace level.
We strongly encourage using Azure Active Directory authentication for enhanced security; leave `api_key` empty to trigger AAD auth.
OpenAI is also supported. You can store the key in VS Code settings, or use `${env:xxx}` to read the API key from an environment variable.
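A model configuration entry in VS Code settings might look like the sketch below. The field names follow the shapes described above, but the endpoint, deployment, and variable names are placeholders; consult the extension's settings schema for the exact keys:

```json
{
  "prompty.modelConfigurations": [
    {
      "name": "default",
      "type": "azure_openai",
      "api_version": "2023-12-01-preview",
      "azure_endpoint": "https://<your-resource>.openai.azure.com",
      "azure_deployment": "gpt-35-turbo",
      "api_key": ""
    },
    {
      "name": "gpt-3.5-turbo",
      "type": "openai",
      "api_key": "${env:OPENAI_API_KEY}"
    }
  ]
}
```

Note the empty `api_key` in the first entry, which triggers AAD authentication, while the second entry reads an OpenAI key from an environment variable.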
Alternatively, store the key in a `.env` file located in the same folder as the prompty file or in the workspace root folder.

Hit F5 or click the Run button at the top. There are two output windows:
Prompty Output shows a concise view.
Prompty Output (Verbose) shows detailed requests sent and received.
Prompty is supported by popular orchestration frameworks. Right-click a `.prompty` file to quickly generate integration snippets for them.
Submit your feedback about Prompty or the VS Code extension to the Microsoft/prompty GitHub repository.