blackstork-io / fabric

An open-source command-line tool for reporting workflow automation and a configuration language for reusable templates. Reporting-as-Code
https://blackstork.io/fabric/
Apache License 2.0

Remove implicit prepend of the data to a prompt in `content.openai_text` #170

Closed traut closed 1 week ago

traut commented 2 weeks ago

Description

Currently, the `content.openai_text` content provider implicitly prepends the result of the `query` JQ query to the prompt string. This is confusing and limits the flexibility of the prompt instructions.
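
For illustration, under the current behaviour a block like the following (a rough sketch, reusing the data path from the example below) would have the JSON result of the query silently prepended to the prompt, even though the prompt text never mentions it:

content openai_text {
  query = ".data.inline.facts"

  # The query result is implicitly prepended to this string before it is
  # sent to the model, even though the prompt never references it.
  prompt = "Describe the facts provided in the json object."
}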

Design

Instead of prepending the data to the prompt, users must explicitly add .query_result (if they wish to) to the prompt template string.

To simplify serialisation, the toJson or toPrettyJson (docs) template functions from the sprig library can be used.

For example:

content openai_text {
  query = ".data.inline.facts"

  prompt = <<-EOT
     {{ toJson .query_result }}

     Describe the facts provided in the json object.
  EOT
}
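
If pretty-printed JSON reads better in the prompt, the same template can use toPrettyJson instead; this is a sketch with only the serialisation function changed:

content openai_text {
  query = ".data.inline.facts"

  prompt = <<-EOT
     {{ toPrettyJson .query_result }}

     Describe the facts provided in the json object.
  EOT
}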