Closed: ianarawjo closed this issue 1 year ago.
Partly solved by https://github.com/ianarawjo/ChainForge/commit/09871cdc1ff3a0449ae7bf073eb577b261a4fde7
But it would be nice to keep track of "vars" instead of flattening all responses into a single list, so that chaining prompts together doesn't lose information about what generated each prompt. For instance, see the following image:
Notice we've lost track of the "tool" property in the outputs to the right. Ideally, we'd still have access to upstream vars.
This is now fixed: downstream nodes now have access to upstream vars. Prompt parameter values are allowed to be dicts of the form {text, fill_history}, and the fill_history is carried over to the new PromptTemplate.
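Conceptually, the mechanism looks something like the sketch below (a standalone illustration, not ChainForge's actual code; the class and the merge logic are simplified assumptions):

```typescript
// A minimal sketch of carrying upstream vars forward: parameter values may be
// either plain strings or objects of the form {text, fill_history}.
type ParamValue = string | { text: string; fill_history: Record<string, string> };

class PromptTemplate {
  constructor(
    public template: string,
    public fill_history: Record<string, string> = {},
  ) {}

  // Fill one {varname} slot, merging any upstream fill_history into the new template.
  fill(varname: string, value: ParamValue): PromptTemplate {
    const text = typeof value === "string" ? value : value.text;
    const upstream = typeof value === "string" ? {} : value.fill_history;
    const filled = this.template.replaceAll(`{${varname}}`, text);
    // Record both the upstream vars and the var we just filled.
    return new PromptTemplate(filled, { ...this.fill_history, ...upstream, [varname]: text });
  }
}

// Example: a downstream prompt still "remembers" that tool=calculator upstream.
const upstreamResponse = { text: "42", fill_history: { tool: "calculator" } };
const t = new PromptTemplate("Explain the answer {answer}.").fill("answer", upstreamResponse);
console.log(t.template);      // "Explain the answer 42."
console.log(t.fill_history);  // { tool: "calculator", answer: "42" }
```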
I tried to do this but got an error. Am I doing it wrong? Here is a screenshot of the set up:
The error shown when I click the play button on my 'CleanUp' Prompt Node is:
Cannot send a prompt 'Analyse the table below of recommendations to improve the requirements of a user story. Identify any duplicates by using the first instance as the template and then adding any further detail from the copies.
Here is the table:
[
{
"Id": 1,
"Criteria": "<redacted>",
"Recommendation": "<redacted>",
"Justification": "<redacted>"
},
{
"Id": 2,
"Criteria": "<redacted>",
"Recommendation": "<redacted>",
"Justification": "<redacted>"
},
{
"Id": 3,
"Criteria": "<redacted>",
"Recommendation": "<redacted>",
"Justification": "<redacted>"
}
]' to LLM: Prompt is a template.
Also, is there a way to amalgamate all the results from my first Prompt Node so that the second Prompt Node processes them together?
Ah, it seems the issue is caused by the braces { } in the output: we'll need to escape braces in the outputs of Prompt Nodes. Good catch, I'll work on this in a moment.
For the aggregation idea, can you open up a separate Issue? Thanks!
Okay, I've found the bug. Will push an update in a minute. Thanks for spotting this!
Any braces in the outputs of Prompt Nodes will now be escaped as \{ and \} by default, so there's no confusion with template variables. Note that all escaped braces \{ and \} are replaced with plain { and } before being sent off as prompts to LLMs.
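For reference, the round-trip behaves roughly like this sketch (illustrative only; it assumes backslash-escaped braces and is not the exact patch):

```typescript
// Escape braces in node outputs so they aren't parsed as template variables,
// then unescape just before the prompt is sent to the LLM.
function escapeBraces(s: string): string {
  return s.replace(/[{}]/g, (b) => "\\" + b);
}
function unescapeBraces(s: string): string {
  return s.replace(/\\([{}])/g, "$1");
}

const output = '{"Id": 1, "Criteria": "..."}';  // JSON from an upstream Prompt Node
const safe = escapeBraces(output);              // braces no longer look like {vars}
console.log(unescapeBraces(safe) === output);   // true: restored before the LLM call
```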
The bug is now patched in all versions of the app. Thanks again!
Add the ability to attach a `PromptNode` as the input to another `PromptNode`.

In `output` in store.js, you need to check if the input node is of type `prompt`, and then, instead of getting the "fields" param in the props, you'd get the ID of the prompt node and request the responses from the backend.
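A hypothetical sketch of that check (the node shape and the `fetchResponsesById` helper are assumptions for illustration, not the real store.js API):

```typescript
// Hypothetical sketch of the `output` check described above.
interface FlowNode {
  id: string;
  type: string;               // e.g. "prompt", "textfields"
  data: { fields?: string[] };
}

async function getNodeOutput(node: FlowNode): Promise<string[]> {
  if (node.type === "prompt") {
    // Prompt Nodes have no "fields" prop: ask the backend for the
    // responses cached under this prompt node's ID instead.
    return fetchResponsesById(node.id);
  }
  // Other input nodes (e.g. TextFields) expose their values directly.
  return node.data.fields ?? [];
}

// Assumed helper: in practice this would hit the backend's responses endpoint.
async function fetchResponsesById(id: string): Promise<string[]> {
  const res = await fetch(`/api/responses?id=${encodeURIComponent(id)}`);
  return res.json();
}
```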