Open NotBioWaste905 opened 3 months ago
I got an idea for more complex prompts: we can allow passing responses as prompts instead of just strings.
And then it'd be possible to incorporate slots into a prompt:
```python
model = LLM_API(prompt=rsp.slots.FilledTemplate(
    "You are an experienced barista in a local coffee shop.\n"
    "Answer your customers' questions about coffee and barista work.\n"
    "Customer data:\nAge: {person.age}\nGender: {person.gender}\nFavorite drink: {person.habits.drink}"
))
```
Description
Added functionality for calling LLMs via the langchain API so they can be used in responses and conditions.
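For reference, a minimal sketch of what the response and condition sides could look like on top of langchain's chat model interface. Only the langchain calls (`ChatOpenAI`, `invoke`, the message classes) are existing API; the `LLM_API` wrapper, its method names, and the yes/no condition heuristic are assumptions:

```python
from langchain_openai import ChatOpenAI
from langchain_core.messages import SystemMessage, HumanMessage


class LLM_API:
    """Hypothetical wrapper around a langchain chat model."""

    def __init__(self, model_name: str, prompt: str):
        self.model = ChatOpenAI(model=model_name)
        self.prompt = prompt

    def respond(self, user_message: str) -> str:
        # Response: send the system prompt plus the latest user turn to the model.
        reply = self.model.invoke(
            [SystemMessage(content=self.prompt), HumanMessage(content=user_message)]
        )
        return reply.content

    def check(self, user_message: str, question: str) -> bool:
        # Condition: use the model as a yes/no classifier over the user turn.
        verdict = self.model.invoke(
            [SystemMessage(content=question), HumanMessage(content=user_message)]
        )
        return verdict.content.strip().lower().startswith("yes")
```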