giusedroid closed this pull request 4 months ago
Overall very useful change!!
Currently, the chat history is injected into the prompt inside <history> </history> tags. Instead of that, we can prepare the list like this on the frontend side:

[
{
role: "user",
content: [{ text: "first question" }],
},
{
role: "agent",
content: [{ text: "first answer" }],
},
{
role: "user",
content: [{ text: "second question" }],
},
{
role: "agent",
content: [{ text: "second answer" }],
},
];
And pass this array to the lambda.
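For illustration, a minimal frontend sketch of that hand-off (the /chat endpoint, the chatHistory Local Storage key, and the question text are assumptions, not part of this PR):

// Hypothetical sketch: read the saved history and send it to the Lambda.
const history = JSON.parse(localStorage.getItem("chatHistory") ?? "[]");
const response = await fetch("/chat", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({ question: "third question", history }),
});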
const conversation = [
{
role: "system", <- BY defining this, LLM will understand that this is the ultimate task we want to achieve
content: [{ text: "prompt without any mention about history" }],
},
];
conversation.push(...history);
This approach takes advantage of the Converse API's existing ability to track conversation turns instead of doing that work on our side.
ah interesting! I'll try to make these changes asap :D
bumping into this error: An error occurred while invoking the selected model. 1 validation error detected: Value 'system' at 'messages.1.member.role' failed to satisfy constraint: Member must satisfy enum value set: [user, assistant]
Having a look. Looks like system is not supported 🤔 maybe I need to upgrade the Bedrock SDK or something?
Implemented 😸
The Converse API supports the system prompt as one of its arguments. Neat!
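For anyone landing here later, a minimal sketch of that final shape, assuming the AWS SDK for JavaScript v3 (@aws-sdk/client-bedrock-runtime); the region, modelId, history, and userQuestion values are placeholders, not taken from this PR:

import { BedrockRuntimeClient, ConverseCommand } from "@aws-sdk/client-bedrock-runtime";

const client = new BedrockRuntimeClient({ region: "us-east-1" }); // region is an assumption

const response = await client.send(
  new ConverseCommand({
    modelId: "anthropic.claude-3-sonnet-20240229-v1:0", // placeholder model
    // The system prompt is a top-level argument, not a message role.
    system: [{ text: "prompt without any mention about history" }],
    // messages may only contain the roles "user" and "assistant".
    messages: [...history, { role: "user", content: [{ text: userQuestion }] }],
  })
);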
Issue #35, if available:
Description of changes:
Front-end, back-end, and prompt changes to enable chat history management for users.
This only makes use of Local Storage.
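As a rough sketch of what that Local Storage persistence can look like (the helper name and storage key are hypothetical, not taken from this PR):

// Hypothetical helper: persist the running conversation after each turn.
function appendTurn(question, answer) {
  const history = JSON.parse(localStorage.getItem("chatHistory") ?? "[]");
  history.push(
    { role: "user", content: [{ text: question }] },
    { role: "assistant", content: [{ text: answer }] }
  );
  localStorage.setItem("chatHistory", JSON.stringify(history));
}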
By submitting this pull request, I confirm that you can use, modify, copy, and redistribute this contribution, under the terms of your choice.