Sheikh-dev closed this issue 5 months ago.
Hi @Sheikh-dev, apologies, I missed committing my changes. Could you delete the LlmsWithServerlessRagdevStack and ApiGwLlmsLambdadevStack CloudFormation stacks and redeploy with the latest changes (git pull from the main branch)?
I was working on making AOSS (OpenSearch Serverless) optional and also adding Bedrock Agents for function calling, so the deployment process may have an extra step where you choose to implement RAG with AOSS or simply try out Bedrock. The Bedrock Agent integration is still pending and will take some time.
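For reference, the cleanup-and-redeploy steps above can be sketched as a script. The stack names come from this thread; the region/profile are assumed to be configured already, and the `DRY_RUN` guard just prints each command so you can review before running for real.

```shell
#!/bin/sh
# Sketch of the cleanup/redeploy steps (stack names from the thread above).
# Set DRY_RUN to empty to actually execute the commands.
DRY_RUN=1
run() { echo "+ $*"; [ -n "$DRY_RUN" ] || "$@"; }

# Delete both stacks and wait for deletion to finish
run aws cloudformation delete-stack --stack-name LlmsWithServerlessRagdevStack
run aws cloudformation delete-stack --stack-name ApiGwLlmsLambdadevStack
run aws cloudformation wait stack-delete-complete --stack-name LlmsWithServerlessRagdevStack
run aws cloudformation wait stack-delete-complete --stack-name ApiGwLlmsLambdadevStack

# Pull the latest changes, then rerun the repo's deployment script
run git pull origin main
```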
Hi Sheikh,
I've fixed the issue; I missed committing my changes. Could you retry with a fresh deployment?
Thanks, Fraser
On Wed, 13 Mar 2024, 7:47 pm, Sheikh-dev wrote:
Getting this error:
"{"success":false,"errorMessage":"Exception occured when querying LLM: An error occurred (validationException) when calling the InvokeModelWithResponseStream operation: \"claude-3-sonnet-20240229\" is not supported on this API. Please use the Messages API instead.","statusCode":"400"}"
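That validation error indicates the code was sending Claude 3 a request body in the legacy Text Completions shape; the Claude 3 models on Bedrock only accept the Messages API schema. A minimal sketch of the two payloads (the prompt text and token limit are illustrative; the Bedrock model ID in the comment is the standard one for Claude 3 Sonnet):

```python
import json

# Legacy Text Completions payload -- Claude 3 models reject this shape,
# producing the validationException quoted above.
legacy_body = {
    "prompt": "\n\nHuman: What is RAG?\n\nAssistant:",
    "max_tokens_to_sample": 512,
}

# Messages API payload, which Claude 3 on Bedrock requires.
# "bedrock-2023-05-31" is the anthropic_version value Bedrock expects.
messages_body = {
    "anthropic_version": "bedrock-2023-05-31",
    "max_tokens": 512,
    "messages": [{"role": "user", "content": "What is RAG?"}],
}

body = json.dumps(messages_body)

# The streaming call would then look like this (credentials/region assumed
# to be configured already):
# import boto3
# bedrock = boto3.client("bedrock-runtime")
# resp = bedrock.invoke_model_with_response_stream(
#     modelId="anthropic.claude-3-sonnet-20240229-v1:0",
#     body=body,
# )
```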