aws-samples / aws-genai-llm-chatbot

A modular and comprehensive solution to deploy a Multi-LLM and Multi-RAG powered chatbot (Amazon Bedrock, Anthropic, HuggingFace, OpenAI, Meta, AI21, Cohere, Mistral) using AWS CDK on AWS
https://aws-samples.github.io/aws-genai-llm-chatbot/
MIT No Attribution

Receiving error when using Playground #156

Closed: kvcarteraws closed this issue 12 months ago

kvcarteraws commented 12 months ago

Thanks so much for your help building this amazing service! Unfortunately, I am receiving an issue when running the playground for the first time. When I attempt any queries, I am receiving the error below:

<class 'ValueError'>:Error raised by bedrock service: An error occurred (ValidationException) when calling the InvokeModelWithResponseStream operation: The requested operation is not recognized by the service.

Does anyone have suggestions on troubleshooting steps I might be able to try? Thanks!

bigadsoleiman commented 12 months ago

Have you requested access to the base models? Also note that the Titan models are still in preview, so once you have confirmed base model access you may want to switch to the Claude, Cohere, or AI21 models.

For base model access check out https://github.com/aws-samples/aws-genai-llm-chatbot#amazon-bedrock-requirements
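As a quick programmatic check alongside the console, the control-plane bedrock client can list the foundation models visible in your region. This is a minimal sketch assuming boto3 is installed and AWS credentials are configured; the region and helper names are illustrative:

```python
def extract_model_ids(response: dict) -> list[str]:
    # Pull the modelId values out of a ListFoundationModels response.
    return [m["modelId"] for m in response.get("modelSummaries", [])]

def list_bedrock_models(region: str = "us-east-1") -> list[str]:
    # Requires credentials with the bedrock:ListFoundationModels permission.
    # Note this lists models offered in the region; granted access is
    # managed separately in the Bedrock console under "Model access".
    import boto3
    bedrock = boto3.client("bedrock", region_name=region)
    return extract_model_ids(bedrock.list_foundation_models())
```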

kvcarteraws commented 12 months ago

Hi, I have requested access to the base models, but none of the models I'm selecting are working properly. Do you have any additional thoughts on how to troubleshoot this issue? Thanks!

kvcarteraws commented 12 months ago

Here is a screenshot of my current base model access that verifies I have access to each of the model types.

[Screenshot: Model Access]

kvcarteraws commented 12 months ago

I redeployed the CDK solution using the bedrock-runtime API endpoint during the npm run create portion of the setup, and this time the solution worked properly.
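For anyone hitting the same ValidationException: InvokeModelWithResponseStream is served by the bedrock-runtime service, not the bedrock control-plane service, which only exposes management APIs such as ListFoundationModels. A minimal boto3 sketch of a streaming call, with the model ID and Claude request body shape as illustrative assumptions:

```python
import json

def build_claude_body(prompt: str, max_tokens: int = 256) -> str:
    # Anthropic Claude text-completion request body (illustrative values).
    return json.dumps({
        "prompt": f"\n\nHuman: {prompt}\n\nAssistant:",
        "max_tokens_to_sample": max_tokens,
    })

def stream_completion(prompt: str, model_id: str = "anthropic.claude-v2"):
    # The client must be "bedrock-runtime": calling the control-plane
    # "bedrock" endpoint for this operation produces the
    # "requested operation is not recognized" ValidationException above.
    import boto3
    client = boto3.client("bedrock-runtime")
    response = client.invoke_model_with_response_stream(
        modelId=model_id,
        body=build_claude_body(prompt),
    )
    for event in response["body"]:
        chunk = json.loads(event["chunk"]["bytes"])
        yield chunk.get("completion", "")
```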