Closed shivangisharma991 closed 10 months ago
This appears to be caused by an incorrect setting in the Content Designer. Can you please check the value you have for `LLM_GENERATE_QUERY_MODEL_PARAMS` and verify that it is a valid JSON object, as specified in the README?
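As a quick sanity check, the value can be run through any strict JSON parser before pasting it into the Content Designer. The sketch below is illustrative only: the parameter names inside the example object (`temperature`, `maxTokens`) are hypothetical placeholders, not a confirmed QnABot schema.

```python
import json

def is_valid_json_object(value: str) -> bool:
    """Return True if `value` parses as a JSON object (a dict)."""
    try:
        return isinstance(json.loads(value), dict)
    except json.JSONDecodeError:
        return False

# A common mistake: single-quoted keys are Python syntax, not JSON.
bad = "{'temperature': 0}"
# Strict JSON requires double-quoted keys and string values.
good = '{"temperature": 0, "maxTokens": 256}'

print(is_valid_json_object(bad))   # False
print(is_valid_json_object(good))  # True
```

Anything that fails this check (single quotes, trailing commas, unquoted keys) will also fail when the LLM Lambda tries to parse the setting.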
Hi, can you also please clarify which version of QnABot you are using, and which LLM? The blog post you linked states: "When you deploy QnABot, you can choose to automatically deploy an open-source LLM model (Falcon-40B-instruct) on an Amazon SageMaker endpoint." Is that what you are using, or are you configuring your own LLM model via the plugin? Please also share which LLM model you are using and the steps you took to set it up.
> This appears to be caused by an incorrect setting in the Content Designer. Can you please check the value you have for `LLM_GENERATE_QUERY_MODEL_PARAMS` and verify that it is a valid JSON object, as specified in the README?
One of the values was set incorrectly; after correcting it, I now see responses coming from QnABot. Thanks for the response. I will proceed further and validate the other aspects.
That's good to hear. Good luck!
Describe the bug: After setting up QnABot as suggested in the AWS blog post (https://aws.amazon.com/blogs/machine-learning/deploy-generative-ai-self-service-question-answering-using-the-qnabot-on-aws-solution-powered-by-amazon-lex-with-amazon-kendra-and-amazon-bedrock/), I get an error when testing the bot. After investigating, I found that the error occurs in the LLM Lambda function.
To Reproduce: Steps to reproduce the behavior.
Expected behavior: A clear and concise description of what you expected to happen.
Please complete the following information about the solution:
To get the version of the solution, you can look at the description of the created CloudFormation stack. For example, "(SO0189) QnABot [...] v0.0.1".
Screenshots: If applicable, add screenshots to help explain your problem (please DO NOT include sensitive information).
Additional context: Add any other context about the problem here.