aws-solutions / qnabot-on-aws

AWS QnABot is a multi-channel, multi-language conversational interface (chatbot) that responds to your customers' questions, answers, and feedback. The solution allows you to deploy a fully functional chatbot across multiple channels including chat, voice, SMS, and Amazon Alexa.
https://aws.amazon.com/solutions/implementations/aws-qnabot
Apache License 2.0

Getting Syntax Error on LLM Model Lambda function #662

Closed — shivangisharma991 closed this issue 10 months ago

shivangisharma991 commented 10 months ago

Describe the bug: After setting up QnABot as suggested in the AWS blog post (https://aws.amazon.com/blogs/machine-learning/deploy-generative-ai-self-service-question-answering-using-the-qnabot-on-aws-solution-powered-by-amazon-lex-with-amazon-kendra-and-amazon-bedrock/), I am getting an error when testing the bot. After investigating, I found that the error occurs in the LLM Lambda function.

Screenshots (no sensitive information included): Screenshot 2023-11-14 at 13 50 52, Screenshot 2023-11-14 at 14 25 14

marcburnie commented 10 months ago

This appears to be caused by an incorrect setting in the Content Designer. Can you please check the value you have for LLM_GENERATE_QUERY_MODEL_PARAMS and verify that it is a valid JSON object, as specified in the README?
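
For reference, a quick way to sanity-check the setting is to paste its value into a small script and confirm it parses as a JSON object. A minimal sketch follows; the example value is illustrative only, and the parameters your model actually accepts are listed in the README:

```python
# Minimal sketch (not part of QnABot): check that the string pasted into the
# Content Designer setting LLM_GENERATE_QUERY_MODEL_PARAMS parses as a JSON object.
import json

setting_value = '{"temperature": 0}'  # replace with your setting value; example only

try:
    params = json.loads(setting_value)
    if not isinstance(params, dict):
        raise ValueError(f"setting must be a JSON object, got {type(params).__name__}")
    print("OK:", params)
except (json.JSONDecodeError, ValueError) as err:
    print("Invalid LLM_GENERATE_QUERY_MODEL_PARAMS:", err)
```

Common causes of a parse failure are single quotes instead of double quotes around keys and values, or a trailing comma.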

fhoueto-amz commented 10 months ago

Hi, can you also please clarify the version of QnABot that you are using? Which LLM are you using? The blog post you reference states: "When you deploy QnABot, you can choose to automatically deploy an open-source LLM model (Falcon-40B-instruct) on an Amazon SageMaker endpoint." Is that what you are using, or are you configuring your own LLM model via the plugin? Please also share which LLM model you are using and the steps taken to set it up.

shivangisharma991 commented 10 months ago

> This appears to be caused by an incorrect setting in the Content Designer. Can you please check the value you have for LLM_GENERATE_QUERY_MODEL_PARAMS and verify that it is a valid JSON object, as specified in the README?

One of the values was set incorrectly; after correcting it, I now see responses coming from QnABot. Thanks for the help. I will proceed further and validate the other aspects.

marcburnie commented 10 months ago

That's good to hear. Good luck!