aws-samples / aws-lex-conv-faq

Demonstration of LLM integration into a lex bot using Lambda codehooks and a Sagemaker endpoint.
MIT No Attribution

Interaction between lex-codehook-fn and Lex Test Chat #12

Open carbonless opened 7 months ago

carbonless commented 7 months ago

When I ask a question in Lex using the Test Chatbot, it sends the query to lex-codehook-fn and the Lambda function starts working. Using DEBUG I can see that it gets my question, produces a prompt for the LLM, and even returns the top 5 results internally, but for some reason (even when I lower the threshold to 0.1) I never get an answer returned.

    query_engine = RetrieverQueryEngine(retriever=retriever, response_synthesizer=synth)
    query_input = event["inputTranscript"]

    try:
        answer = query_engine.query(query_input)
        if answer.source_nodes[0].score < RETRIEVAL_THRESHOLD:
            answer = OUT_OF_DOMAIN_RESPONSE
    except:
        answer = OUT_OF_DOMAIN_RESPONSE

In fact I swapped the OUT_OF_DOMAIN_RESPONSE strings between the try and except branches, and what is happening is that the "try" block is falling through to the except. This seems to mean the model does not have confidence in the answers, even though they are coming in with a confidence of 0.6 and higher?
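Since the bare `except:` in the snippet above swallows whatever error is actually being raised, one way to see why the try block fails would be to log the exception before falling back. This is just a sketch of that idea (the `answer_query` wrapper and the constant values are mine, not from the repo):

```python
import logging

logger = logging.getLogger()
logger.setLevel(logging.DEBUG)

# Assumed values for illustration only
RETRIEVAL_THRESHOLD = 0.1
OUT_OF_DOMAIN_RESPONSE = (
    "I'm sorry, but I am only able to give responses regarding the source topic"
)


def answer_query(query_engine, query_input):
    """Run the query, but log the real exception instead of silently
    returning OUT_OF_DOMAIN_RESPONSE."""
    try:
        answer = query_engine.query(query_input)
        if answer.source_nodes[0].score < RETRIEVAL_THRESHOLD:
            return OUT_OF_DOMAIN_RESPONSE
        return answer
    except Exception:
        # logger.exception writes the full traceback, which shows up in
        # the Lambda's CloudWatch logs and tells you what actually failed
        logger.exception("query_engine.query failed for input %r", query_input)
        return OUT_OF_DOMAIN_RESPONSE
```

With the traceback in CloudWatch you can tell whether the failure is in `query()` itself, in accessing `source_nodes[0]`, or somewhere else entirely.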

For this reason I am NEVER getting a response in the Chatbot Test except "I'm sorry, but I am only able to give responses regarding the source topic".

This despite the fact that the function actually did work and has an 'answer'?