canonicalmg / FAQ-Chat-Bot

A chatbot for answering FAQ questions. It uses cosine vector distance to estimate the similarity of the user's question to the FAQ corpus.
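For context, the matching step works roughly like the sketch below. It uses a TF-IDF vectorizer from scikit-learn purely for illustration; the repository's actual vectorization may differ.

```python
# Illustrative only: cosine-similarity matching with a TF-IDF vectorizer
# from scikit-learn; the repository's actual vectorization may differ.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

faq_questions = [
    "How do I reset my password?",
    "What are your opening hours?",
]
user_question = "I forgot my password, how can I change it?"

vectorizer = TfidfVectorizer()
vectors = vectorizer.fit_transform(faq_questions + [user_question])

# Compare the user's question (last row) against every FAQ question
scores = cosine_similarity(
    vectors[len(faq_questions)], vectors[: len(faq_questions)]
)[0]
best_index = scores.argmax()
print(faq_questions[best_index], scores[best_index])
```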

Integrate gpt_prompt function into FAQ chatbot code #10

Open repodex-app[bot] opened 1 year ago

repodex-app[bot] commented 1 year ago

In this pull request, we have made changes to the FAQ chatbot to improve the quality of its responses by integrating the gpt_prompt function. The purpose of these changes is to pass the chatbot's response through the GPT-3 model before sending it to the user, ensuring a more coherent and helpful response.

Here's a summary of the changes made:

  1. We have added the gpt_prompt function to the chatbot code. This function takes a query as input and uses the GPT-3 model to generate a response (a rough sketch follows after this list).

  2. In the answer_question method, we have modified the code to call the gpt_prompt function with the response before sending it to the user. This is done for both cases: when the answer's score is above the minimum threshold and when it's below the threshold.

  3. For the case when the answer's score is above the minimum threshold, we pass the answer through the gpt_prompt function and then print the best-fit question, the score, and the GPT-3-generated response.

  4. For the case when the answer's score is below the minimum threshold, we pass a default message through the gpt_prompt function and then print the GPT-3-generated response. Additionally, we trigger a corpus dump event.
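For reference, here is a minimal sketch of what this integration might look like. It is not the repository's actual code: it assumes the legacy OpenAI Python completions client, and the model name, the MIN_SCORE threshold, and the find_best_match / trigger_corpus_dump helpers are hypothetical placeholders.

```python
# Minimal sketch of the integration, not the repository's actual code.
# Assumes the legacy OpenAI Python client (openai.Completion); the model name,
# MIN_SCORE value, find_best_match, and trigger_corpus_dump are placeholders.
import openai

MIN_SCORE = 0.6  # placeholder similarity threshold


def gpt_prompt(query):
    """Send text through GPT-3 and return the generated response."""
    completion = openai.Completion.create(
        engine="text-davinci-003",  # assumed model name
        prompt=query,
        max_tokens=150,
    )
    return completion.choices[0].text.strip()


def answer_question(user_question, faq):
    # find_best_match is a placeholder for the chatbot's cosine-similarity
    # lookup; it is assumed to return the best-fit FAQ question, its answer,
    # and a similarity score.
    best_question, answer, score = find_best_match(user_question, faq)

    if score >= MIN_SCORE:
        # Above the threshold: polish the matched answer with GPT-3
        response = gpt_prompt(answer)
        print(f"Best-fit question: {best_question} (score: {score:.2f})")
        print(response)
    else:
        # Below the threshold: send a default message through GPT-3,
        # then fire the corpus dump event (hypothetical hook)
        response = gpt_prompt("Sorry, I couldn't find a good answer to that question.")
        print(response)
        trigger_corpus_dump(user_question)
```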

By making these changes, we aim to enhance the user experience by providing more accurate and helpful responses from the FAQ chatbot.