canonicalmg / FAQ-Chat-Bot

Chat bot which can be used to answer FAQ questions. Uses cosine vector distance to estimate similarity of the user's question to the FAQ corpus

Integrate gpt_prompt function and update response handling in FAQ chatbot #11

Open repodex-app[bot] opened 1 year ago

repodex-app[bot] commented 1 year ago

This pull request aims to enhance the FAQ chatbot by integrating OpenAI's gpt-3.5-turbo model to generate more accurate and relevant responses to user queries. The following changes have been made to the code:

  1. Added the gpt_prompt function: This function takes a query as input and calls the OpenAI Chat Completions API (gpt-3.5-turbo) to generate a response. It returns the generated text after stripping leading and trailing whitespace.

  2. Modified the allow_question function: The branch where the chatbot answers a user's question has been updated. If the chatbot does not have a pre-built response for the query, it now passes the closest FAQ answer returned by find_most_similar through gpt_prompt before sending it to the user, so that the reply is more accurate and contextually relevant.

The new code is as follows:

def gpt_prompt(self, query):
    # Imported here so the rest of the chatbot works without the openai package installed.
    import openai
    openai.api_key = settings.OPENAI_API_KEY

    # The Chat Completions API expects a list of role/content message dicts,
    # not a bare string.
    response = openai.ChatCompletion.create(
        model="gpt-3.5-turbo",
        messages=[{"role": "user", "content": query}],
        temperature=1,
        n=1,
        stop=None,
    )
    return response.choices[0].message.content.strip()

def allow_question(self):
    # If an event is pending, route the user's input to its handler instead.
    potential_event = None
    if self.event_stack:
        potential_event = self.event_stack.pop()
    if potential_event:
        text = input("Response: ")
        potential_event.handle_response(text, self)
    else:
        text = input("Question: ")
        # Prefer a pre-built answer; otherwise fall back to the closest FAQ match
        # and let gpt_prompt rephrase it before replying.
        answer = self.pre_built_responses_or_none(text)
        if not answer:
            answer = find_most_similar(text)
            answer = self.gpt_prompt(answer)
        self.answer_question(answer, text)
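
Note that find_most_similar is not part of this diff. A minimal sketch of what it might look like, assuming a hypothetical faq_corpus list of (question, answer) pairs and TF-IDF vectors compared with cosine similarity (the matching approach described in the repository summary), is:

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Illustrative only: the repository's actual find_most_similar is not shown in this PR.
# faq_corpus is a hypothetical list of (question, answer) pairs.
faq_corpus = [
    ("How do I reset my password?", "Use the 'Forgot password' link on the login page."),
    ("What are your support hours?", "Support is available 9am-5pm, Monday to Friday."),
]

def find_most_similar(text):
    questions = [question for question, _ in faq_corpus]
    vectorizer = TfidfVectorizer()
    vectors = vectorizer.fit_transform(questions + [text])
    # The last row is the user's question; score it against every FAQ question.
    scores = cosine_similarity(vectors[-1], vectors[:-1])[0]
    return faq_corpus[scores.argmax()][1]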

These changes are expected to improve the overall user experience and the quality of the chatbot's responses.
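
For an end-to-end check of the flow, a driver loop could look like the following; FAQBot is a hypothetical stand-in for the chatbot class that defines event_stack, pre_built_responses_or_none, answer_question, and the two methods above:

# Hypothetical usage sketch: FAQBot stands in for the real chatbot class,
# whose name is not shown in this diff.
if __name__ == "__main__":
    bot = FAQBot()
    while True:
        # Each iteration reads one question (or event response) and answers it.
        bot.allow_question()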