Azure-Samples / azure-search-openai-demo

A sample app for the Retrieval-Augmented Generation pattern running in Azure, using Azure AI Search for retrieval and Azure OpenAI large language models to power ChatGPT-style and Q&A experiences.

Need to print out question and answer and user to log analytics #1822

Open rebeccaleeasml opened 2 months ago

rebeccaleeasml commented 2 months ago

Management wants to monitor who uses the GPT app, what questions they ask, and what answers it returns. We originally added custom code to this open-source project to print that information out, but after upgrading to the latest open-source code about a month ago, that code no longer works. Currently I can still get the user and the query, but not the answer.

The code we added is in app/backend/app.py, as shown below.

def get_content_string_from_streamed_response(r: str) -> str:
    print("REBECCA DEBUG7 get_content_string_from_streamed_response r ", r, flush=True)
    logger = logging.getLogger(f'{CONFIG_LOGGER_NAME}.format_string')
    try:
        chunks = [json.loads(chunk) for chunk in r.replace('}\n{', '}^^^{').split('^^^')]

        for chunk in chunks:
            #    print("REBECCA chunk ", chunk)
            content = [chunk.get('choices')[0].get('delta').get('content') for chunk in chunks if chunk.get('choices')[0].get('delta').get('content') is not None]
            return "".join(content)
    except Exception as e:
        logger.exception("Exception while generating response string: %s", e)
        return json.dumps(error_dict(e))

@bp.route("/chat/stream", methods=["POST"]) @authenticated async def chat_stream(auth_claims: Dict[str, Any]): logger = logging.getLogger(f"{CONFIG_LOGGER_NAME}.chat") if not request.is_json: return jsonify({"error": "request must be json"}), 415 request_json = await request.get_json() print("REBECCA DEBUG1 request.headers", request.headers) context = request_json.get("context", {}) context["auth_claims"] = auth_claims auth_helper = current_app.config[CONFIG_AUTH_CLIENT] context['user'] = auth_helper.get_user_from_auth_header(request.headers) print("REBECCA DEBUG3 user ", context['user'], flush=True) try: use_gpt4v = context.get("overrides", {}).get("use_gpt4v", False) approach: Approach if use_gpt4v and CONFIG_CHAT_VISION_APPROACH in current_app.config: approach = cast(Approach, current_app.config[CONFIG_CHAT_VISION_APPROACH]) else: approach = cast(Approach, current_app.config[CONFIG_CHAT_APPROACH])

    result = await approach.run_stream(
        request_json["messages"],
        context=context,
        session_state=request_json.get("session_state"),
    )
    response = await make_response(format_as_ndjson(result))
    print("REBECCA DEBUG4.5 response ", response, flush=True)
    response.timeout = None  # type: ignore
    response.mimetype = "application/json-lines"
    print("REBECCA DEBUG4.6 response ", response, flush=True)
    #return response
    if isinstance(result, dict):
        logInfo = f"user={context['user']},query={request_json['messages'][-1].get('content')},answer={result.get('choices')[0].get('message').get('content')},context={context}"
        logger.log(CONFIG_APPLICATION_LOG_LEVEL,'/chat query and response: (%s)', logInfo)
        return jsonify(result)
    else:
        response = await make_response(format_as_ndjson(result))
        print("REBECCA DEBUG5 response ", response, flush=True)
        response.timeout = None  # type: ignore
        answer = get_content_string_from_streamed_response(await response.get_data(as_text=True))
        print("REBECCA DEBUG5.5 answer ", answer, flush=True)

        logInfo = f"user={context['user']},query={request_json['messages'][-1].get('content')},answer={answer},context={context}"
        logger.log(CONFIG_APPLICATION_LOG_LEVEL,'/chat query and response: (%s)', logInfo)
        return response
except Exception as error:
    return error_response(error, "/chat")

The error we receive is:

ERROR:application.format_string:Exception while generating response string: 'NoneType' object is not subscriptable
Traceback (most recent call last):
  File "...\app\backend\app.py", line 209, in get_content_string_from_streamed_response
    content = [chunk.get('choices')[0].get('delta').get('content') for chunk in chunks if chunk.get('choices')[0].get('delta').get('content') is not None]
  File "...\app\backend\app.py", line 209, in <listcomp>
    content = [chunk.get('choices')[0].get('delta').get('content') for chunk in chunks if chunk.get('choices')[0].get('delta').get('content') is not None]
TypeError: 'NoneType' object is not subscriptable

When I look at the REBECCA DEBUG7 debug output, I do not see any choices key in the chunks.
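
For reference, that is exactly where the TypeError comes from: chunk.get('choices') returns None for such chunks, and the [0] is then applied to None. A more defensive parser would simply skip chunks that have no choices key. The sketch below is illustrative only; the function name is hypothetical, and the idea that the answer text might instead sit under a top-level delta is an assumption, not something confirmed from this repo.

import json
import logging

def get_answer_from_ndjson(body: str) -> str:
    """Best-effort extraction of the answer text from an NDJSON response body.

    Sketch only: tolerates OpenAI-style chunks ({"choices": [{"delta": {...}}]})
    as well as chunks without a "choices" key (the alternative location of the
    content is assumed, not taken from this repo).
    """
    logger = logging.getLogger("application.format_string")
    parts = []
    for line in body.splitlines():
        line = line.strip()
        if not line:
            continue
        try:
            chunk = json.loads(line)
        except json.JSONDecodeError:
            logger.debug("Skipping non-JSON line in stream")
            continue
        choices = chunk.get("choices") or []
        delta = (choices[0].get("delta") if choices else chunk.get("delta")) or {}
        content = delta.get("content")
        if content:
            parts.append(content)
    return "".join(parts)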

Question: how can I get the answer to each user's query, here or anywhere else in the code, and write it to Log Analytics?

mattgotteiner commented 2 months ago

The streaming response is generated asynchronously, so I do not think your code will work as written. I'll take a look at how to do this.
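
One pattern that avoids reading the response body back is to wrap the async generator itself: accumulate the content as it streams to the client and write the log line once the generator is exhausted. The sketch below is only illustrative; log_after_stream and the chunk shapes it checks are assumptions, not this repo's API.

import logging
from typing import Any, AsyncGenerator, Dict

async def log_after_stream(
    result: AsyncGenerator[Dict[str, Any], None],
    logger: logging.Logger,
    user: str,
    query: str,
) -> AsyncGenerator[Dict[str, Any], None]:
    """Forward each streamed chunk unchanged, collect the answer, log at the end."""
    parts = []
    async for chunk in result:
        choices = chunk.get("choices") or []
        delta = (choices[0].get("delta") if choices else chunk.get("delta")) or {}
        content = delta.get("content")
        if content:
            parts.append(content)
        yield chunk  # the client still receives the original chunk
    # Runs only after the stream has finished, i.e. once the full answer exists.
    logger.info("user=%s, query=%s, answer=%s", user, query, "".join(parts))

In chat_stream this would wrap the generator before it is turned into a response, e.g. response = await make_response(format_as_ndjson(log_after_stream(result, logger, context['user'], request_json['messages'][-1]['content']))), so the stream is consumed only once and the log line is written after the last chunk has been sent.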

rebeccaleeasml commented 2 months ago

Thank you, Matt


mo-albaba commented 1 month ago

@mattgotteiner I'm also looking for a way to log users' queries and answers, which would be helpful for several reasons, especially in a production environment. Any pointers or suggestions on how to do this would be greatly appreciated.
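
Not a definitive answer, but one common pattern: if Application Insights is enabled for the app (the sample reads APPLICATIONINSIGHTS_CONNECTION_STRING), the Azure Monitor OpenTelemetry distro exports ordinary Python logging records to the backing Log Analytics workspace, where they appear in the traces/AppTraces tables. A minimal sketch, assuming the azure-monitor-opentelemetry package is installed; the logger name and message values here are illustrative:

import logging
import os

from azure.monitor.opentelemetry import configure_azure_monitor

# Assumption: an Application Insights resource backed by a Log Analytics
# workspace exists and its connection string is set in the environment.
if os.environ.get("APPLICATIONINSIGHTS_CONNECTION_STRING"):
    configure_azure_monitor()  # wires Python logging (plus traces/metrics) to Azure Monitor

logger = logging.getLogger("application.chat")
logger.setLevel(logging.INFO)

# One structured line per question/answer pair, mirroring the logInfo string above.
logger.info("user=%s, query=%s, answer=%s", "someone@example.com", "What does the PTO policy cover?", "...")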

DuboisABB commented 1 month ago

The https://github.com/microsoft/sample-app-aoai-chatGPT repo includes a chat history feature and also the user feedback feature requested in #489 (backed by Cosmos DB). It would be nice to integrate those features here.