microsoft / teams-ai

SDK focused on building AI based applications and extensions for Microsoft Teams and other Bot Framework channels
MIT License

[Bug]: Getting an error "The operation was canceled" from CompletePromptAsync method #1960

Open mukeshmrajput opened 2 weeks ago

mukeshmrajput commented 2 weeks ago

Language

C#

Version

latest

Description

We're calling the "CompletePromptAsync" method manually from the incoming activity handler in our code. The problem is that we have 9 static questions that we send sequentially, with a prompt, to the Azure OpenAI Chat Completions API through "CompletePromptAsync". The first 3 to 4 questions are answered, but all the remaining calls fail with the error message "The operation was canceled".

Reproduction Steps

1. Use the dotnet sample (dotnet\samples\08.datasource.azureaisearch\AzureAISearchBot).
2. Update Azure->OpenAIApiKey and Azure->OpenAIEndpoint in the appsettings.json file.
3. Change the model name to gpt-4o in the prompt config file and point data_sources to a vector database index (Azure Cognitive Search).
4. Program.cs: add the line below to register the incoming message handler.
   app.OnActivity(ActivityTypes.Message, activityHandlers.OnMessageActivityAsync);
5. In the incoming message handler, call the "CompletePromptAsync" method manually for the 9 static questions in a sequential loop, updating turnState.Temp!.Input to the current question on each iteration.
6. The prompt response from "CompletePromptAsync" is successful for the first 3 to 4 questions; all remaining questions return an "Error" prompt response with "The operation was canceled".
7. We need a suggestion: why is this error being thrown, and what is the fix for it?
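For reference, a minimal sketch of the loop described in the steps above. This is not our exact code: the handler signature follows the sample, but the question strings, the prompt name "chat", and the `_planner`/`_prompts` fields are illustrative assumptions; the real wiring follows the 08.datasource.azureaisearch sample.

```csharp
// Sketch (assumed names): sequentially run 9 static questions through
// ActionPlanner.CompletePromptAsync, updating turnState.Temp!.Input each time.
public async Task OnMessageActivityAsync(ITurnContext turnContext, TurnState turnState, CancellationToken cancellationToken)
{
    string[] questions = { "Question 1", "Question 2", /* ... */ "Question 9" };

    foreach (string question in questions)
    {
        // Point the prompt input at the current question before each call.
        turnState.Temp!.Input = question;

        PromptResponse response = await _planner.CompletePromptAsync(
            turnContext, turnState, _prompts.GetPrompt("chat"), null, cancellationToken);

        if (response.Status != PromptResponseStatus.Success)
        {
            // After 3-4 successful calls, this path reports "The operation was canceled".
            await turnContext.SendActivityAsync(
                $"Error: {response.Error?.Message}", cancellationToken: cancellationToken);
            continue;
        }

        await turnContext.SendActivityAsync(
            response.Message?.Content ?? string.Empty, cancellationToken: cancellationToken);
    }
}
```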
PjPraveenkumar commented 2 weeks ago

Hi @mukeshmrajput thanks for reaching out to us. I'll validate and get back to you.

PjPraveenkumar commented 2 weeks ago

Hello @mukeshmrajput, could you please share the code snippet where you call the "CompletePromptAsync" method manually for the 9 static questions in a sequential loop while updating turnState.Temp, so we can validate it on our end? Additionally, please provide the AI module's console logging. When creating the app, your aiOptions should include a logger factory: https://github.com/microsoft/teams-ai/blob/main/dotnet/packages/Microsoft.TeamsAI/Microsoft.TeamsAI/AI/Models/OpenAIModel.cs#L49
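A sketch of passing a logger factory when constructing the model, per the constructor linked above. The config keys and `LogRequests` flag are assumptions based on the Azure AI Search sample; adjust to your actual setup.

```csharp
// Sketch (assumed wiring): attach an ILoggerFactory so the model's
// request/response activity is written to the console log.
ILoggerFactory loggerFactory = LoggerFactory.Create(builder => builder.AddConsole());

OpenAIModel model = new(
    new AzureOpenAIModelOptions(
        config.Azure.OpenAIApiKey,   // Azure->OpenAIApiKey from appsettings.json
        "gpt-4o",
        config.Azure.OpenAIEndpoint) // Azure->OpenAIEndpoint from appsettings.json
    {
        LogRequests = true
    },
    loggerFactory);
```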

mukeshmrajput commented 2 weeks ago

Thanks @PjPraveenkumar for your support.

We're using the dotnet sample (dotnet\samples\08.datasource.azureaisearch\AzureAISearchBot) with our own vector DataSource for RAG. You can paste the attached files into this sample project, configure the Azure OpenAI endpoint and key in appsettings.json, and try to reproduce the error.

Regarding the console log, we're working on capturing it and will share it here; in the meantime, you can try with the attached files.

Thanks, Mukesh DotNetFiles.zip

mukeshmrajput commented 1 week ago

Hi @PjPraveenkumar

Attached is the console logging of the AI modules (consol_log.txt) for your reference.

mukeshmrajput commented 1 week ago

Hi @PjPraveenkumar

Attached is another log. If you look at the ChatCompletion API calls at lines 642, 692, and 732, they never return a response. The strange part is that in between those lines, line 688 shows the connection sending FIN; that could be the reason the calls at the lines mentioned above never receive a response.

console_log_4Sept.txt

mukeshmrajput commented 1 week ago

Hi @PjPraveenkumar

I haven't heard back from you on this. Are you actively looking into it?

Thanks, Mukesh

PjPraveenkumar commented 2 days ago

Hi @mukeshmrajput, apologies for the delay. I've been unable to validate due to an access issue, which has prevented me from investigating further. I'll provide an update as soon as possible.