Integration-Automation / ReEdgeGPT

Microsoft's Bing Chat AI
MIT License
163 stars 27 forks source link

ERROR! InternalError: ServiceClient failure for ChatGpt. Cancellation not requested #126

Closed JayGarland closed 8 months ago

JayGarland commented 8 months ago

Always this error happens as soon as the bot searches internet.

prerequisite: I use a precontext as the webpage to give the bot a persona and chat with the bot.

InternalError: ServiceClient failure for ChatGpt. Cancellation not requested
Full exception: {'type': 2, 'invocationId': '3', 'item': {'firstNewMessageIndex': None, 'defaultChatName': None, 'conversationId': '51D|BingProd|B0B5A83E9645E4161B9AC6E71E477F0D1C4B53C2235AA5914D38B8298D876C6B', 'requestId': 'ca268465-a711-44a5-8d04-e0ff7d5a88aa', 'telemetry': {'startTime': '2024-03-01T13:50:22.9163858Z'}, 'result': {'value': 'InternalError', 'message': 'ServiceClient failure for ChatGpt. Cancellation not requested', 'error': 'ServiceClient failure for ChatGpt. Cancellation not requested\n ---> Failed to call "ChatGpt" at "https://deucalionv46-prod.northcentralus.inference.ml.azure.com/v1/engines/model/completions". HttpCode=NotFound - ResponseCode=Fail - LoadBalancerResponseCode=Success - AMLModelErrorStatusCode=-1 - ReasonPhrase=Not Found - AMLModelErrorReason= - ErrorMessage=No valid deployments to route to. Please check that the endpoint has at least one deployment with positive weight values or use a deployment specific header to route. [https://docs.microsoft.com/en-us/azure/machine-learning/how-to-safely-rollout-managed-endpoints#deploy-a-new-model-but-send-it-no-traffic-yet]\nPlease check this guide to understand why this error code might have been returned \nhttps://docs.microsoft.com/en-us/azure/machine-learning/how-to-troubleshoot-online-endpoints#http-status-codes\n.', 'exception': 'Microsoft.TuringBot.Common.ServiceClientException: ServiceClient failure for ChatGpt. Cancellation not requested\r\n ---> System.Net.Http.HttpRequestException: Failed to call "ChatGpt" at "https://deucalionv46-prod.northcentralus.inference.ml.azure.com/v1/engines/model/completions". HttpCode=NotFound - ResponseCode=Fail - LoadBalancerResponseCode=Success - AMLModelErrorStatusCode=-1 - ReasonPhrase=Not Found - AMLModelErrorReason= - ErrorMessage=No valid deployments to route to. Please check that the endpoint has at least one deployment with positive weight values or use a deployment specific header to route. [https://docs.microsoft.com/en-us/azure/machine-learning/how-to-safely-rollout-managed-endpoints#deploy-a-new-model-but-send-it-no-traffic-yet]\nPlease check this guide to understand why this error code might have been returned \nhttps://docs.microsoft.com/en-us/azure/machine-learning/how-to-troubleshoot-online-endpoints#http-status-codes\n.\r\n   --- End of inner exception stack trace ---\r\n   at BotClientLibrary.ServiceClients.ServiceClient.SendPapyrusRequest(String url, HttpRequestMessage request, TelemetryScope scope, MetricsCollection metrics, CancellationToken cancellationToken, String serviceName, ServiceClientOptions options) in D:\\a\\_work\\1\\s\\services\\TuringBot\\src\\BotClientLibrary\\ServiceClients\\ServiceClient.cs:line 816\r\n   at BotClientLibrary.ServiceClients.ServiceClient.Run(Conversation conversation, Message message, CancellationToken cancellationToken, BatchRequest batchRequest, ServiceClientOptions options) 
in D:\\a\\_work\\1\\s\\services\\TuringBot\\src\\BotClientLibrary\\ServiceClients\\ServiceClient.cs:line 359\r\n   at BotClientLibrary.ServiceClients.ContentProviders.LLM.DeepLeoStreamer.Stream(String modelOutputPrefix, Conversation deepLeoConversation, Message message, DeepLeoConfiguration modelConfig, CancellationToken cancellationToken)+MoveNext() in D:\\a\\_work\\1\\s\\services\\TuringBot\\src\\BotClientLibrary\\ServiceClients\\ContentProviders\\LLM\\DeepLeoStreamingClient.cs:line 309\r\n   at BotClientLibrary.ServiceClients.ContentProviders.LLM.DeepLeoStreamer.Stream(String modelOutputPrefix, Conversation deepLeoConversation, Message message, DeepLeoConfiguration modelConfig, CancellationToken cancellationToken)+System.Threading.Tasks.Sources.IValueTaskSource<System.Boolean>.GetResult()\r\n   at BotClientLibrary.Extensions.DeepLeoOrchestrator.GetTokens(Boolean isStreamingEnabled, Conversation deepLeoConversation, Message message, String modelOutputPrefix, DeepLeoConfiguration modelConfig, String fluxState, CancellationToken cancellationToken)+MoveNext() in D:\\a\\_work\\1\\s\\services\\TuringBot\\src\\BotClientLibrary\\Extensions\\DeepLeoOrchestrator.cs:line 1456\r\n   at BotClientLibrary.Extensions.DeepLeoOrchestrator.GetTokens(Boolean isStreamingEnabled, Conversation deepLeoConversation, Message message, String modelOutputPrefix, DeepLeoConfiguration modelConfig, String fluxState, CancellationToken cancellationToken)+MoveNext() in D:\\a\\_work\\1\\s\\services\\TuringBot\\src\\BotClientLibrary\\Extensions\\DeepLeoOrchestrator.cs:line 1456\r\n   at BotClientLibrary.Extensions.DeepLeoOrchestrator.GetTokens(Boolean isStreamingEnabled, Conversation deepLeoConversation, Message message, String modelOutputPrefix, DeepLeoConfiguration 
modelConfig, String fluxState, CancellationToken cancellationToken)+System.Threading.Tasks.Sources.IValueTaskSource<System.Boolean>.GetResult()\r\n   at BotClientLibrary.Extensions.DeepLeoOrchestrator.RunDeepLeoEngine(ExtensionRequest request, ExtensionResponse result, Conversation conversation, TuringBotConfiguration config, Message message, DeepLeoEngineState engineState, ModelPreClassification modelPreClassification) in D:\\a\\_work\\1\\s\\services\\TuringBot\\src\\BotClientLibrary\\Extensions\\DeepLeoOrchestrator.cs:line 634\r\n   at BotClientLibrary.Extensions.DeepLeoOrchestrator.RunDeepLeoEngine(ExtensionRequest request, ExtensionResponse result, Conversation conversation, TuringBotConfiguration config, Message message, DeepLeoEngineState engineState, ModelPreClassification modelPreClassification) in D:\\a\\_work\\1\\s\\services\\TuringBot\\src\\BotClientLibrary\\Extensions\\DeepLeoOrchestrator.cs:line 634\r\n   at BotClientLibrary.Extensions.DeepLeoOrchestrator.Run(ExtensionRequest request, CancellationToken cancellationToken) in D:\\a\\_work\\1\\s\\services\\TuringBot\\src\\BotClientLibrary\\Extensions\\DeepLeoOrchestrator.cs:line 183\r\n   at BotClientLibrary.Extensions.DeepLeoOrchestrator.Run(ExtensionRequest request, CancellationToken cancellationToken) in D:\\a\\_work\\1\\s\\services\\TuringBot\\src\\BotClientLibrary\\Extensions\\DeepLeoOrchestrator.cs:line 210\r\n   at BotClientLibrary.Extensions.ExtensionRunner.RunExtension(ExtensionRequest request, Conversation conversation, ExtensionConfig extension, ExtensionRequestOptions customOptions, ParsedToolInvocation action, String modelName, CancellationToken cancellationToken) in D:\\a\\_work\\1\\s\\services\\TuringBot\\src\\BotClientLibrary\\Extensions\\ExtensionRunner.cs:line 642\r\n   at BotClientLibrary.Extensions.ExtensionRunner.RunExtensions(Conversation conversation, CancellationToken cancellationToken, ComponentPriority minPriority, ComponentPriority maxPriority, ExtensionRequestOptions customOptions, ExtensionRequest request, ParsedToolInvocation action, String modelName, Classification modelClassification, Boolean skipApiCheck) in D:\\a\\_work\\1\\s\\services\\TuringBot\\src\\BotClientLibrary\\Extensions\\ExtensionRunner.cs:line 272\r\n   at BotClientLibrary.BotConnection.GetContentResponsesAsync(Conversation conversation, CancellationToken cancellationToken) in D:\\a\\_work\\1\\s\\services\\TuringBot\\src\\BotClientLibrary\\BotConnection.cs:line 466\r\n   at BotClientLibrary.BotConnection.InternalRun(Conversation conversation, CancellationToken cancellationToken) in D:\\a\\_work\\1\\s\\services\\TuringBot\\src\\BotClientLibrary\\BotConnection.cs:line 518\r\n   at BotClientLibrary.BotConnection.ExecuteBotTurn(Conversation conversation, CancellationToken cancellationToken) in D:\\a\\_work\\1\\s\\services\\TuringBot\\src\\BotClientLibrary\\BotConnection.cs:line 209\r\n   at BotClientLibrary.BotConnection.ExecuteBotTurn(Conversation conversation, CancellationToken cancellationToken) in D:\\a\\_work\\1\\s\\services\\TuringBot\\src\\BotClientLibrary\\BotConnection.cs:line 170\r\n   at BotClientLibrary.BotConnection.Run(CancellationToken cancellationToken) in D:\\a\\_work\\1\\s\\services\\TuringBot\\src\\BotClientLibrary\\BotConnection.cs:line 80\r\n   
at Microsoft.Falcon.TuringBot.ChatApiImplementation.Run(BaseRequest request, BaseResponse response, CancellationToken cancellationToken) in D:\\a\\_work\\1\\s\\services\\TuringBot\\src\\Service\\Implementation\\ApiImplementation\\ChatApiImplementation.cs:line 71\r\n   at Microsoft.Falcon.TuringBot.RequestProcessor.Run(BaseRequest request, BaseResponse response, IRequestContextInitializer contextInitializer, IRequestValidator requestValidator, IApiImplementation apiImplementation, IAsyncApiEndStep apiEndStep, String apiName, CancellationToken cancellationToken, Func`1 cancellationTokenProvider) in D:\\a\\_work\\1\\s\\services\\TuringBot\\src\\Service\\Implementation\\RequestProcessor.cs:line 181', 'serviceVersion': '20240229.252'}}}), receiver=@3e7c1a615c99084a53b1554e6b36fea45f913e23eece1c1151fa65cb6180bd59
image image
JayGarland commented 8 months ago

BTW I added some info related to the 成均館 in prompt of the bot's persona, so would it possibly cause the error?

JE-Chen commented 8 months ago

Maybe you need to set a proxy in China.

JE-Chen commented 8 months ago

That exception is Microsoft service exception.

JayGarland commented 8 months ago

Maybe you need to set a proxy in China.

thanks, my ip is clean.

JayGarland commented 8 months ago

That exception is Microsoft service exception.

I know, but it is actually an exception and caused no response, while I used another project which uses a similar logic to chat with the sydney, it won't get the error, I used the same context and and question ask the bot.

JE-Chen commented 8 months ago

If I use your prompt in my environment, it runs successfully. Can you try using code to ask the bot, not the CLI? I haven’t checked the CLI module for more than 5 months.

JayGarland commented 8 months ago

If I use your prompt in my environment, it runs successfully. Can you try using code to ask the bot, not the CLI? I haven’t checked the CLI module for more than 5 months.

Sorry I am not clear which code you asked to use? I currently just saw CLI and UI two ways to run the bot.

JE-Chen commented 8 months ago

Example:

import asyncio
import json
from pathlib import Path

from re_edge_gpt import Chatbot
from re_edge_gpt import ConversationStyle

# If you are using jupyter pls install this package
# from nest_asyncio import apply

async def test_ask() -> None:
    bot = None
    try:
        mode = "Bing"
        if mode == "Bing":
            cookies: list[dict] = json.loads(open(
                str(Path(str(Path.cwd()) + "/bing_cookies.json")), encoding="utf-8").read())
        else:
            cookies: list[dict] = json.loads(open(
                str(Path(str(Path.cwd()) + "/copilot_cookies.json")), encoding="utf-8").read())
        bot = await Chatbot.create(cookies=cookies, mode=mode)
        response = await bot.ask(
            prompt="成均馆大学好玩吗?",
            conversation_style=ConversationStyle.balanced,
            simplify_response=True,
            search_result=True
        )
        # If you are using non ascii char you need set ensure_ascii=False
        print(json.dumps(response, indent=2, ensure_ascii=False))
        # Raw response
        # print(response)
        assert response
    except Exception as error:
        raise error
    finally:
        if bot is not None:
            await bot.close()

if __name__ == "__main__":
    # If you are using jupyter pls use nest_asyncio apply()
    # apply()
    try:
        loop = asyncio.get_running_loop()
    except RuntimeError:
        loop = asyncio.get_event_loop()
    loop.run_until_complete(test_ask())
JayGarland commented 8 months ago

thanks your review, today I tested and using the same context, tried to reproduce that exception, but seems that error disappear