Closed: nisarkhan916 closed this issue 2 years ago.
Label prediction was below confidence level 0.6 for Model:ServiceLabels: 'Azure.Identity:0.26039332,Storage:0.078406036,Search:0.057244003'
Facing the same issue. Any updates?
Thank you for your feedback. Tagging and routing to the team member best able to assist.
There were quite a few changes to the models and generated code. Please upgrade to our beta 3 package we just released earlier this week and see if the problem still reproduces.
@heaths - I'm using the beta 3 library and still facing the same issue. I'm not able to test the deployed model through the language studio as well. I'm facing this while analysing the conversation via Orchestrator Workflow.
I'm using this package - Azure.AI.Language.Conversations 1.0.0-beta.3
Thanks for the feedback! We are routing this to the appropriate team for follow-up. cc @cahann, @kayousef.
| Author | nisarkhan916 |
|---|---|
| Assignees | heaths |
| Labels | `Service Attention`, `Client`, `needs-author-feedback`, `customer-reported`, `question`, `Cognitive - LUIS` |
| Milestone | - |
Adding the service people since you're getting the same issue in Language Studio. Would it be acceptable to share an example query? Given it's a `BadRequest`, this may be helpful.
/cc @ChongTang
Not sure what you mean by the query? OP @nisarkhan916 has already added screenshots from Language Studio.
When I tried to inspect the network call from Language Studio, it looks like it is also trying to call the same Rest API and getting the same error we get through the SDK/Rest API.
Adding the REST API call for your reference.

Request:

```shell
curl --location --request POST 'https://westeurope.api.cognitive.microsoft.com/language/:analyze-conversations?api-version=2021-11-01-preview&projectName=xx&deploymentName=xx' \
--header 'Ocp-Apim-Subscription-Key: xx' \
--header 'Content-Type: application/json' \
--data-raw '{
    "query": "hello"
}'
```
Response:

```json
{
  "error": {
    "code": "BadRequest",
    "message": "One or more errors occurred. Corresponding activity ID: xx-xx-xx-xx-xx.",
    "innerError": {
      "message": "Failed to call the Question_Answering project xx and deployment xx.."
    }
  }
}
```
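For anyone who wants to surface this error programmatically while debugging, here is a minimal sketch (Python, standard library only; the field names come from the error body above, and the helper name is hypothetical) that extracts the code, message, and inner message from such a response:

```python
import json

def parse_language_error(body: str) -> dict:
    """Pull code, message, and inner message out of a Language service error body."""
    error = json.loads(body).get("error", {})
    return {
        "code": error.get("code"),
        "message": error.get("message"),
        "inner_message": error.get("innerError", {}).get("message"),
    }

# The response body quoted above, with IDs elided as in the thread
body = """{
  "error": {
    "code": "BadRequest",
    "message": "One or more errors occurred. Corresponding activity ID: xx-xx-xx-xx-xx.",
    "innerError": {
      "message": "Failed to call the Question_Answering project xx and deployment xx.."
    }
  }
}"""

parsed = parse_language_error(body)
print(parsed["code"], "-", parsed["inner_message"])
```

The inner message is the useful part here: it names the downstream Question Answering project the orchestrator failed to reach.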
There is an issue with the back-end service. The fix will be deployed next week; I'll update this thread.
Hello @nisarkhan916 . This has been fixed already. Can you try again?
Thanks @ChongTang, it's working now.
Library name and version
Azure.AI.Language.Conversations v1.0.0-beta.2 (pre-release)
Describe the bug
After creating an Orchestration workflow in Language Studio, we get the exception below while fetching results from the workflow. Specifically, whenever we try to fetch results from the Custom question answering (CQA) target of the Orchestrator, we get `BadRequest: One or more errors occurred. Corresponding activity ID: d72d64e8-cdf0-4ad3-8d22-d0e3b11ce883.`
Sometimes we also hit the same issue when fetching results from the Conversational language understanding (CLU) target.
Note: screenshots of the exceptions are attached.
Thanks in advance.
Expected behavior
As mentioned in Language studio documentation, we should get results from orchestration workflow containing CLU intent/QnA question along with scores.
Actual behavior
Specifically, whenever we try to fetch results from the Custom question answering (CQA) target of the Orchestrator, we get `BadRequest: One or more errors occurred. Corresponding activity ID: d72d64e8-cdf0-4ad3-8d22-d0e3b11ce883.`
Reproduction Steps
```csharp
AzureKeyCredential credential = new AzureKeyCredential("{LanguageServiceSubscriptionKey}");
Uri endpoint = new Uri("https://{Endpoint}.cognitiveservices.azure.com");
ConversationAnalysisClient client = new ConversationAnalysisClient(endpoint, credential);
ConversationsProject orchestrationProject = new ConversationsProject("OrchestrationTestProject", "OrchestratornewModel");
Response<AnalyzeConversationResult> response = await client.AnalyzeConversationAsync("{Query}", orchestrationProject);

// Getting the predicted response
OrchestratorPrediction orchestratorPrediction = response.Value.Prediction as OrchestratorPrediction;
```
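The SDK call above hits the same REST route as the curl command earlier in the thread. As a sanity check, that request URL can be rebuilt with a small helper (Python, standard library only; the function name is hypothetical, and the region/project/deployment values are just the ones mentioned in this issue):

```python
from urllib.parse import urlencode

def build_analyze_url(region: str, project: str, deployment: str,
                      api_version: str = "2021-11-01-preview") -> str:
    """Build the :analyze-conversations request URL shown earlier in this thread."""
    base = f"https://{region}.api.cognitive.microsoft.com/language/:analyze-conversations"
    params = urlencode({
        "api-version": api_version,
        "projectName": project,
        "deploymentName": deployment,
    })
    return f"{base}?{params}"

url = build_analyze_url("westeurope", "OrchestrationTestProject", "OrchestratornewModel")
print(url)
```

Comparing this URL against the one Language Studio sends (visible in the browser's network tab) is how the thread confirmed both paths hit the same failing back end.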
Environment
VS 2019, .NET Core 3.1