MicrosoftDocs / azure-docs

Open source documentation of Microsoft Azure
https://docs.microsoft.com/azure
Creative Commons Attribution 4.0 International

When running example, I get corrupted responses #113770

Closed GenesisMRainer closed 1 year ago

GenesisMRainer commented 1 year ago

When running the example, I get extra information and formatting in the result. It successfully connects and gets a result. I am using C#/.NET 7 in a console app. Examples are below.

Example Response1:

Input: When was Microsoft founded?
Chatbot: ";
  c[2].optionA = "April 4, 1975";
  c[2].optionB = "July 24, 1983";
  c[2].optionC = "January 3, 1976";
  c[2].optionD = "July 29, 1977";
  c[2].correctoption = "optionA";

  c[3].question = "What is the full form of PDF?";
  c

Example Response2:

Input: When was Microsoft founded?
Chatbot: '),
                     ('Microsoft', 'The Microsoft Corporation was established in 1975', 'When was Microsoft founded ?')],
            "Testcase 7.3": [("Who is CEO of Google?"), ('Google', "Sundar Pichai is the CEO of Google", "Who is CEO of Google")],
            "Testcase 7.4": [("Whats the time right now?"),
                              (None, "Sorry, I am not able to identify the date", "Whats

Example Response3:

Input: When was Microsoft founded?
Chatbot: ","_labels":"ORGANIZATION","_output":"Microsoft"},{"_input":"What is the headquarters of IBM?","_labels":"ORGANIZATION","_output":"Armonk, New York, United States"},{"_input":"What is the name of CEO of Microsoft?","_labels":"PERSON","_output":"Satya Nadella"},{"_input":"What is the name of the CEO of Apple?","_labels":"PERSON","_output":"Tim Cook"}]'
print(ner(text

My code:

using Azure;
using Azure.AI.OpenAI;

const string key = "redacted";
const string endpoint = "redacted";

// Enter the deployment name you chose when you deployed the model.
const string engine = "redacted";

OpenAIClient client = new(new Uri(endpoint), new AzureKeyCredential(key));

string prompt = "When was Microsoft founded?";
Console.Write($"Input: {prompt}\n");

Response<Completions> completionsResponse =
    await client.GetCompletionsAsync(engine, prompt);
string completion = completionsResponse.Value.Choices[0].Text;
Console.WriteLine($"Chatbot: {completion}");


RamanathanChinnappan-MSFT commented 1 year ago

@GenesisMRainer Thanks for your feedback! We will investigate and update as appropriate.

RamanathanChinnappan-MSFT commented 1 year ago

@GenesisMRainer It seems like the response you are getting is not in the expected format. The response should only contain the text generated by the OpenAI model.

In your code, you are using the GetCompletionsAsync method to get the response from the OpenAI model. This method returns a Response&lt;Completions&gt; object, which contains the generated text in the Text property of each Choice object.

To fix the issue, you can modify your code to print only the generated text by accessing the Text property of the first Choice object in the Choices array. Here is an example:

string prompt = "When was Microsoft founded?";
Console.Write($"Input: {prompt}\n");

Response&lt;Completions&gt; completionsResponse = await client.GetCompletionsAsync(engine, prompt);
string completion = completionsResponse.Value.Choices[0].Text;

Console.WriteLine($"Chatbot: {completion}");

This should print only the generated text in the console. If you are still getting extra information and formatting in the response, you may need to check the configuration of your OpenAI model. For an issue like this, I'd recommend creating a support ticket, since the support team will be able to respond much more quickly and have a conversation with you to figure out what could be going on.

GenesisMRainer commented 1 year ago

I am already doing what you suggested as seen in the code I provided earlier. I will open a support ticket.

Thank you for your assistance.

mrbullwinkle commented 1 year ago

@GenesisMRainer what model/model version were you testing the code with?

To replicate the behavior in the quickstart with the completions endpoint, we recommend using text-davinci-003. I tested again just now to confirm, and I get the following as expected. (In my code below, the deployment name happens to match the underlying model name.)

[screenshot: expected output]

However, I think I am able to repro your issue if I test with the gpt-35-turbo (0301) model, in which case I can get output like the following:

[screenshot: corrupted output]

gpt-35-turbo (0301) was an early release that can work with both the completions and chat completions endpoints. However, when used with the completions endpoint, it is specifically trained to expect input in the ChatML format. Using gpt-35-turbo (0301) with the traditional prompt style will yield unexpected, low-quality results. While it will work with the completions endpoint plus ChatML, this is no longer recommended, as a newer and better API is available.
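For reference, a ChatML-formatted prompt for the legacy completions endpoint looks roughly like the sketch below (the system and user text here are just illustrative examples):

```text
<|im_start|>system
You are a helpful assistant.
<|im_end|>
<|im_start|>user
When was Microsoft founded?
<|im_end|>
<|im_start|>assistant
```

The model then generates the assistant's reply after the final `<|im_start|>assistant` token, which is why a plain single-line prompt produces the corrupted-looking output shown above.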

If you want to use a gpt-35-turbo or gpt-4 model, we recommend using the newer Chat Completions API.
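As a rough sketch (not an official snippet from this thread), the quickstart code above could be adapted to the Chat Completions API in the same Azure.AI.OpenAI beta SDK along these lines; the key, endpoint, and deployment name are placeholders you would fill in:

```csharp
using Azure;
using Azure.AI.OpenAI;

const string key = "redacted";
const string endpoint = "redacted";

// Deployment name of a gpt-35-turbo or gpt-4 model.
const string engine = "redacted";

OpenAIClient client = new(new Uri(endpoint), new AzureKeyCredential(key));

// Chat Completions takes structured messages instead of a raw prompt string.
var chatOptions = new ChatCompletionsOptions()
{
    Messages =
    {
        new ChatMessage(ChatRole.System, "You are a helpful assistant."),
        new ChatMessage(ChatRole.User, "When was Microsoft founded?"),
    }
};

Response<ChatCompletions> chatResponse =
    await client.GetChatCompletionsAsync(engine, chatOptions);

// The reply comes back as a message, not as free-form completion text.
Console.WriteLine($"Chatbot: {chatResponse.Value.Choices[0].Message.Content}");
```

Because the endpoint handles the chat formatting itself, there is no need to construct ChatML by hand, and the garbled continuations shown in the examples above should not occur.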

RamanathanChinnappan-MSFT commented 1 year ago

@mrbullwinkle

Thanks for your response.

RamanathanChinnappan-MSFT commented 1 year ago

@GenesisMRainer We are going to close this thread. If there are any further questions regarding the documentation, please tag me in your reply and we will be happy to continue the conversation.