Closed GenesisMRainer closed 1 year ago
@GenesisMRainer Thanks for your feedback! We will investigate and update as appropriate.
@GenesisMRainer It seems like the response you are getting is not in the expected format. The response should only contain the text generated by the OpenAI model.
In your code, you are using the GetCompletionsAsync method to get the response from the OpenAI model. This method returns a Response<Completions> object that contains the generated text along with additional metadata about the call.
To fix the issue, you can modify your code to only print the generated text by accessing the Text property of the first Choice object in the Choices array. Here is an example:
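The sketch below is based on the Azure OpenAI C# quickstart with the Azure.AI.OpenAI prerelease SDK. The endpoint, key, and deploymentName values are placeholders for your own resource settings, and exact method signatures can differ between prerelease versions, so treat this as illustrative rather than definitive:

```csharp
using Azure;
using Azure.AI.OpenAI;

// Placeholders: replace with your own Azure OpenAI resource values.
string endpoint = "https://YOUR-RESOURCE-NAME.openai.azure.com/";
string key = "YOUR-API-KEY";
string deploymentName = "text-davinci-003";

OpenAIClient client = new(new Uri(endpoint), new AzureKeyCredential(key));

string prompt = "When was Microsoft founded?";
Console.Write($"Input: {prompt}\n");

// GetCompletionsAsync returns Response<Completions>; the generated text
// is in the Text property of the first Choice.
Response<Completions> completionsResponse =
    await client.GetCompletionsAsync(deploymentName, prompt);
string completion = completionsResponse.Value.Choices[0].Text;

// Print only the generated text rather than the whole response object.
Console.WriteLine($"Chatbot: {completion}");
```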
This should only print the generated text in the console. If you are still getting extra information and formatting in the response, you may need to check the configuration of your OpenAI model. For an issue like this, I'd recommend you create a support ticket, since the support team will be able to respond much more quickly and have a conversation with you to figure out what could be going on.
I am already doing what you suggested, as seen in the code I provided earlier. I will open a support ticket.
Thank you for your assistance.
@GenesisMRainer what model/model version were you testing the code with?
To replicate the behavior in the quickstart with the completions endpoint, we recommend using text-davinci-003. I tested again just now to confirm, and I get the following as expected (in the case of my code below, my deployment name happens to match the underlying model name):
However, I think I am able to repro your issue if I test with the gpt-35-turbo (0301) model, in which case I can get output like the following:
gpt-35-turbo (0301) was an early release that can work with both the completions and chat completions endpoints. However, when used with the completions endpoint, it is specifically trained to expect input in the ChatML format. Using gpt-35-turbo (0301) with the traditional prompt style will yield unexpected and low-quality results. While it will work with the completions endpoint plus ChatML, this is no longer recommended, as a newer and better API is available.
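For illustration, a ChatML-style prompt for the completions endpoint looks roughly like the following. This is an approximation of the format documented for gpt-35-turbo (0301); check the Azure OpenAI documentation for the exact tokens and stop sequences:

```
<|im_start|>system
You are a helpful assistant.
<|im_end|>
<|im_start|>user
When was Microsoft founded?
<|im_end|>
<|im_start|>assistant
```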
If you want to use a gpt-35-turbo or gpt-4 model, we recommend using the newer Chat Completions API.
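A hedged sketch of that call with the Azure.AI.OpenAI prerelease SDK follows. It assumes the same OpenAIClient as above; chatDeploymentName is a placeholder for a gpt-35-turbo or gpt-4 deployment, and type names have shifted between prerelease versions, so adjust to the version you have installed:

```csharp
// Sketch only: assumes the same OpenAIClient as above and a chat-capable deployment.
string chatDeploymentName = "gpt-35-turbo"; // placeholder deployment name

var chatCompletionsOptions = new ChatCompletionsOptions()
{
    Messages =
    {
        new ChatMessage(ChatRole.System, "You are a helpful assistant."),
        new ChatMessage(ChatRole.User, "When was Microsoft founded?"),
    },
};

Response<ChatCompletions> chatResponse =
    await client.GetChatCompletionsAsync(chatDeploymentName, chatCompletionsOptions);

// The generated reply is in Message.Content of the first choice.
Console.WriteLine($"Chatbot: {chatResponse.Value.Choices[0].Message.Content}");
```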
@mrbullwinkle
Thanks for your response.
@GenesisMRainer We are going to close this thread. If there are any further questions regarding the documentation, please tag me in your reply and we will be happy to continue the conversation.
When running the example, I get extra information and formatting in the result. It successfully connects and gets a result. I am using C#/.NET 7/Console. Examples are below...
Example Response 1:
Example Response 2:
Example Response 3:
My code: