betalgo / openai

OpenAI .NET SDK - Azure OpenAI, ChatGPT, Whisper, and DALL-E
https://betalgo.github.io/openai/
MIT License
2.88k stars 516 forks

Question: get started with AzureOpenAI #181

Closed WeihanLi closed 1 year ago

WeihanLi commented 1 year ago

Many thanks for the project. I'm trying to use it with the Azure OpenAI service, but I don't know how to find the DeploymentId. Could you please help? Thanks.

WeihanLi commented 1 year ago

I'm using the following code for Azure OpenAI:

var options = new OpenAiOptions()
{
    BaseDomain = "https://spark.openai.azure.com",
    ApiKey = apiKey,
    DeploymentId = "gpt35",
    ResourceName = "spark",
    ProviderType = ProviderType.Azure,
};
var openAiService = new OpenAIService(options);
var completionResult = await openAiService.ChatCompletion.CreateCompletion(new ChatCompletionCreateRequest
{
    Messages = new List<ChatMessage>
    {
        ChatMessage.FromSystem("You are a helpful assistant."),
        ChatMessage.FromUser("What's new in .NET 6")
    },
    Model = Models.ChatGpt3_5Turbo // "gpt-35-turbo" also not working
});

completionResult contains the following error:

{"Code":"404","Message":"Resource not found"}

I can call the Azure OpenAI API directly by following the REST API docs. After some research, I found that the two APIs are not the same, so I'm wondering whether there's something wrong with the way I'm calling Azure OpenAI through the SDK.

For reference, the raw Azure OpenAI request looks like this:

POST https://spark.openai.azure.com/openai/deployments/gpt35/completions?api-version=2022-12-01
Host: spark.openai.azure.com
Content-Type: application/json
api-key: <api-key>

{
  "prompt": "<|im_start|>system\nThe system is a helpful AI assistant.\n<|im_end|>\n<|im_start|>user\nWhat's new with .NET 6?\n<|im_end|>\n<|im_start|>assistant",
  "max_tokens": 800,
  "temperature": 1,
  "frequency_penalty": 0,
  "presence_penalty": 0,
  "top_p": 0.95,
  "stop": ["<|im_end|>"]
}

The request fired by the code above looks like this:

POST https://spark.openai.azure.com/openai/deployments/gpt35/chat/completions?api-version=2022-12-01 HTTP/1.1
Host: spark.openai.azure.com
api-key: <api-key>
Content-Type: application/json; charset=utf-8

{"messages":[{"role":"system","content":"You are a helpful assistant."},{"role":"user","content":"What\u0027s new in .NET 6"}],"model":"gpt-3.5-turbo"}

Also, here is the model deployment I created:

[image: screenshot of the model deployment]

kayhantolga commented 1 year ago

I recently got access to Azure OpenAI. I didn't have a chance to try the services before releasing the SDK. I will check this issue as soon as possible. I used this documentation as a reference; it may help you too: https://learn.microsoft.com/en-us/azure/cognitive-services/openai/reference

realchrisparker commented 1 year ago

I am tracking this too, as I started using this library today with the Azure OpenAI service and have yet to get it working.

kayhantolga commented 1 year ago

It appears that Azure has decided to implement a different approach from that of the OpenAI team. Additionally, they have chosen different model names. 😮‍💨 I will address these issues in the next version.

https://learn.microsoft.com/en-gb/azure/cognitive-services/openai/how-to/chatgpt
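
For context, the "different approach" at the time was that Azure's gpt-35-turbo deployments were driven through the completions endpoint with a ChatML-style prompt, as in the raw request earlier in this thread. A rough, purely illustrative sketch of building such a prompt from chat messages (names here are not part of the SDK):

using System.Collections.Generic;
using System.Text;

static string BuildChatMlPrompt(IEnumerable<(string Role, string Content)> messages)
{
    var sb = new StringBuilder();
    foreach (var (role, content) in messages)
    {
        // Each turn is wrapped in <|im_start|>role ... <|im_end|> markers.
        sb.Append("<|im_start|>").Append(role).Append('\n')
          .Append(content).Append('\n')
          .Append("<|im_end|>").Append('\n');
    }

    // Leave an open assistant turn for the model to complete.
    sb.Append("<|im_start|>assistant");
    return sb.ToString();
}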

realchrisparker commented 1 year ago

I did some quick test code with Azure OpenAI and this worked.

CompletionMessage completionMessage = new( message );
HttpRequestMessage request = new()
{
    Method = HttpMethod.Post,
    RequestUri = new Uri( $"https://{resourceName}.openai.azure.com/openai/deployments/{deploymentName}/completions?api-version=2022-12-01" ),
    Content = new StringContent( completionMessage.ToString(), Encoding.UTF8, "application/json" ),
};
request.Headers.Clear();
request.Headers.Accept.Add( new MediaTypeWithQualityHeaderValue( "application/json" ) );
request.Headers.Add( "api-key", apiKey );

var response = await _httpClient.SendAsync( request );

I'm not saying this is the code you should use, but it gives guidance on what normally works with HttpClient. The CompletionMessage is a quick class I created mimicking the following JSON:

{
  "prompt": "",
  "temperature": 0.7,
  "top_p": 0.95,
  "frequency_penalty": 0,
  "presence_penalty": 0,
  "max_tokens": 800,
  "stop": [ "<|im_end|>" ]
}
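
A minimal sketch of what such a CompletionMessage class might look like, based on that JSON (the class name and exact shape are assumptions, not part of the SDK):

using System.Text.Json;
using System.Text.Json.Serialization;

public class CompletionMessage
{
    public CompletionMessage(string prompt) => Prompt = prompt;

    [JsonPropertyName("prompt")]
    public string Prompt { get; set; }

    [JsonPropertyName("temperature")]
    public double Temperature { get; set; } = 0.7;

    [JsonPropertyName("top_p")]
    public double TopP { get; set; } = 0.95;

    [JsonPropertyName("frequency_penalty")]
    public double FrequencyPenalty { get; set; } = 0;

    [JsonPropertyName("presence_penalty")]
    public double PresencePenalty { get; set; } = 0;

    [JsonPropertyName("max_tokens")]
    public int MaxTokens { get; set; } = 800;

    [JsonPropertyName("stop")]
    public string[] Stop { get; set; } = new[] { "<|im_end|>" };

    // Serialize to the JSON body expected by the Azure completions endpoint.
    public override string ToString() => JsonSerializer.Serialize(this);
}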

doggy8088 commented 1 year ago

@kayhantolga May I ask which Betalgo.OpenAI.GPT3 version I can use for AOAI?

Also, I think the ResourceName property is not necessary for OpenAiOptions in AOAI.

magols commented 1 year ago

Hello, I just got access to Azure OpenAI GPT-4 at work today and got my existing app (OpenAI API 3.5-turbo) working by changing the Program.cs to

builder.Services.AddOpenAIService(options =>
{
    options.ProviderType = ProviderType.Azure;
    options.ResourceName = builder.Configuration["OpenAIServiceOptions:ResourceName"];
    options.DeploymentId = builder.Configuration["OpenAIServiceOptions:DeploymentId"];
    options.ApiVersion = "2023-03-15-preview";
    options.ApiKey = builder.Configuration["OpenAIServiceOptions:ApiKey"];

});

and changing the model from Models.ChatGpt3_5Turbo to "gpt-4" (the 8K version), like this:


var stream = Ai.ChatCompletion.CreateCompletionAsStream(new ChatCompletionCreateRequest()
{
    Model = "gpt-4",
    MaxTokens = 2000,
    Temperature = 0.9f,
    Messages = conv.Messages.Select(m => new ChatMessage(m.Role, m.Content)).ToList()
});

Initially I got a 404 Not Found, until I spied on the Azure OpenAI Studio Chat playground and saw it was using a different API version. After plugging in "2023-03-15-preview" as the API version and using the new model string, I now get chat completions!
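
For anyone following along: the stream returned by CreateCompletionAsStream is an IAsyncEnumerable, so it can be consumed roughly like this (a sketch; the property names follow the Betalgo response types as I understand them):

await foreach (var completion in stream)
{
    if (completion.Successful)
    {
        // Each streamed chunk carries a partial piece of the assistant message.
        Console.Write(completion.Choices.FirstOrDefault()?.Message.Content);
    }
    else
    {
        Console.WriteLine($"{completion.Error?.Code}: {completion.Error?.Message}");
    }
}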

remixtedi commented 1 year ago

If the client is configured to use the Azure OpenAI service, then image generation doesn't work because it is not available on Azure. Is there a way to configure the client for both platforms, using Azure for chat completions and OpenAI DALL-E for images?

kayhantolga commented 1 year ago

Released a new version. Please have a look at the wiki page: https://github.com/betalgo/openai/wiki/Azure-OpenAI
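
For anyone landing here later, a minimal Azure setup with the updated package, pieced together from the options shown earlier in this thread (placeholder values; the wiki page above is the authoritative example):

var openAiService = new OpenAIService(new OpenAiOptions
{
    ProviderType = ProviderType.Azure,
    ResourceName = "spark",            // your Azure OpenAI resource name
    DeploymentId = "gpt35",            // the name of your model deployment
    ApiVersion = "2023-03-15-preview", // version reported above to work for chat completions
    ApiKey = apiKey,
});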