eerhardt opened 2 weeks ago
Are you proposing that we introduce another hosting package as well, `Aspire.Hosting.OpenAI`? It probably doesn't give us much over `AddConnectionString` other than discoverability.
Adding to the backlog because we'll definitely be doing this.
> Are you proposing that we introduce another hosting package as well, `Aspire.Hosting.OpenAI`?
Not necessarily "OpenAI" exactly. But I could imagine we could have `Aspire.Hosting.Ollama` (and others) that ran in a container and exposed an OpenAI endpoint, which was connected to by the `Aspire.OpenAI` component.

Since OpenAI itself is an existing service that you don't run locally, or provision, I'm not sure what an `Aspire.Hosting.OpenAI` hosting package would do. But I'm interested in hearing others' thoughts.
cc @timheuer @luisquintanilla in case they have any thoughts around here.
Not sure how it maps out, but there's a few things to consider:

The configuration depends on which service you're using. However, you might be able to simplify things by installing the `Azure.AI.OpenAI` package, since that takes a dependency on `OpenAI`. Therefore, you get access to all the underlying types:

```csharp
var client = new OpenAIClient(...);
var client = new AzureOpenAIClient(...);
```

Given one of these clients for the respective service, users can now instantiate a client based on the task (chat / embedding / audio / images / assistants).

For example, chat:

```csharp
var chatClient = client.GetChatClient(...);
```

This works with either `OpenAIClient` or `AzureOpenAIClient`.

When using the SDKs directly, users would have to create the respective clients based on what they're looking to do.
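To make the shared surface concrete, here's a minimal sketch assuming the current `OpenAI` and `Azure.AI.OpenAI` packages; endpoint, key, and model values are placeholders:

```csharp
using System.ClientModel;
using Azure.AI.OpenAI;
using OpenAI;
using OpenAI.Chat;

// AzureOpenAIClient derives from OpenAIClient, so both can be held as the base type.
OpenAIClient openAiClient = new OpenAIClient(new ApiKeyCredential("<openai-api-key>"));
OpenAIClient azureClient = new AzureOpenAIClient(
    new Uri("https://<resource>.openai.azure.com"),
    new ApiKeyCredential("<azure-api-key>"));

// Task-specific clients are created the same way from either base client.
ChatClient chatClient = openAiClient.GetChatClient("<model-or-deployment-name>");
```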
Semantic Kernel is currently working on updating their abstractions to the most recent versions of the OpenAI libraries, but I suspect the way it'll work is: given an OpenAI / Azure OpenAI client (this is probably what the Aspire component would register for DI), the respective `IChatCompletionService`, `IEmbeddingGenerationService`, and other abstractions would handle the creation / management of task-specific clients.
https://github.com/microsoft/semantic-kernel/issues/6738
cc: @stephentoub @RogerBarreto
> but I suspect the way it'll work is, given an OpenAI / Azure OpenAI client (this is probably what the Aspire component would register for DI), the respective `IChatCompletionService`, `IEmbeddingGenerationService`, and other abstractions would handle the creation / management of task-specific clients.
Yes.
Right now, for example, the `AddOpenAIChatCompletion` extension method supports querying DI for an `OpenAIClient`. The intent with the revised extensions based on the new `OpenAI` / `Azure.AI.OpenAI` libraries is to have `AddOpenAIChatCompletion` continue to look for an `OpenAIClient` and `AddAzureOpenAIChatCompletion` look for an `AzureOpenAIClient`. But obviously this can adapt based on what we collectively decide to do here with Aspire; whatever we do, we want to make sure it's a seamless experience up and down the stack.
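As a rough sketch of that composition (the registration wiring is illustrative, not the shipped API; the Semantic Kernel extension is shown commented out because its revised signature is still in flux):

```csharp
using Microsoft.Extensions.DependencyInjection;
using OpenAI;

var services = new ServiceCollection();

// Something (e.g. an Aspire component) registers the base client in DI...
services.AddSingleton(new OpenAIClient("<api-key>"));

// ...and the Semantic Kernel extension would then resolve that OpenAIClient
// from DI instead of constructing its own, per the comment above:
// services.AddOpenAIChatCompletion(modelId: "<model-name>");
```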
> The intent with the revised extensions based on the new `OpenAI` / `Azure.AI.OpenAI` libraries is to have `AddOpenAIChatCompletion` continue to look for an `OpenAIClient` and `AddAzureOpenAIChatCompletion` look for an `AzureOpenAIClient`.
One approach Aspire could take w.r.t. DI is:

- `Aspire.OpenAI` adds an `OpenAIClient` service to DI
- `Aspire.Azure.AI.OpenAI` adds an `AzureOpenAIClient` service to DI and the same object as an `OpenAIClient` service

That way someone just looking for an `OpenAIClient` can bind to either, and people can switch between Azure and non-Azure, if they want. And if someone is specifically looking for an `AzureOpenAIClient`, it will only work when using `Aspire.Azure.AI.OpenAI`.
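A minimal sketch of that registration shape, assuming hypothetical package internals (a real app would use one package or the other, not both):

```csharp
using Microsoft.Extensions.DependencyInjection;
using System.ClientModel;
using Azure.AI.OpenAI;
using OpenAI;

var services = new ServiceCollection();

// Aspire.OpenAI (hypothetical) would do the equivalent of:
services.AddSingleton(new OpenAIClient("<api-key>"));

// Aspire.Azure.AI.OpenAI would register one instance under both types:
services.AddSingleton(new AzureOpenAIClient(
    new Uri("https://<resource>.openai.azure.com"),
    new ApiKeyCredential("<api-key>")));
services.AddSingleton<OpenAIClient>(sp => sp.GetRequiredService<AzureOpenAIClient>());
```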
> an `AzureOpenAIClient` service to DI and the same object as an `OpenAIClient` service

I was hypothesizing in a discussion earlier today that this is what we'd end up doing, assuming the abstraction holds up and everything available via the base `OpenAIClient` fully works when the concrete type is `AzureOpenAIClient`. @trrwilson, I assume that's the goal and anything that deviated from that would be a bug to be fixed?
> I was hypothesizing in a discussion earlier today that this is what we'd end up doing, assuming the abstraction holds up and everything available via the base `OpenAIClient` fully works when the concrete type is `AzureOpenAIClient`. @trrwilson, I assume that's the goal and anything that deviated from that would be a bug to be fixed?
Right, with the exception of non-overlapping capabilities; e.g. you won't be able to get/use a `ModerationClient` from `AzureOpenAIClient` because Azure OpenAI doesn't have a standalone /moderations API.
> Right, with the exception of non-overlapping capabilities; e.g. you won't be able to get/use a `ModerationClient` from `AzureOpenAIClient` because Azure OpenAI doesn't have a standalone /moderations API.
Thanks. It looks like these are the only two `OpenAIClient` methods today that will throw: https://github.com/Azure/azure-sdk-for-net/blob/aec1a1389636a2ef76270ab4bdcb0715a2abb1aa/sdk/openai/Azure.AI.OpenAI/src/Custom/AzureOpenAIClient.cs#L230-L242 Are there other not-supported things on individual clients, or is the expectation that if you can get the relevant client everything "just works"?
> Thanks. It looks like these are the only two `OpenAIClient` methods today that will throw: https://github.com/Azure/azure-sdk-for-net/blob/aec1a1389636a2ef76270ab4bdcb0715a2abb1aa/sdk/openai/Azure.AI.OpenAI/src/Custom/AzureOpenAIClient.cs#L230-L242 Are there other not-supported things on individual clients, or is the expectation that if you can get the relevant client everything "just works"?
There are a small number of errata, but overwhelmingly it's that expectation: if you can get the scenario client, you should then be able to interact with that scenario client (inputs/outputs) the same way -- without any undue consideration of whether it came from OpenAI v1 or an Azure OpenAI endpoint.
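Given those errata, a consumer holding only the base `OpenAIClient` might guard the non-overlapping capability like this (a sketch, using a type check rather than relying on the throwing members in the linked source; the model name is a placeholder):

```csharp
using Azure.AI.OpenAI;
using OpenAI;
using OpenAI.Moderations;

static ModerationClient? TryGetModerationClient(OpenAIClient client)
{
    // Azure OpenAI has no standalone /moderations API, so the Azure client
    // can't provide a ModerationClient; everything else should "just work".
    if (client is AzureOpenAIClient)
    {
        return null;
    }

    return client.GetModerationClient("<model-name>");
}
```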
I'd very much like to have at least an option to use the .NET OpenAI client. It seems like the natural option for .NET customers, and not having it in Aspire is somewhat unfortunate.

Ideally, I could just pass an instance of either the Azure OpenAI client or the .NET OpenAI client into a method/constructor and "it just works." Seems like, if they're that close in structure, we could get by putting them both in the package and giving them both to customers.

I know it's harder than that and stuff, but that'd be an ideal middle-of-the-road approach for folks in both cohorts.
Today we have the `Azure.AI.OpenAI` component when using Azure OpenAI.

Last week an official OpenAI library was introduced with https://devblogs.microsoft.com/dotnet/openai-dotnet-library/. We should consider having a plain `Aspire.OpenAI` component that wraps this library for the cases where an Aspire app wants to use OpenAI, but doesn't want to bring in Azure dependencies.

cc @sebastienros @tg-msft @davidfowl @mitchdenny
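If such a component mirrored the existing `Aspire.Azure.AI.OpenAI` shape, usage might look roughly like this (the method name `AddOpenAIClient` is hypothetical, patterned after the Azure component's `AddAzureOpenAIClient`):

```csharp
using OpenAI;

var builder = WebApplication.CreateBuilder(args);

// Hypothetical: registers an OpenAIClient using the "openai" connection string,
// the way AddAzureOpenAIClient registers an AzureOpenAIClient today.
builder.AddOpenAIClient("openai");

var app = builder.Build();

app.MapGet("/chat", (OpenAIClient client) =>
{
    var chat = client.GetChatClient("<model-name>");
    // ... use chat ...
});

app.Run();
```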