microsoft / semantic-kernel

Integrate cutting-edge LLM technology quickly and easily into your apps
https://aka.ms/semantic-kernel

Custom Endpoint for OpenAI #2145

Closed · victor-v-rad closed this issue 1 year ago

victor-v-rad commented 1 year ago

Companies create internal API proxies to avoid direct calls to https://api.openai.com/. Semantic Kernel currently provides no way to specify a custom endpoint. Please see the picture below.

[image attachment]

Please add this capability.

victor-v-rad commented 1 year ago

Created a corresponding issue for the Azure SDK: https://github.com/Azure/azure-sdk-for-net/issues/37794

gingters commented 1 year ago

Just to second that: if you want to use Semantic Kernel to enrich a Blazor WebAssembly application with AI capabilities, you certainly don't want to ship your API key to the browser ;)

It's easy to add a Yarp.ReverseProxy to your server that authenticates your users via your normal auth mechanism (cookies, tokens, etc.), forwards all calls to /openai to your Azure OpenAI endpoints, and adds the secret API key to those requests on the server side (a sketch follows below). But since you can't set the base URL for the plain OpenAI endpoint, it's sadly not possible to use it from your web application at the moment.
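A minimal sketch of that YARP setup, assuming an ASP.NET Core host with Yarp.ReverseProxy and routes/clusters defined in configuration. The /openai route, the "ReverseProxy" and "AzureOpenAI:ApiKey" config keys, and the auth wiring are illustrative assumptions, not from this thread:

// Program.cs (sketch, not a drop-in implementation)
using Yarp.ReverseProxy.Transforms;

var builder = WebApplication.CreateBuilder(args);

builder.Services.AddAuthentication(/* cookies, tokens, etc. */);
builder.Services.AddAuthorization();

// Routes/clusters (e.g. /openai/{**catch-all} -> your Azure OpenAI endpoint)
// are assumed to live in the "ReverseProxy" configuration section.
builder.Services.AddReverseProxy()
    .LoadFromConfig(builder.Configuration.GetSection("ReverseProxy"))
    .AddTransforms(transforms => transforms.AddRequestTransform(ctx =>
    {
        // Attach the secret key server-side so it never reaches the browser.
        ctx.ProxyRequest.Headers.Add("api-key", builder.Configuration["AzureOpenAI:ApiKey"]);
        return ValueTask.CompletedTask;
    }));

var app = builder.Build();
app.UseAuthentication();
app.UseAuthorization();
app.MapReverseProxy().RequireAuthorization(); // only authenticated users may call /openai
app.Run();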

JadynWong commented 1 year ago

Temporary solution

https://github.com/microsoft/semantic-kernel/issues/1445#issuecomment-1590980113

https://github.com/microsoft/semantic-kernel/issues/1445#issuecomment-1605622515

private class ProxyOpenAIHandler : HttpClientHandler
{
    protected override Task<HttpResponseMessage> SendAsync(HttpRequestMessage request, CancellationToken cancellationToken)
    {
        if (request.RequestUri != null && request.RequestUri.Host.Equals("api.openai.com", StringComparison.OrdinalIgnoreCase))
        {
            // Rewrite the host to your proxy. PathAndQuery already starts with '/',
            // so don't add another slash after the host.
            request.RequestUri = new Uri($"https://openai.yourproxy.com{request.RequestUri.PathAndQuery}");
        }
        return base.SendAsync(request, cancellationToken);
    }
}

var kernel = Kernel.Builder
        .WithOpenAIChatCompletionService("<model-id>", "<api-key>", httpClient: new HttpClient(new ProxyOpenAIHandler()))
        .Build();
craigomatic commented 1 year ago

Temporary solution

(quoting @JadynWong's ProxyOpenAIHandler snippet above)

This is the solution I would recommend also.

Does this unblock both of your scenarios, @victor-v-rad and @gingters?

For Chat Copilot (fka Copilot Chat, which now lives at https://github.com/microsoft/chat-copilot) you'd need to either modify the deployment scripts to pass your proxy through as a config value that your code picks up, or leave them as-is and hardcode your proxy. In both cases the code @JadynWong shows above should enable this.
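A sketch of the config-value approach, reusing the handler idea from above but reading the proxy URL from IConfiguration. The "AIService:ProxyUrl" key and the ConfigurableProxyHandler name are hypothetical; align them with your appsettings.json:

// Sketch: pass the proxy URL from configuration instead of hardcoding it.
private class ConfigurableProxyHandler : HttpClientHandler
{
    private readonly Uri _proxyBase;
    public ConfigurableProxyHandler(Uri proxyBase) => _proxyBase = proxyBase;

    protected override Task<HttpResponseMessage> SendAsync(HttpRequestMessage request, CancellationToken cancellationToken)
    {
        if (request.RequestUri != null && request.RequestUri.Host.Equals("api.openai.com", StringComparison.OrdinalIgnoreCase))
        {
            // Resolve the original path and query against the configured proxy base.
            request.RequestUri = new Uri(_proxyBase, request.RequestUri.PathAndQuery);
        }
        return base.SendAsync(request, cancellationToken);
    }
}

// "AIService:ProxyUrl" is a hypothetical config key, e.g. "https://openai.yourproxy.com".
var proxyBase = new Uri(configuration["AIService:ProxyUrl"]);
var kernel = Kernel.Builder
    .WithOpenAIChatCompletionService("<model-id>", "<api-key>",
        httpClient: new HttpClient(new ConfigurableProxyHandler(proxyBase)))
    .Build();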

victor-v-rad commented 1 year ago

Thank you, @JadynWong, for sharing the code snippets. That unblocks me. There is also a standard proxy configuration in the Azure SDK: https://github.com/Azure/azure-sdk-for-net/blob/main/sdk/core/Azure.Core/samples/Configuration.md#configuring-a-proxy It might be good to mention it in the README.md, or here in appsettings.json under AIService: https://github.com/microsoft/chat-copilot/blob/main/webapi/appsettings.json#L22-L48
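For reference, the Azure SDK pattern from that sample looks roughly like this (a sketch assuming Azure.AI.OpenAI and Azure.Core; the proxy URL is a placeholder, and the linked Configuration.md is the authoritative version):

// Route Azure SDK traffic through a proxy via the pipeline transport.
using System.Net;
using Azure.Core.Pipeline;

var options = new OpenAIClientOptions
{
    Transport = new HttpClientTransport(new HttpClient(new HttpClientHandler
    {
        Proxy = new WebProxy(new Uri("https://openai.yourproxy.com"))
    }))
};
// Alternatively, per the linked sample, Azure.Core honors proxy environment
// variables such as HTTP_PROXY / HTTPS_PROXY without any code changes.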

@craigomatic, thank you for sharing the new Chat Copilot repo. It's worth celebrating! @JadynWong are you ok if I close the issue?

JadynWong commented 1 year ago

Of course, if that solves your problem.

the-xentropy commented 7 months ago

For anyone searching for a quick and easy way to do this in Python:

import semantic_kernel as sk
from semantic_kernel.connectors.ai.open_ai import OpenAIChatCompletion

kernel = sk.Kernel()

service = OpenAIChatCompletion(ai_model_id="<MODEL NAME>", api_key="<API KEY>")
# Point the underlying AsyncOpenAI client at your proxy.
service.client.base_url = "<API URL>"
kernel.add_chat_service(
    "chat_completion",
    service,
)

The underlying AsyncOpenAI client that the semantic-kernel wrapper uses accepts the base URL in its initializer, but semantic-kernel doesn't expose it as a keyword parameter in the wrapper's init function. You can still set it post-initialization by reaching into the wrapper's underlying OpenAI client, though.

I think it would be preferable for this keyword argument to simply be exposed in the wrapper's init function, since the workaround is prone to breakage if e.g. internal attribute names change, but it works and is a 30-second one-line fix, so it's good enough for me for now.