Remember that your API key is a secret! Do not share it with others or expose it in any client-side code (browsers, apps). Production requests must be routed through your own backend server where your API key can be securely loaded from an environment variable or key management service.
To follow OpenAI's requirements and basic industry safety practices, you should not call the OpenAI API directly from client-side code.
With this PR you can create an `OpenAISwift` instance with a custom `Config` that points to a custom proxy server.
Example usage:
```swift
let openAISwift = OpenAISwift(
    config: OpenAISwift.Config(
        baseURL: "https://api.openai.myproxy.com",
        endpointProvider: OpenAIEndpointProvider(source: .proxy(path: { api -> String in
            switch api {
            case .completions:
                return "/completions"
            case .edits:
                return "/edits"
            case .chat:
                return "/chat/completions"
            case .images:
                return "/images/generations"
            case .embeddings:
                return "/embeddings"
            }
        }, method: { api -> String in
            return "POST"
        })),
        session: URLSession.shared,
        authorizeRequest: { request in
            // Customize the authorization of each request to match your server,
            // e.g. add an "X-App-Token", "X-Encryption-Token", or any other header.
            request.setValue("<super_secure_token>", forHTTPHeaderField: "X-App-Token")
        }))
```
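Because the `path` argument is just a pure closure from an endpoint case to a route string, it can be adapted to whatever routes your proxy actually exposes. Below is a self-contained sketch of that idea, using a stand-in `API` enum (the real cases belong to `OpenAIEndpointProvider.API` in the library and may differ) and a hypothetical `/v1` prefix as an example of proxy-specific routing:

```swift
// Stand-in for the library's OpenAIEndpointProvider.API enum
// (hypothetical; the real cases live in OpenAISwift).
enum API {
    case completions, edits, chat, images, embeddings
}

// A proxy that serves everything under a "/v1" prefix only needs
// to prepend it to the same per-endpoint routes:
let path: (API) -> String = { api in
    let route: String
    switch api {
    case .completions: route = "/completions"
    case .edits:       route = "/edits"
    case .chat:        route = "/chat/completions"
    case .images:      route = "/images/generations"
    case .embeddings:  route = "/embeddings"
    }
    return "/v1" + route
}

print(path(.chat)) // prints "/v1/chat/completions"
```

The same approach applies to the `method` closure if your proxy expects a verb other than POST for some endpoints.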
The goal of these changes is to support a custom server with minimal code changes while following the existing logic and architecture.