Why?

To use the Azure-hosted OpenAI REST API, the api-version has to be provided in the query:

POST https://YOUR_RESOURCE_NAME.openai.azure.com/openai/deployments/YOUR_DEPLOYMENT_NAME/completions?api-version=2022-12-01
At the moment, I can use this library by overriding the openai-request macro, like so:
(defmacro openai-request (url &rest body)
  "Wrapper for `request' function.
The URL is the url for `request' function, and BODY contains the rest of
the arguments."
  (declare (indent 1))
  `(progn
     (setq openai-error nil)
     (request ,url
       :params '(("api-version" . "2023-03-15-preview"))
       :error (cl-function
               (lambda (&key response &allow-other-keys)
                 (setq openai-error response)
                 (openai--handle-error response)))
       ,@body)))
I would like the library to support this feature.
What?
A more generic approach would be:
Create a variable:
(defcustom openai-request-parameters '()
  "The parameters for the OpenAI request."
  :type 'list
  :group 'openai)
(Maybe name the variable openai-user-defined-request-parameters to emphasize that the library might add other parameters.)
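With such a variable in place, an Azure user could set the extra query parameter once in their init file. A sketch, assuming the openai-request-parameters name from the defcustom above:

```elisp
;; Sketch: configure the Azure `api-version' query parameter once.
;; Assumes the `openai-request-parameters' defcustom proposed above;
;; the exact version string is just an example.
(setq openai-request-parameters
      '(("api-version" . "2023-03-15-preview")))
```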
Then pass the parameters to all OpenAI requests, for example in openai-chat:
(cl-defun openai-chat ( messages callback
                        &key
                        (base-url openai-base-url)
                        (params openai-request-parameters)
                        (content-type "application/json")
                        (key openai-key)
                        org-id
                        (model "gpt-3.5-turbo")
                        temperature
                        top-p
                        n
                        stream
                        stop
                        max-tokens
                        presence-penalty
                        frequency-penalty
                        logit-bias
                        (user openai-user))
  "Send chat request.

Arguments MESSAGES and CALLBACK are required for this type of request.
MESSAGES is the conversation data.  CALLBACK is executed after the request
is made.

Arguments BASE-URL, PARAMS, CONTENT-TYPE, KEY, ORG-ID and USER are global
options; however, you can override the value by passing it in.

The rest of the arguments are optional; please see the OpenAI API reference
page for more information.  Arguments here refer to MODEL, TEMPERATURE,
TOP-P, N, STREAM, STOP, MAX-TOKENS, PRESENCE-PENALTY, FREQUENCY-PENALTY,
and LOGIT-BIAS."
  (openai-request (concat base-url "/chat/completions")
    :type "POST"
    :params params  ;; <--- does this need an `@' or `,' prefix?
    :headers (openai--headers content-type key org-id)
    :data (openai--json-encode
           `(("model"             . ,model)
             ("messages"          . ,messages)
             ("temperature"       . ,temperature)
             ("top_p"             . ,top-p)
             ("n"                 . ,n)
             ("stream"            . ,stream)
             ("stop"              . ,stop)
             ("max_tokens"        . ,max-tokens)
             ("presence_penalty"  . ,presence-penalty)
             ("frequency_penalty" . ,frequency-penalty)
             ("logit_bias"        . ,logit-bias)
             ("user"              . ,user)))
    :parser 'json-read
    :complete (cl-function
               (lambda (&key data &allow-other-keys)
                 (funcall callback data)))))
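Regarding the inline question: no `,' or `@' prefix should be needed. openai-chat is an ordinary function, and openai-request splices its body forms unevaluated into a plain `request' call, so params is just a variable reference evaluated at run time. This can be checked with macroexpand (the error handler is elided below):

```elisp
;; The body forms are spliced in as-is, after the macro's own :error
;; handler, so `params' is evaluated like any other variable.
(macroexpand
 '(openai-request "https://example.invalid/v1/chat/completions"
    :type "POST"
    :params params))
;; => (progn
;;      (setq openai-error nil)
;;      (request "https://example.invalid/v1/chat/completions"
;;        :error (cl-function ...)   ; elided
;;        :type "POST"
;;        :params params))
```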
I think the macro passes the parameters through automatically to the underlying `request' call:
(defmacro openai-request (url &rest body)
  "Wrapper for `request' function.
The URL is the url for `request' function, and BODY contains the rest of
the arguments."
  (declare (indent 1))
  `(progn
     (setq openai-error nil)
     (request ,url
       :error (cl-function
               (lambda (&key response &allow-other-keys)
                 (setq openai-error response)
                 (openai--handle-error response)))
       ,@body)))
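Instead of relying on each call site, the library could splice the user-defined parameters in at this level, so every endpoint picks them up automatically. A sketch, assuming the openai-request-parameters defcustom from above and that `request' reads its settings with plist-get, where the first occurrence of a key wins, so an explicit :params at a call site would take precedence:

```elisp
;; Sketch: inject user-defined query parameters into every request.
;; Assumes the `openai-request-parameters' defcustom proposed above.
(defmacro openai-request (url &rest body)
  "Wrapper for `request' function.
The URL is the url for `request' function, and BODY contains the rest of
the arguments."
  (declare (indent 1))
  `(progn
     (setq openai-error nil)
     (request ,url
       :error (cl-function
               (lambda (&key response &allow-other-keys)
                 (setq openai-error response)
                 (openai--handle-error response)))
       ,@body
       ;; Appended after BODY so a caller-supplied :params (the first
       ;; occurrence in the plist) wins over the global default.
       :params openai-request-parameters)))
```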