emacs-openai / openai

Elisp library for the OpenAI API
GNU General Public License v3.0

Add option to have user defined `params` for the OpenAI request #18

Closed JCZuurmond closed 1 year ago

JCZuurmond commented 1 year ago

Why?

To use the Azure-hosted OpenAI rest API the api-version has to be provided in the query:

POST https://YOUR_RESOURCE_NAME.openai.azure.com/openai/deployments/YOUR_DEPLOYMENT_NAME/completions?api-version=2022-12-01

At the moment, I can use this library by overriding the openai-request method, like so:

  (defmacro openai-request (url &rest body)
    "Wrapper for `request' function.

  The URL is the url for `request' function; then BODY is the arguments for rest."
    (declare (indent 1))
    `(progn
       (setq openai-error nil)
       (request ,url
         :params '(("api-version" . "2023-03-15-preview"))
         :error (cl-function
                 (lambda (&key response &allow-other-keys)
                   (setq openai-error response)
                   (openai--handle-error response)))
         ,@body)))
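
(For context: the host part of the Azure URL can already be redirected, assuming `openai-base-url` works the way the `openai-chat` snippet further down suggests; it is only the query string that has no hook, hence the override above.)

;; Sketch, assuming `openai-base-url' is the customizable endpoint the library
;; already consults (see the keyword defaults in `openai-chat' below):
(setq openai-base-url
      "https://YOUR_RESOURCE_NAME.openai.azure.com/openai/deployments/YOUR_DEPLOYMENT_NAME")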

I would like it if the library supported this feature.

What?

A more generic approach would be:

  1. Create a variable
(defcustom openai-request-parameters '()
  "The parameters for the OpenAI request."
  :type 'list
  :group 'openai)

(Maybe name the variable openai-user-defined-request-parameters to emphasize that the library might add other parameters.)
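
With such a variable, an Azure user would only need to set it once; `request`'s `:params` accepts an alist of string pairs, so usage could look like this (hypothetical, using the name from the defcustom above):

;; Hypothetical usage of the proposed variable: `request' appends this alist
;; to the query string of every OpenAI call.
(setq openai-request-parameters
      '(("api-version" . "2023-03-15-preview")))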

  2. Then pass the parameters to all OpenAI requests, for example in the chat (a hypothetical call follows the snippet):
(cl-defun openai-chat ( messages callback
                        &key
                        (base-url openai-base-url)
                        (params openai-request-parameters)
                        (content-type "application/json")
                        (key openai-key)
                        org-id
                        (model "gpt-3.5-turbo")
                        temperature
                        top-p
                        n
                        stream
                        stop
                        max-tokens
                        presence-penalty
                        frequency-penalty
                        logit-bias
                        (user openai-user))
  "Send chat request.

Arguments MESSAGES and CALLBACK are required for this type of request.  MESSAGES
is the conversation data.  CALLBACK is called after the request is made.

Arguments BASE-URL, PARAMS, CONTENT-TYPE, KEY, ORG-ID and USER are global
options; however, you can overwrite the value by passing it in.

The rest of the arguments are optional; please see the OpenAI API reference page
for more information.  Arguments here refer to MODEL, TEMPERATURE, TOP-P, N,
STREAM, STOP, MAX-TOKENS, PRESENCE-PENALTY, FREQUENCY-PENALTY, and LOGIT-BIAS."
  (openai-request (concat base-url "/chat/completions")
    :type "POST"
    :params params         ;; <--- does this need an `@` or `,` prefix?
    :headers (openai--headers content-type key org-id)
    :data (openai--json-encode
           `(("model"             . ,model)
             ("messages"          . ,messages)
             ("temperature"       . ,temperature)
             ("top-p"             . ,top-p)
             ("n"                 . ,n)
             ("stream"            . ,stream)
             ("stop"              . ,stop)
             ("max_tokens"        . ,max-tokens)
             ("presence_penalty"  . ,presence-penalty)
             ("frequency_penalty" . ,frequency-penalty)
             ("logit_bias"        . ,logit-bias)
             ("user"              . ,user)))
    :parser 'json-read
    :complete (cl-function
               (lambda (&key data &allow-other-keys)
                 (funcall callback data)))))
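
A per-call override then also falls out of the `&key` argument for free. A hypothetical call (message shape abbreviated):

;; Hypothetical call overriding the query parameters for this request only;
;; the global default (if any) still applies to other calls.
(openai-chat [((role    . "user")
               (content . "Hello!"))]
             (lambda (data)
               (message "%S" data))
             :params '(("api-version" . "2023-03-15-preview")))
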
  3. I think the macro passes the parameters automagically to the underlying request (see the expansion sketch after the macro):
(defmacro openai-request (url &rest body)
  "Wrapper for `request' function.

The URL is the url for `request' function; then BODY is the arguments for rest."
  (declare (indent 1))
  `(progn
     (setq openai-error nil)
     (request ,url
       :error (cl-function
               (lambda (&key response &allow-other-keys)
                 (setq openai-error response)
                 (openai--handle-error response)))
       ,@body)))
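
(On the `@`/`,` question above: if I read the macro right, no prefix is needed. `openai-chat` is a plain `cl-defun`, so the macro splices `:params params` straight into the `request` form, where `params` is evaluated like any other variable; commas and splices only mean something inside a backquote. Roughly, the expansion inside `openai-chat` would look like this, with the other keyword arguments trimmed:)

;; Rough expansion of the `openai-request' call from step 2 (other keyword
;; arguments trimmed for brevity):
(progn
  (setq openai-error nil)
  (request (concat base-url "/chat/completions")
    :error (cl-function
            (lambda (&key response &allow-other-keys)
              (setq openai-error response)
              (openai--handle-error response)))
    :type "POST"
    :params params))  ; evaluated like any other argument, no `,' or `@' needed
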
jcs090218 commented 1 year ago

Yeah, this is a great idea! Would you like to open a PR for this? :D