xenodium / chatgpt-shell

ChatGPT and DALL-E Emacs shells + Org babel 🦄 + a shell maker for other providers
https://xenodium.com
GNU General Public License v3.0

[Feature request] Use built-in url.el #13

Closed xenodium closed 10 months ago

xenodium commented 1 year ago

Consider removing the dependency on curl and using the built-in url.el instead.

rkallio commented 1 year ago

Using url.el would make parsing HTTP headers a bit easier. There's some interesting stuff, like the X-Rate-Limit-* headers, that might be useful to have access to.

The response body also contains a usage key that should be parsed, and the contents (spent tokens) represented somehow, maybe using header-line-format?
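
For what it's worth, a rough sketch of reading those headers from a `url-retrieve' callback could look like the following. This is illustrative only: the my/ function name is made up, and the exact x-ratelimit-* header names should be checked against the API's actual responses.

(require 'url)
(require 'subr-x)

(defun my/response-rate-limit-headers ()
  "Collect x-ratelimit-* response headers as an alist of (NAME . VALUE).
Meant to run inside a `url-retrieve' callback, where the current buffer
holds the raw HTTP response (headers, a blank line, then the body)."
  (save-excursion
    (goto-char (point-min))
    ;; Only search the header section, i.e. up to the first blank line.
    (let ((end (save-excursion (re-search-forward "^\r?$" nil t)))
          (headers '()))
      (while (re-search-forward "^\\(x-ratelimit-[^:]+\\): *\\(.*\\)$" end t)
        (push (cons (downcase (match-string 1))
                    (string-trim (match-string 2)))
              headers))
      (nreverse headers))))

The collected values (and the usage key parsed out of the JSON body) could then be surfaced in the shell buffer by setting header-line-format to a short summary string.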

ayman commented 1 year ago

So...I got this going on v0.1. Just replace chatgpt-shell--async-shell-command with chatgpt-shell--async-http-post.

(require 'cl-lib)   ; for `cl-assert'
(require 'subr-x)   ; for `if-let'
(require 'url)

(defun chatgpt-shell--receive-http-post (_status)
  "Async callback to process the HTTP POST response (its status argument is unused)."
  ;; `url-retrieve' invokes the callback in the response buffer, so the raw
  ;; headers and body can be read straight from the current buffer.
  (chatgpt-shell--write-output-to-log-buffer "// Response (from http-post)\n\n")
  (let ((body (cadr (split-string (buffer-string) "\n\n"))))
    (chatgpt-shell--write-output-to-log-buffer (format "%s\n" body))
    (if-let ((content (chatgpt-shell--extract-content body)))
        (chatgpt-shell--write-reply content)
      (chatgpt-shell--write-reply "Error: that's all I know" t))
    (setq chatgpt-shell--busy nil)))

(defun chatgpt-shell--async-http-post (messages key)
  "Send MESSAGES to the API as a POST request authenticated with API KEY."
  (cl-assert chatgpt-shell-openai-key nil "`chatgpt-shell-openai-key' needs to be set with your key")
  (let* ((url "https://api.openai.com/v1/chat/completions")
         (url-request-method "POST")
         (url-request-extra-headers `(("Content-Type" . "application/json")
                                      ("Authorization" . ,(format "Bearer %s" key))))
         (request-data `((model . ,chatgpt-shell-model-version)
                         (messages . ,messages))))
    (when chatgpt-shell-model-temperature
      (push `(temperature . ,chatgpt-shell-model-temperature) request-data))
    (let* ((args (json-serialize request-data))
           ;; url.el wants a unibyte request body; encode to be safe.
           (url-request-data (encode-coding-string args 'utf-8)))
      (chatgpt-shell--write-output-to-log-buffer "// Calling API via http-post\n")
      (chatgpt-shell--write-output-to-log-buffer (format "%s\n" url))
      (chatgpt-shell--write-output-to-log-buffer (format "%s\n" args))
      (url-retrieve url #'chatgpt-shell--receive-http-post))))
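
A quick way to try the above is evaluating something like this, assuming `chatgpt-shell-openai-key' holds the key string and that MESSAGES is a vector of alists (which `json-serialize' turns into a JSON array of objects):

;; Manual smoke test; the prompt content is arbitrary.
(chatgpt-shell--async-http-post
 [((role . "user") (content . "Say hello in one word."))]
 chatgpt-shell-openai-key)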

Then I saw there's a v0.3 now, so a pull request would take a little more effort...thoughts on removing curl from the stack?

xenodium commented 1 year ago

Thanks for this @ayman! @rkallio also has a promising patch in https://github.com/xenodium/chatgpt-shell/pull/18

In principle and from an elisp perspective, it sounds great to remove curl as an external dependency, but I'd also like to be a little mindful here and migrate only if we aren't regressing experience, functionality, stability, error handling, or adding complexity.

For now, I'm taking the pragmatic approach of accepting curl, since it's working fairly well and realistically isn't a big deal given how prevalent a utility it is. Coincidentally, I'm a happy elfeed user, and while elfeed used to support url-retrieve, it has since moved to curl. See Elfeed, cURL, and You.

I appreciate there's interest in removing curl, so I've been making the code more configurable and generic; keen users will have the option of switching to another fetcher (i.e. url-retrieve) as a config option via chatgpt-shell-request-maker. I hope this is a reasonable compromise.
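
Purely as an illustration of the direction (the real contract of chatgpt-shell-request-maker may well differ; the signature below is an assumption), a url-retrieve-based fetcher could be shaped roughly like this:

(require 'url)

;; Hypothetical fetcher; the (URL PAYLOAD HEADERS CALLBACK) signature is an
;; assumption, not chatgpt-shell's actual interface.
(defun my/url-retrieve-fetcher (url payload headers callback)
  "POST PAYLOAD (a JSON string) to URL with HEADERS alist.
Call CALLBACK with the raw response body once it arrives."
  (let ((url-request-method "POST")
        (url-request-extra-headers headers)
        ;; url.el wants a unibyte request body.
        (url-request-data (encode-coding-string payload 'utf-8)))
    (url-retrieve
     url
     (lambda (_status cb)
       ;; The body starts after the blank line terminating the headers.
       (goto-char (point-min))
       (when (re-search-forward "^\r?$" nil t)
         (forward-line 1))
       (funcall cb (buffer-substring-no-properties (point) (point-max))))
     (list callback))))

Pointing chatgpt-shell-request-maker at such a function would be the idea, modulo whatever arguments the real hook actually passes.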

ayman commented 1 year ago

Thanks for this update (and for the xref to #18 here). I'll also add that moving off curl in my little experiment wasn't really more performant (API calls are already slowish as is). Also, thanks for this cool package. 👍🏽

hajovonta commented 1 year ago

It seems it requires a newer curl than my official package manager provides: "You need curl version 7.76 or newer." Is there a particular reason?

xenodium commented 1 year ago

is there a particular reason?

To use --fail-with-body for better error handling. More details at https://github.com/xenodium/chatgpt-shell/issues/33 and https://github.com/xenodium/chatgpt-shell/commit/5a248401a6eb651633a0fe1457ec78482120030c

edit: formatting

hajovonta commented 1 year ago

So, just to see if I understand correctly (and for the record): older versions of curl don't have this switch?

xenodium commented 1 year ago

older versions of curl do not have this switch?

Check out https://daniel.haxx.se/blog/2021/02/11/curl-fail-with-body

hajovonta commented 1 year ago

Got it, thanks! My issue is that I'm on an older Ubuntu (20.x), which will never ship this curl version. My vote is for having a native alternative.

I ended up removing the packaged curl and installing it from source, but that approach has problems of its own. Anyway, I'm now able to use the package!

beatboxchad commented 1 year ago

What about using Python and OpenAI's client?

I'll start with some cons: This is, aesthetically, the opposite of what this issue proposes. I could also see the argument that it adds needless complexity and affects the stability of the interface.

Pros: Since Python is a mature, decently portable language, and the API client is maintained by an org with resources, it could provide a stable platform to build integrations atop, leaving elisp to just deal with IPC and text. It could outperform cURL in areas like rate-limit handling -- I was inspired by https://github.com/openai/openai-cookbook/blob/main/examples/How_to_handle_rate_limits.ipynb. Sure, it's unlikely we'll hit rate limits noodling around in a text editor... or is it? It's also easier to get the bot to write Python for you.

I'd also like to be a little mindful here and migrate only if we aren't regressing experience, functionality, stability, error handling, or adding complexity.

Yeah, that's my professional instinct too. But this is a new piece of software and we're all hackers here -- the time to experiment might be now, because later on you'll definitely be right about this.

xenodium commented 1 year ago

Thanks for contributing to the discussion!

On a personal/subjective level, I'd prefer avoiding Python package dependencies. I'm not super well-versed in Python and would rather avoid pip usage (or equivalent) as an installation prerequisite, since it adds friction for users of the package.

I'm open to reconsidering in the future, but switching would have to provide a pretty compelling upgrade (maybe something fairly desirable that's not possible without the Python package), as the tradeoff is high IMO.

Having said all of this, the fetching mechanism is fairly isolated so folks can technically experiment with swapping it.

There's also shell-maker, which should help with much of the heavy lifting to create new LLM shells.

ingorichter commented 1 year ago

Is there any effort to provide an alternative to curl? Keeping curl as the default while implementing a new fetcher should be a viable way of moving in this direction. I agree that nobody really cares how the transport works unless it breaks.

xenodium commented 1 year ago

Is there any effort to provide an alternative to curl?

At present, swapping the fetching mechanism isn't a project goal. Curl is working fairly well and most errors or edge cases seem to be handled gracefully.

A curl advantage I had not foreseen is troubleshooting. When package logging is enabled, users can see the exact curl command and run it in a separate terminal if needed.

Having said all of this, the current codebase should be modular enough to enable swapping and supporting multiple fetchers if someone is keen to contribute. Curl would probably remain the default setting unless the alternative has compelling features and equal or better error/edge-case handling.

ingorichter commented 1 year ago

Thanks for the explanation. That makes sense, and it's more beneficial to build and improve other features instead.

xenodium commented 10 months ago

Gonna close this one. Curl has been doing the job for a little while now.