ahyatt / llm

A package abstracting llm capabilities for emacs.
GNU General Public License v3.0

Error callback not called if url request failed #37

Closed s-kostyaev closed 6 months ago

s-kostyaev commented 6 months ago

Hi @ahyatt

Simple code to reproduce:

(require 'llm-ollama)
;; Ollama provider pointing at a port with nothing listening on it.
(setq provider
      (make-llm-ollama
       :chat-model "1" :embedding-model "2" :port 3333))

(require 'llm-openai)
(setq llm-warn-on-nonfree nil)
;; OpenAI-compatible provider pointing at the same dead port; this replaces
;; the previous `provider', and either one reproduces the issue.
(setq provider
      (make-llm-openai-compatible
       :key "0"
       :chat-model "1" :embedding-model "2" :url "http://localhost:3333"))

(llm-chat-streaming provider (llm-make-simple-chat-prompt "test")
                    #'ignore #'ignore
                    (lambda (_)
                      (message "error callback called")))

There is no process listening on port 3333, so the request fails. The issue reproduces with both providers.

The "error callback called" message is never printed.

ahyatt commented 6 months ago

This is fixed, but keep in mind you can still get errors thrown from the initial (synchronous) part of llm calls. For example, misconfigured providers may throw errors, such as when an OpenAI provider isn't initialized with a key. It's just that all errors from the async parts should go to the error callback.
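
For reference, a minimal sketch of that split, reusing the provider and callback shape from the repro above (this is illustrative, not code from the package itself): errors signaled from the initial synchronous part of the call can be caught with condition-case, while failures during the async request should arrive via the error callback.

(condition-case err
    ;; The synchronous part may signal immediately, e.g. for a
    ;; misconfigured provider such as an OpenAI provider without a key.
    (llm-chat-streaming provider (llm-make-simple-chat-prompt "test")
                        #'ignore #'ignore
                        (lambda (_)
                          ;; Async failures (such as the refused connection
                          ;; on port 3333) should end up here.
                          (message "async error callback called")))
  (error (message "sync error: %S" err)))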