oatmealm opened 5 months ago
Show your configuration please (remove keys before sharing).
The error echoed to the minibuffer is literally "peculiar problem"; I forgot to mention that.
```elisp
(setopt ellama-provider
        (make-llm-openai-compatible
         :url "http://localhost:4000"
         :key "some=key"
         :chat-model "some-model"
         :embedding-model "some-embedding-model"))
```
Everything works except for `ellama-translate` and `ellama-translate-buffer`. There's also no API access, judging by the litellm active log. Other commands, such as `ellama-define-word` or general queries with `ellama-chat` etc., seem to work fine.
Tried directly with `ollama` using `gemma`, same error. `debug-on-error` is set but not catching any errors.
`#s(llm-ollama "http" "localhost" 11434 "gemma:latest" "nomic-embed-text:latest")`
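For reference, the printed struct above corresponds to a provider configuration roughly like the following (a sketch assuming the standard `make-llm-ollama` keywords from the `llm` package; host `localhost` and port `11434` are its defaults):

```elisp
;; Direct ollama provider matching the struct shown above:
;; scheme "http", host "localhost", port 11434,
;; chat model gemma:latest, embedding model nomic-embed-text:latest.
(setopt ellama-provider
        (make-llm-ollama
         :chat-model "gemma:latest"
         :embedding-model "nomic-embed-text:latest"))
```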
@oatmealm try changing your config to:
```elisp
(setopt ellama-provider
        (make-llm-openai-compatible
         :url "http://localhost:4000/v1/"
         :key "some=key"
         :chat-model "some-model"
         :embedding-model "some-embedding-model"))
```
I'm seeing this error when trying to translate a selection. API calls are proxied via litellm, so I can see there was no outgoing call. It seems to happen when the content is parsed before the API call... all other functions work.