stevemolitor / robby

Extensible Emacs interface to OpenAI
MIT License

openai compatible provider #1

oatmealm opened this issue 8 months ago

oatmealm commented 8 months ago

Hi there! Wondering if robby can be set up to work with OpenAI-compatible providers. Being able to set a base_url, for example...

stevemolitor commented 8 months ago

Oh, interesting idea. The URL was hard-coded, but I just added a commit to make the API URL customizable: 34cc41331f628685df0bb48983bbfd8531e1127f.

To use it, put this in your init.el:

(setq robby-api-url "https://alternate-api-provider")

Let me know if that works. What provider are you using?

oatmealm commented 8 months ago

I'll try it with together.ai and Ollama (which now supports some of the OpenAI API endpoints). Thanks!

oatmealm commented 8 months ago

I'm trying a simple setup with litellm (an OpenAI proxy), but I keep getting "405 Method Not Allowed":

(use-package! robby
  :commands (robby-chat)
  :bind ("C-c r" . robby-command-map)
  :custom
  (robby-api-url "http://localhost:4000")
  (robby-openai-api-key "sk-1234")
  (robby-chat-model "some-model"))

The same litellm setup works out of the box with the other packages I've been trying.
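
In case it helps narrow things down, here's a rough check I can run from Emacs, independent of robby. Just a sketch, assuming litellm exposes the usual OpenAI-style /v1/chat/completions route, and reusing the placeholder URL, key, and model name from the config above:

(require 'url)
(require 'json)

;; POST a minimal chat completion straight to the litellm proxy.
;; The URL, key, and model are the placeholders from the config above.
(let* ((url-request-method "POST")
       (url-request-extra-headers
        '(("Content-Type" . "application/json")
          ("Authorization" . "Bearer sk-1234")))
       (url-request-data
        (json-encode
         '(("model" . "some-model")
           ("messages" . [(("role" . "user") ("content" . "hello"))]))))
       (buf (url-retrieve-synchronously
             "http://localhost:4000/v1/chat/completions")))
  ;; If this also comes back 405, the proxy route/method is the problem;
  ;; if it returns 200, the issue is in how robby builds its request.
  (pop-to-buffer buf))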

stevemolitor commented 8 months ago

Thanks! What other providers did you try?

I'm noticing slight variations between the different providers in auxiliary things like fetching and parsing the list of available models, or in error response formats. I started working on something that allows plugging in small variations like that for specific providers. It will be a week or two, however, before I get back to this. I'll sort out the '405' response from litellm then.
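
Nothing is settled yet, but roughly the shape I have in mind is a small registry where each provider only supplies the pieces that differ. Just a sketch, and none of these names exist in robby today:

;; Hypothetical sketch only - none of these names are in robby yet.
;; Each provider supplies just the pieces that vary (model-list parsing,
;; error-response parsing); everything else stays on the common
;; OpenAI-style code path.
(defvar robby--providers
  '((openai
     :parse-models robby--openai-parse-models
     :parse-error  robby--openai-parse-error)
    (litellm
     :parse-models robby--openai-parse-models   ; same model list format
     :parse-error  robby--litellm-parse-error)) ; slightly different errors
  "Alist mapping provider symbols to provider-specific functions.")

(defun robby--provider-get (provider prop)
  "Look up PROP (for example :parse-error) for PROVIDER.
Fall back to the openai entry when PROVIDER has no override."
  (or (plist-get (alist-get provider robby--providers) prop)
      (plist-get (alist-get 'openai robby--providers) prop)))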

oatmealm commented 8 months ago

I have only one Cohere model plugged in so far via litellm.