ahyatt / llm

A package abstracting llm capabilities for emacs.
GNU General Public License v3.0
142 stars, 19 forks

Using make-llm-openai-compatible with Azure OpenAI service fails to connect to endpoint #36

Closed. KaiHa closed this issue 3 months ago.

KaiHa commented 3 months ago

I configured the Azure OpenAI service (https://learn.microsoft.com/en-us/azure/ai-services/openai/reference#completions) as an OpenAI-compatible provider and ran into the following two issues:

Maybe you find some clean solutions for this. I could also give it a try, if you point me into the preferred direction.

ahyatt commented 3 months ago

For the first issue, you should be using the OpenAI-compatible provider, which does not expect an API key and allows you to customize the URL. So (setq ellama-provider (make-llm-openai-compatible :url "http://my-endpoint/chat/completions?api_version=2023-05-15"))

For the second issue, I don't think there's a good solution right now, but I think I can make things easier to extend for you. Stay tuned, I'll write something soon.

KaiHa commented 3 months ago

Sorry, I meant api-version, not -key (I fixed my above comment).

Isn't the complete URL constructed in the llm-chat* functions[1][2][3]? And if so, wouldn't that mean that following your proposal I would end up with a request to

http://my-endpoint/chat/completions?api_version=2023-05-15/chat/completions
                                                          ^^^^^^^^^^^^^^^^^

which isn't what was intended?

[1] https://github.com/ahyatt/llm/blob/144edd99441314a88294e9c3457a42cc8351d564/llm-openai.el#L228 [2] https://github.com/ahyatt/llm/blob/144edd99441314a88294e9c3457a42cc8351d564/llm-openai.el#L248 [3] https://github.com/ahyatt/llm/blob/144edd99441314a88294e9c3457a42cc8351d564/llm-openai.el#L344
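To sketch the duplication described above (a hypothetical simplification; the actual construction lives in the llm-openai.el lines linked in [1][2][3]):

```elisp
;; Hypothetical sketch of the URL construction referenced above: if the
;; provider's :url already carries the path and query string, appending
;; the command path a second time duplicates it.
(let ((url "http://my-endpoint/chat/completions?api_version=2023-05-15"))
  (concat url "/chat/completions"))
;; => "http://my-endpoint/chat/completions?api_version=2023-05-15/chat/completions"
```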

ahyatt commented 3 months ago

I see, yes you are correct. The URL only controls the hostname, not the path or query arguments. Let me think of a way this could be better configured.

ahyatt commented 3 months ago

I've checked in a change that should help; please try it out and let me know what you think. The idea is that you can create your own provider. That was already possible, and it will solve the URL problem. I've also made the headers overridable, which will solve your header problem. I'm sure there are many other things that could change for OpenAI-compatible providers, but for now, configuration would look something like:

(cl-defstruct llm-openai-azure (openai-compatible))

(cl-defmethod llm-openai--headers ((provider llm-openai-azure))
  `(("api-key" . ,(format "%s" (llm-openai-key provider)))))

(cl-defmethod llm-openai--url ((provider llm-openai-azure) command)
  (concat "https://azure-url/" command "?api-version=2023-05-15/"))

Note I haven't tried this out. Also, if this isn't a good enough solution, or something is wrong, please re-open the bug.

r0man commented 3 months ago

Thanks for making headers a generic function. It is also useful for my custom provider.

KaiHa commented 3 months ago

Nice! Works for me with only minor modifications to the example you have given.

- (cl-defstruct llm-openai-azure (openai-compatible))
+ (cl-defstruct (llm-openai-azure (:include llm-openai-compatible)))

 (cl-defmethod llm-openai--url ((provider llm-openai-azure) command)
-   (concat "https://azure-url/" command "?api-version=2023-05-15/"))
+   (concat "https://azure-url/" command "?api-version=2023-05-15"))
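Putting these corrections together, the full configuration would look something like the following. This is an untested sketch: the hostname `my-resource.openai.azure.com`, the api-version value, and the key are placeholders for your own Azure deployment.

```elisp
;; Untested sketch combining the corrections from the diff above.
;; "my-resource", the api-version value, and the key are placeholders.
(require 'llm-openai)

(cl-defstruct (llm-openai-azure (:include llm-openai-compatible)))

(cl-defmethod llm-openai--headers ((provider llm-openai-azure))
  ;; Azure authenticates with an "api-key" header rather than the
  ;; usual "Authorization: Bearer ..." header.
  `(("api-key" . ,(format "%s" (llm-openai-key provider)))))

(cl-defmethod llm-openai--url ((provider llm-openai-azure) command)
  (concat "https://my-resource.openai.azure.com/" command
          "?api-version=2023-05-15"))

;; Then point ellama (or any other llm client) at the new provider;
;; the :key slot is inherited from the parent struct via :include.
(setq ellama-provider (make-llm-openai-azure :key "my-azure-api-key"))
```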

ahyatt commented 3 months ago

Cool, I'll release this in a new version soon!