cyberchitta / openai_ex

Community maintained Elixir library for OpenAI API
https://hexdocs.pm/openai_ex
Apache License 2.0

Allow interaction with other LLMs #67

Closed: restlessronin closed this issue 1 year ago

restlessronin commented 1 year ago

Describe the feature or improvement you're requesting

Basic use case: I want to use Livebook to interact with a local LLM. The easiest way to do this is to run llama-cpp-python as a proxy server that translates the OpenAI API to other LLM back ends. To begin with, I am only interested in the chat_completion endpoint.
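For context, llama-cpp-python ships an OpenAI-compatible HTTP server that can act as the proxy described above. A minimal setup might look like the following (the model path is a placeholder; adjust it to a GGUF model you actually have):

```shell
# Install llama-cpp-python with its OpenAI-compatible server extra
pip install 'llama-cpp-python[server]'

# Serve a local GGUF model (placeholder path) on http://localhost:8000,
# which exposes OpenAI-style endpoints under /v1
python3 -m llama_cpp.server --model ./models/llama-2-7b-chat.gguf
```

Once the server is up, any OpenAI client pointed at `http://localhost:8000/v1` should be able to talk to the local model.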

With this proxy, it appears that we only need to change the base URL of the API, and we should then be able to work with any LLM that llama.cpp supports.

restlessronin commented 1 year ago

I have tested the new parameterized URL, and it does seem to work as expected, at least for non-streaming completions and chat completions.
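Using the parameterized URL mentioned above, pointing openai_ex at a local proxy might look roughly like this. This is a sketch, not a confirmed snippet from the issue: the exact function names (`with_base_url/2`, `Chat.Completions.new/1`, `Chat.Completions.create/2`), the model name, and the return shape may differ across openai_ex versions, so check the library's docs for your version:

```elixir
# Create a client; most local proxies ignore the API key, but one is still required
openai =
  OpenaiEx.new("sk-local-unused")
  # Point the client at the llama-cpp-python proxy instead of api.openai.com
  |> OpenaiEx.with_base_url("http://localhost:8000/v1")

# Build a chat completion request; the model name is whatever the proxy serves
chat_req =
  OpenaiEx.Chat.Completions.new(
    model: "llama-2-7b-chat",
    messages: [OpenaiEx.ChatMessage.user("Hello from Livebook!")]
  )

# Issue a non-streaming chat completion against the local model;
# the response shape mirrors the OpenAI chat completion payload
response = OpenaiEx.Chat.Completions.create(openai, chat_req)
```

The appeal of this approach is that nothing else in the client code changes: the proxy speaks the OpenAI wire format, so only the base URL differs from a hosted-OpenAI setup.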