bifrostlab / llm-assistant

Multifunctional LLM Assistant for Discord
8 stars 6 forks

Updated the docs and code on how to use LiteLLM to create a unified interface for both ChatGPT and Ollama #10

Closed hungdtrn closed 8 months ago

hungdtrn commented 8 months ago

Initially, I considered using LiteLLM to replace the OpenAI SDK for interacting with both ChatGPT and Ollama. However, as suggested by @phattantran1997 , we can improve on this approach.

Let's create a proxy server that wraps both ChatGPT and Ollama. That way, we can keep using the OpenAI SDK and simply point it at this proxy server instead of the base OpenAI server. This approach is better because it lets us integrate Ollama with existing OpenAI applications without requiring any code changes.
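A minimal sketch of why no code changes are needed, assuming a LiteLLM proxy listening on `localhost:4000` and a model named `ollama/llama2` (both illustrative values, not taken from this repo): the request body sent to the proxy has exactly the shape the OpenAI API expects, so the only client-side change is repointing the SDK's `base_url`.

```python
# Hedged sketch: an OpenAI-style chat request aimed at a local LiteLLM
# proxy instead of api.openai.com. The proxy URL, port, and model name
# are illustrative assumptions, not values taken from this repository.
PROXY_BASE_URL = "http://localhost:4000"  # assumed LiteLLM proxy address

def build_chat_request(model: str, user_message: str) -> dict:
    """Build the JSON body for POST <proxy>/v1/chat/completions.

    The payload is identical to what the OpenAI API expects, which is
    why the OpenAI SDK can be reused against the proxy unchanged.
    """
    return {
        "model": model,  # e.g. "ollama/llama2", routed by the proxy to Ollama
        "messages": [{"role": "user", "content": user_message}],
    }

# With the official SDK, the only change would be (not executed here):
#   client = openai.OpenAI(base_url=PROXY_BASE_URL, api_key="dummy")
payload = build_chat_request("ollama/llama2", "Hello!")
print(payload["model"])
```

Everything else in an existing OpenAI-based application (message format, response parsing) stays the same.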

As for Docker, we need to set up the proxy server inside the Ollama container and interact with this proxy server instead. @samhwang @nqngo
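One way this could be wired up inside the container, sketched under the assumption that Ollama serves on its default port 11434 and the proxy on 4000 (both assumptions, as is the model name): a LiteLLM `config.yaml` mapping an OpenAI-style model name to the local Ollama backend, started with `litellm --config config.yaml --port 4000`.

```yaml
# Hypothetical config.yaml for a LiteLLM proxy running alongside Ollama
# in the same container. Ports and model names are assumptions.
model_list:
  - model_name: gpt-3.5-turbo          # name clients request via the OpenAI SDK
    litellm_params:
      model: ollama/llama2             # actual backend: a local Ollama model
      api_base: http://localhost:11434 # Ollama's default listen address
```

Clients would then point the OpenAI SDK's `base_url` at the proxy's port rather than at Ollama or api.openai.com directly.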

samhwang commented 8 months ago

please keep the pre-commit related changes out of this PR. It isn't related to the LiteLLM changes.

hungdtrn commented 8 months ago

> please keep the pre-commit related changes out of this PR. It isn't related to the LiteLLM changes.

Thanks for the reminder, I have reverted to the previous commit.

samhwang commented 8 months ago

One final nitpick from me and it's good to go. @nqngo wanna give a gloss over?

hungdtrn commented 8 months ago

@phattantran1997 Please have a look at [llm_assistant/ollama/README.md] to better understand how the LiteLLM proxy server works.