jackMort / ChatGPT.nvim

ChatGPT Neovim Plugin: Effortless Natural Language Generation with OpenAI's ChatGPT API

FR: use litellm for easy support of mistral, anthropic, openrouter, ollama, huggingface etc #401

Open thiswillbeyourgithub opened 4 months ago

thiswillbeyourgithub commented 4 months ago

Hi,

I've been using litellm for a while now; it's a Python library that lets you use pretty much any LLM API you could want (Mistral, OpenRouter, LocalAI, Hugging Face, Azure, Anthropic, etc.).

And they support async too!

I think it would be nice to avoid being too reliant on OpenAI relative to other providers.

Is that something that could be done?

TC72 commented 4 months ago

I came here to see if Claude support was planned, as their latest model is reported to be good for coding. Something like this would be great, and it would take the pressure off this project to support more models directly.

TC72 commented 3 months ago

I managed to get it to work just by setting api_host_cmd. This was working with Claude Sonnet, with litellm running in a Docker container.

require("chatgpt").setup(
    {
        api_host_cmd = 'echo http://127.0.0.1:4000'
    }
)

I let it use the plugin's default gpt-3.5-turbo model name; here is the model_list from my litellm proxy_server_config.yaml:

model_list:
  # the plugin asks for gpt-3.5-turbo; litellm routes the request to Claude
  - model_name: gpt-3.5-turbo
    litellm_params:
      model: claude-3-opus-20240229
      api_base: https://api.anthropic.com/v1/messages
      api_key: "USE_YOUR_API_KEY"


This was just a quick-and-dirty test to make sure it could work. Next I'll add some more interesting models to the list, like local Ollama models, and use the config to switch between them (a sketch of what such an entry might look like is below). Once that's working, maybe we could have a way to pass the specific model to use as an option to plugin calls, so we can switch models as we want?
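
For reference, an Ollama entry in the same model_list might look roughly like this (a minimal sketch based on litellm's documented ollama/ model prefix; the model name "local-llama" and the port are assumptions from Ollama's defaults, not a tested config):

model_list:
  - model_name: local-llama             # name the plugin would request
    litellm_params:
      model: ollama/llama2              # litellm's prefix for Ollama-served models
      api_base: http://localhost:11434  # Ollama's default local endpoint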

Gogotchuri commented 3 months ago

I would love Claude support. Are you working on a litellm integration?

TC72 commented 3 months ago

> I would love Claude support. Are you working on a litellm integration?

If you follow what I did it already works with Claude through litellm.

Aman9das commented 3 months ago

ogpt.nvim is a derivative plugin with support for other APIs.

thiswillbeyourgithub commented 3 months ago

For Mistral, here's my config.yaml:

model_list:
  - model_name: gpt-4-0125-preview          # name the plugin requests
    litellm_params:
      model: mistral/mistral-large-latest   # model actually served
      api_key: REDACTED

litellm_settings:
  drop_params: True   # drop OpenAI-specific params that Mistral rejects

Launch the proxy with litellm --config config.yaml --port 5000

Add this to your chatgpt.nvim config: api_host_cmd = "echo http://0.0.0.0:5000",

I seem to be having issues with their Docker image, though.

TC72 commented 3 months ago

@thiswillbeyourgithub Have you tried using docker compose? That's what worked for me: I created a simple docker folder with their docker-compose.yaml and my own proxy_server_config.yaml, and it works great (a rough sketch of such a compose file is below).
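
For anyone following along, a minimal docker-compose.yaml along these lines should work (a sketch, not TC72's actual file; the image tag, mount paths, and port are assumptions based on litellm's docs):

version: "3"
services:
  litellm:
    image: ghcr.io/berriai/litellm:main-latest   # official litellm proxy image
    ports:
      - "4000:4000"                              # matches the api_host_cmd above
    volumes:
      # mount the model_list config into the container
      - ./proxy_server_config.yaml:/app/config.yaml
    command: ["--config", "/app/config.yaml", "--port", "4000"]
    environment:
      - ANTHROPIC_API_KEY=${ANTHROPIC_API_KEY}   # or keep api_key in the config file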

thiswillbeyourgithub commented 3 months ago

I didn't try docker compose; I wanted to test their docker run directly. I'll get to it someday, thanks.