zed-industries / zed

Code at the speed of thought – Zed is a high-performance, multiplayer code editor from the creators of Atom and Tree-sitter.
https://zed.dev

Allow adding custom AI assistant providers #15732

Open · Loner1024 opened 3 months ago

Loner1024 commented 3 months ago

Check for existing issues

Describe the feature

Zed currently supports only a limited set of AI assistant providers, but many model providers expose OpenAI-compatible APIs (e.g., DeepSeek), and projects such as simple-one-api offer an OpenAI-compatible layer over an even wider range of models. Users in some regions may also need to reach the OpenAI API through a proxy provider; for example, users in China cannot pay for the OpenAI API directly. It would therefore be very useful to provide a configurable, generic OpenAI-compatible provider: users could set each provider's endpoint and API key themselves and configure multiple custom model providers, which would bring much more powerful AI assistant capabilities to Zed. I'd love to be involved in developing this feature!
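
To make this concrete, here is a purely illustrative sketch of what such a setting could look like. The "custom_openai_providers" key, field names, endpoints, and token limits below are invented for this proposal and are not part of Zed's current settings schema:

"language_models": {
  "custom_openai_providers": [
    {
      // Illustrative: an OpenAI-compatible provider reached directly.
      "name": "deepseek",
      "api_url": "https://api.deepseek.com/v1",
      "api_key_env": "DEEPSEEK_API_KEY",
      "available_models": [
        { "name": "deepseek-chat", "max_tokens": 32000 }
      ]
    },
    {
      // Illustrative: a regional proxy or aggregator exposing an OpenAI-compatible API.
      "name": "my-proxy",
      "api_url": "https://proxy.example.com/v1",
      "api_key_env": "MY_PROXY_API_KEY",
      "available_models": [
        { "name": "gpt-4o", "max_tokens": 128000 }
      ]
    }
  ]
}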

If applicable, add mockups / screenshots to help present your vision of the feature

[Screenshot: CleanShot 2024-08-03 at 19 43 00@2x]

dalpatsingh-jabsainfotech commented 3 months ago

I was trying to add Groq but couldn't; I guess this feature is still not available.

susl commented 2 months ago

Hi! I wanted to test Groq and Cerebras in Zed (both offer an OpenAI-compatible API) and I assumed it would be easy, since OpenAI is already supported. However, as far as I can tell there's no configuration option for the OpenAI API base URL. Could you please add this configuration option?

venables commented 2 months ago

@susl we were able to get it set up for our OpenAI-compatible endpoint at Crosshatch. We put instructions here, which you can tweak to point it at Groq, etc.:

https://docs.crosshatch.app/integrations/using-crosshatch-in-zed

But essentially something like this:

"language_models": {
  "openai": {
    "version": "1",
    "api_url": "<BASE_URL>",
    "available_models": [
      {
        "name": "<MODEL_NAME>",
        "max_tokens": 10000
      }
    ]
  }
},
"assistant": {
  "provider": null,
  "default_model": {
    "provider": "openai",
    "model": "<MODEL-NAME>"
  },
  "version": "2"
}
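
Note that this fragment goes in Zed's settings.json, and because it reuses the built-in OpenAI provider, the API key you give Zed (entered at the assistant panel's OpenAI key prompt, or via the OPENAI_API_KEY environment variable if Zed picks it up from your environment) has to be the key for the compatible endpoint you're pointing at, not an actual OpenAI key.
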
dalpatsingh-jabsainfotech commented 2 months ago

@venables I was able to make Groq work using this method.

susl commented 2 months ago

@venables Thanks! That worked for me. Will try Crosshatch now too :) Switching between multiple OpenAI-compatible models is a mess, but it works for now.

pchalasani commented 1 month ago

> But essentially something like this:

Just adding the "language_models" setting suffices; in fact, having the "assistant" setting disables using any other model. When I remove it, I can switch between many models, including "OpenAI" models pointing to Groq:


  "language_models": {
    "openai": {
      "version": "1",
      "api_url": "https://api.groq.com/openai/v1",
      "low_speed_timeout_in_seconds": 600,
      "available_models": [
        {
          "name": "llama-3.1-8b-instant",
          "max_tokens": 10000
        },        
        {
          "name": "llama-3.1-70b-versatile",
          "max_tokens": 32000
        },
        {
          "name": "gemma2-9b-it",
          "max_tokens": 6000
        }        
      ]
    }
  }
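
The catch, and why switching still feels messy, is that "language_models.openai.api_url" is a single value, so Zed only talks to one OpenAI-compatible backend at a time. One workaround, assuming you run an aggregator such as the simple-one-api project mentioned in the issue description, is to point that single URL at the aggregator and let it fan out to the upstream providers. The host, port, and model names below are placeholders for whatever your local setup exposes:

  "language_models": {
    "openai": {
      "version": "1",
      // Placeholder: a local OpenAI-compatible aggregator/proxy that forwards
      // requests to Groq, DeepSeek, OpenAI, etc. Adjust host and port to your setup.
      "api_url": "http://localhost:8000/v1",
      "available_models": [
        // List whichever model names the aggregator exposes.
        { "name": "llama-3.1-70b-versatile", "max_tokens": 32000 },
        { "name": "deepseek-chat", "max_tokens": 32000 }
      ]
    }
  }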