Open Loner1024 opened 3 months ago
I was trying to add Groq but could not; I guess this feature is still not available.
Hi! I wanted to test Groq and Cerebras in Zed (both offer OpenAI Compatible API) and I assumed that it would be easy, since OpenAI is already supported. However, there's no configuration option for an OpenAI connection string as far as I can tell. Could you please add this configuration option?
@susl we were able to get it set up for our OpenAI compatible endpoint at Crosshatch. We put instructions here, which you can tweak to make it point to Groq, etc.:
https://docs.crosshatch.app/integrations/using-crosshatch-in-zed
But essentially something like this:
"language_models": {
"openai": {
"version": "1",
"api_url": "<BASE_URL>",
"available_models": [
{
"name": "<MODEL_NAME>",
"max_tokens": 10000
}
]
}
},
"assistant": {
"provider": null,
"default_model": {
"provider": "openai",
"model": "<MODEL-NAME>"
},
"version": "2"
}
@venables I was able to make Groq work using this method.
@venables Thanks! That worked for me. Will try Crosshatch now too :) Switching between multiple OpenAI API compatible models is a mess, but it works for now.
> But essentially something like this:
just adding the "language_models" setting suffices; in fact, having the "assistant" setting disables using any other model. When I remove that, then I can switch between many models, including "OpenAI" models pointing to groq:
"language_models": {
"openai": {
"version": "1",
"api_url": "https://api.groq.com/openai/v1",
"low_speed_timeout_in_seconds": 600,
"available_models": [
{
"name": "llama-3.1-8b-instant",
"max_tokens": 10000
},
{
"name": "llama-3.1-70b-versatile",
"max_tokens": 32000
},
{
"name": "gemma2-9b-it",
"max_tokens": 6000
}
]
}
}
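The model names in `available_models` have to match what the endpoint actually serves. OpenAI-compatible APIs expose a `/models` listing for exactly this, so the names can be checked before editing settings. A small stdlib-only sketch (`models_url` and `list_models` are illustrative helpers I'm naming here, and `GROQ_API_KEY` is just an env-var name chosen for the example):

```python
# Sketch: query an OpenAI-compatible endpoint's /models listing so the
# names used in Zed's "available_models" can be verified first.
import json
import os
import urllib.request


def models_url(base_url: str) -> str:
    """GET <base_url>/models returns the model listing on OpenAI-compatible APIs."""
    return base_url.rstrip("/") + "/models"


def list_models(base_url: str, api_key: str) -> list[str]:
    """Return the model IDs the endpoint advertises (requires a valid key)."""
    req = urllib.request.Request(
        models_url(base_url),
        headers={"Authorization": f"Bearer {api_key}"},
    )
    with urllib.request.urlopen(req) as resp:
        data = json.load(resp)
    return [m["id"] for m in data["data"]]


# Usage (needs network access and a real key), e.g.:
#   list_models("https://api.groq.com/openai/v1", os.environ["GROQ_API_KEY"])
```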
Describe the feature
Zed currently supports a limited number of AI assistants, but many model providers offer OpenAI-compatible APIs (e.g., DeepSeek), and there are projects, such as simple-one-api, that provide OpenAI-compatible layers for an even wider range of models. Users in some regions may also need to reach the OpenAI API through a proxy provider; for example, users in China cannot pay for the OpenAI API directly.

It would therefore be valuable to provide configurable, generic access to OpenAI-compatible interfaces, which would bring much more powerful AI assistant capabilities to Zed. With this feature, users could customize the endpoint and API key of the OpenAI API, and configure multiple custom model providers. I'd love to be involved in Zed's development of this feature!
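To make the "multiple custom providers" part concrete, here is one way such a setting could look. This is purely a hypothetical sketch, not a real Zed schema: the `openai_compatible` key and `api_key_env` field are invented for illustration, and only the Groq URL is taken from the workaround above.

```json
{
  "language_models": {
    "openai_compatible": [
      {
        "name": "groq",
        "api_url": "https://api.groq.com/openai/v1",
        "api_key_env": "GROQ_API_KEY",
        "available_models": [
          { "name": "llama-3.1-8b-instant", "max_tokens": 10000 }
        ]
      },
      {
        "name": "deepseek",
        "api_url": "https://api.deepseek.com/v1",
        "api_key_env": "DEEPSEEK_API_KEY",
        "available_models": [
          { "name": "deepseek-chat", "max_tokens": 32000 }
        ]
      }
    ]
  }
}
```

A list of named providers like this would remove the current limitation of a single `openai.api_url`, which is what makes switching between endpoints a mess today.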