karthink / gptel

A simple LLM client for Emacs
GNU General Public License v3.0

Models maybe shouldn't be a preset? #290

Closed: Ypot closed this issue 2 months ago

Ypot commented 2 months ago

Hi

I have wondered for a while: why is it not possible to add any model by hand? At the moment the gptel author has to add every option manually, like "turbo-2024-04-09".

Could it be possible to let the user add any model?

Best regards, I keep using gptel as my favourite LLM interface :-D

karthink commented 2 months ago

This is a trade-off between "works out of the box" behavior and "configure it yourself" behavior. Either I handle the ChatGPT backend setup, or users do.

If you look at the instructions in the README for most other backends, you'll see that it's up to the user to include the models they want.

I'm assuming that most gptel users are using ChatGPT, and I'm trying to avoid making them add extra code to their init files. If they're using something more niche (like Groq or Perplexity), I assume they're comfortable with extra configuration.

"Could it be possible to let the user add any model?"

This is already possible. You can define an OpenAI backend yourself, like you do for the other backends:

(gptel-make-openai
   "ChatGPT"
   :key gptel-api-key
   :stream t
   :models '("gpt-3.5-turbo" "gpt-3.5-turbo-16k" "gpt-4"
             "gpt-4-turbo-preview" "gpt-4-32k" "gpt-4-1106-preview"
             "gpt-4-0125-preview"))
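
If you also want gptel to use this backend by default, you can save the return value of gptel-make-openai and set a default model at the same time. A rough sketch, using a model name from the list above:

;; Rough sketch: gptel-make-openai returns the backend object, so you can
;; save it as the default backend and pick a default model in one setq.
(setq gptel-backend (gptel-make-openai "ChatGPT"
                      :key gptel-api-key
                      :stream t
                      :models '("gpt-4-turbo-preview"))
      gptel-model "gpt-4-turbo-preview")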

Or you can modify the default OpenAI backend that gptel uses:

(setf (gptel-backend-models gptel--openai)
      (cons "gpt-4-turbo" (gptel-backend-models gptel--openai)))
Ypot commented 2 months ago

Perfect, thanks!