karthink / gptel

A simple LLM client for Emacs
GNU General Public License v3.0
1.04k stars · 113 forks

New gpt-4-0125-preview model integration #197

Closed: AlbertusMagnus7 closed this issue 5 months ago

AlbertusMagnus7 commented 5 months ago

First of all, thank you for an amazing project. It changed my whole workflow and the resulting environment is pure efficiency magic and also makes Emacs shine brightly.

After updating gptel to the latest version, I cannot select gpt-4-0125-preview (nor gpt-4-turbo-preview, the new alias that points to the latest preview model) as the model to use. It falls back to gpt-3.5-turbo instead when I do. gpt-4-1106-preview, however, does work as expected.

I think this worked before my recent gptel update (my first in a while), but I am not sure; I might simply have overlooked it before.

karthink commented 5 months ago

First of all, thank you for an amazing project. It changed my whole workflow and the resulting environment is pure efficiency magic and also makes Emacs shine brightly.

Thanks!

From running curl https://api.openai.com/v1/models, I see that the following gpt-* models are supported:

"gpt-3.5-turbo-0301"
"gpt-4-0125-preview"
"gpt-4-turbo-preview"
"gpt-4-1106-preview"
"gpt-3.5-turbo"
"gpt-4"
"gpt-4-0613"
"gpt-4-vision-preview"
"gpt-3.5-turbo-0613"
"gpt-3.5-turbo-16k-0613"
"gpt-3.5-turbo-1106"
"gpt-3.5-turbo-instruct"
"gpt-3.5-turbo-instruct-0914"
"gpt-3.5-turbo-16k"

Ignoring the "vision" and the "instruct" models, do you know if all the others are valid models for chat right now? I can add them to the OpenAI API configuration in gptel.
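For reference, the same listing can be reproduced from inside Emacs. A minimal sketch using the built-in url and json libraries; it assumes a valid key in the OPENAI_API_KEY environment variable and Emacs 27+ for the native JSON parser:

```elisp
;; Sketch: fetch the model ids from the OpenAI API.
;; Assumes OPENAI_API_KEY holds a valid key (an assumption, not part of gptel).
(require 'url)
(require 'json)

(let ((url-request-extra-headers
       `(("Authorization" . ,(concat "Bearer " (getenv "OPENAI_API_KEY"))))))
  (with-current-buffer
      (url-retrieve-synchronously "https://api.openai.com/v1/models")
    (goto-char url-http-end-of-headers)   ;skip the HTTP response headers
    (let ((response (json-parse-buffer :object-type 'alist)))
      ;; Each entry under "data" has an "id" field naming the model
      (mapcar (lambda (m) (alist-get 'id m))
              (alist-get 'data response)))))
```

Evaluating this returns the full list of model ids, which can then be filtered down to the gpt-* entries.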


I think this worked before my recent gptel update (my first in a while), but I am not sure; I might simply have overlooked it before.

This is because the models are now part of a gptel-backend for each LLM provider. You can create an OpenAI backend with whatever models you need (so long as the API supports them):

(gptel-make-openai
   "ChatGPT"                ;name of the backend
   :key 'gptel-api-key      ;variable or function that yields the API key
   :stream t                ;stream responses as they arrive
   :models '("gpt-3.5-turbo" "gpt-3.5-turbo-16k"
             "gpt-4" "gpt-4-1106-preview"))

NOTE: You won't need this or any extra configuration once I add the latest models to gptel's default OpenAI backend.
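To use the custom backend by default rather than only making it available in gptel's menu, its return value can be assigned to gptel-backend. A sketch, assuming gptel-model still takes the model name as a string in this version of gptel:

```elisp
;; Sketch: register the backend, make it the default, and pick a model.
(setq gptel-backend
      (gptel-make-openai "ChatGPT"
        :key 'gptel-api-key
        :stream t
        :models '("gpt-3.5-turbo" "gpt-4" "gpt-4-0125-preview"))
      gptel-model "gpt-4-0125-preview") ;must be one of the :models above
```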

AlbertusMagnus7 commented 5 months ago

(gptel-make-openai
   "ChatGPT"
   :key 'gptel-api-key
   :stream t
   :models '("gpt-3.5-turbo" "gpt-3.5-turbo-16k"
             "gpt-4" "gpt-4-1106-preview"))

That's useful, thank you!

Looking up the mentioned models (https://platform.openai.com/docs/models/overview), they all (ignoring vision/instruct) appear to be valid chat models, yes.

However, the following models will be deprecated soon (in June 2024):

karthink commented 5 months ago

Okay, the list of models is the following:

"gpt-3.5-turbo"
"gpt-3.5-turbo-0301"
"gpt-3.5-turbo-0613"
"gpt-3.5-turbo-1106"
"gpt-3.5-turbo-16k"
"gpt-3.5-turbo-16k-0613"
"gpt-4"
"gpt-4-0125-preview"
"gpt-4-0613"
"gpt-4-1106-preview"
"gpt-4-turbo-preview"

I think this is far too many and will confuse users. Based on the link you provided, I've limited the default model selection to:

"gpt-3.5-turbo"
"gpt-3.5-turbo-16k"
"gpt-4"
"gpt-4-turbo-preview"
"gpt-4-32k"
"gpt-4-1106-preview"

This covers all current (non-snapshot) models, including gpt-4-0125-preview via gpt-4-turbo-preview. If you need the other models, you can use a custom gptel-backend (like above), OR add the extra model(s) to the existing backend like so:

;; Append the extra models to the default OpenAI backend's model list:
;; cl-callf sets the place to (append PLACE EXTRA)
(cl-callf append (gptel-backend-models gptel--openai)
  '("gpt-4-0613" "gpt-3.5-turbo-16k-0613"))
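After evaluating the above, the extra models appear at the end of the backend's model list and can be selected. A sketch, again assuming gptel-model takes the model name as a string in this version:

```elisp
;; Inspect the backend's model list; the two extra models are now at the end
(gptel-backend-models gptel--openai)

;; Select one of the newly added models
(setq gptel-model "gpt-4-0613")
```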