I would like to request support for the gpt-3.5-turbo-16k model.

There is an outstanding PR that could do this and is the ideal place for it (covering not just the 16k-context GPT-3.5 but also the 32k-context GPT-4). While that PR is pending, you can also just run (setq gptel-model "gpt-3.5-turbo-16k") and it should work.

The PR has been merged.

dszakallas closed this issue 1 year ago.
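The workaround mentioned in the thread can be made permanent by setting the variable in an init file. A minimal sketch in Emacs Lisp; the model names are the ones discussed above, and whether each is usable depends on your OpenAI account:

```elisp
;; In init.el (or equivalent), after gptel is loaded:
;; select the 16k-context GPT-3.5 model for subsequent gptel requests.
;; "gpt-4-32k" is the 32k-context GPT-4 variant also mentioned in the
;; thread, if your account has access to it.
(setq gptel-model "gpt-3.5-turbo-16k")   ; or "gpt-4-32k"
```

Because `gptel-model` is an ordinary variable, it can also be set buffer-locally (e.g. via `setq-local`) to use the larger-context model only in a specific chat buffer.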