rubberduck-ai / rubberduck-vscode

Use AI-powered code edits, explanations, code generation, error diagnosis, and chat in Visual Studio Code with the official OpenAI API.
https://marketplace.visualstudio.com/items?itemName=Rubberduck.rubberduck-vscode
MIT License

Ability to select the gpt-3.5-turbo-16k model #102

Closed. anlek closed this issue 9 months ago.

anlek commented 11 months ago

Is this request related to a problem? Please describe.

I've only started playing with this extension, but I already seem to be hitting the maximum context length.

Error: This model's maximum context length is 4097 tokens. However, you requested 5003 tokens (3979 in the messages, 1024 in the completion). Please reduce the length of the messages or completion.

Describe the solution you'd like

I'd love to be able to use the 16k version (or have the extension auto-select the 16k model when the input is large).
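To illustrate the auto-selection idea, here is a rough sketch of what I mean (not Rubberduck's actual code; the function names, token estimate, and threshold are only illustrative):

```typescript
// Rough sketch of auto-selecting the 16k model when the prompt is large.
// Not Rubberduck's implementation; names and numbers are illustrative.

type ChatModel = "gpt-3.5-turbo" | "gpt-3.5-turbo-16k";

// Very rough heuristic: roughly 4 characters per token for English text.
function estimateTokens(text: string): number {
  return Math.ceil(text.length / 4);
}

// Fall back to the 16k variant when prompt + requested completion
// would not fit into the ~4k context window of gpt-3.5-turbo.
function pickModel(prompt: string, maxCompletionTokens: number): ChatModel {
  const smallContextLimit = 4096;
  const needed = estimateTokens(prompt) + maxCompletionTokens;
  return needed > smallContextLimit ? "gpt-3.5-turbo-16k" : "gpt-3.5-turbo";
}

// In my case above, 3979 prompt tokens + 1024 completion tokens = 5003,
// which exceeds 4096, so the 16k model would be selected.
```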

Additional context

I've tried to force the setting "rubberduck.model": "gpt-3.5-turbo-16k", but it errors, saying I'm not allowed to select a model that isn't on the provided list (gpt-3.5-turbo or gpt-4).
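For reference, the override I attempted in settings.json looked like this (this is what the extension rejects):

```json
{
  "rubberduck.model": "gpt-3.5-turbo-16k"
}
```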

anlek commented 11 months ago

This might fix the issue reported on issue #92.

wilrodriguez commented 10 months ago

> This might fix the issue reported on issue #92.

I don't think it's directly related. I believe #92 is caused by a race condition where the Rubberduck UI panel seems to need to be in focus for things to generate correctly. Also, I have only been using gpt-4 and gpt-3.5-turbo in my configuration.

micahnz commented 9 months ago

I would also like to specify the exact model to use. I want to use gpt-3.5-turbo-16k, but because the setting is a dropdown, it doesn't let me type in my own value, even when I try to force it as described above.

lgrammel commented 9 months ago

v1.17 supports gpt-4-32k and gpt-3.5-turbo-16k now.
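After updating, selecting one of the larger models should just be a matter of picking it in the settings UI or setting it directly in settings.json, for example (assuming the same rubberduck.model setting discussed above):

```jsonc
// settings.json (Rubberduck v1.17 or later)
{
  "rubberduck.model": "gpt-3.5-turbo-16k"
  // or: "gpt-4-32k"
}
```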