Closed dmikushin closed 5 months ago
Mine is also having this same issue. Did you find a fix for that?
You need to have a paid account in order to use gpt-4,
more details here. You might be able to use the gpt-3.5-turbo
model with a free account:
sgpt --model gpt-3.5-turbo "test"
Hi @TheR1D, no, for me the problem is with a paid account. Please note: gpt-4 works, gpt-4-32k does not work.
Hi @TheR1D, yeah, I also tried that; it's giving me another error (HTTPError: 429 Client Error: Too Many Requests for url: https://api.openai.com/v1/chat/completions).
Same for me, I get the error with DEFAULT_MODEL=gpt-4-1106-preview
I tried gpt-3.5-turbo
but no cigar:
HTTPError: 404 Client Error: Not Found for url: https://api.openai.com/v1/chat/completions
At least shellgpt has beautiful error messages!
@TheR1D, it feels like we all have different permissions to access different models. I do have access to gpt-4, but not to gpt-4-32k. No idea why; perhaps OpenAI has different access levels even for paid accounts. What shell_gpt could do to address this issue is display the full https://api.openai.com/v1/chat/completions response, so everyone can see the actual reason for the error.
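To illustrate the suggestion, here is a minimal sketch (not shell_gpt's actual code) of catching the HTTP error and surfacing the JSON body the API returns, which in the OpenAI error format usually carries an `error.message` field explaining the 404/429. The function names and the assumption about the error shape are mine:

```python
import json
import urllib.request
import urllib.error

API_URL = "https://api.openai.com/v1/chat/completions"

def error_detail(body: bytes) -> str:
    """Extract the human-readable message from an OpenAI-style error body.

    OpenAI errors typically look like:
    {"error": {"message": "...", "type": "...", "code": "..."}}
    """
    try:
        payload = json.loads(body)
        return payload.get("error", {}).get("message", body.decode())
    except (ValueError, UnicodeDecodeError):
        return repr(body)

def chat(api_key: str, model: str, prompt: str) -> str:
    """Send one chat request; on failure, raise with the server's explanation."""
    req = urllib.request.Request(
        API_URL,
        data=json.dumps({
            "model": model,
            "messages": [{"role": "user", "content": prompt}],
        }).encode(),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
    )
    try:
        with urllib.request.urlopen(req) as resp:
            return json.load(resp)["choices"][0]["message"]["content"]
    except urllib.error.HTTPError as exc:
        # Surface the full server-side reason instead of a bare "404 Not Found".
        raise RuntimeError(
            f"{exc.code} {exc.reason}: {error_detail(exc.read())}"
        ) from exc
```

With something like this, a 404 on gpt-4-32k would report e.g. that the model does not exist or the account has no access to it, rather than just the status line.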
Same issue here:
HTTPError: 404 Client Error: Not Found for url: https://api.openai.com/v1/chat/completions
The default gpt-4-1106-preview = error 404, gpt-3.5-turbo = error 429. Where is the setting for the models, and are there any more I can try?
In the latest release these errors come with a description and possible solutions. Run pip install shell-gpt --upgrade
to install the latest version.
Same problem after upgrade
I've solved this for myself by switching away from OpenAI to OpenRouter.
That is, I registered an OpenRouter account and adjusted ~/.config/shell_gpt/.sgptrc
accordingly:
OPENAI_API_HOST=https://api.openrouter.ai
OPENAI_API_KEY=<your_open_router_key_here>
OPENAI_BASE_URL=https://openrouter.ai/api/v1
I also make sure that the "default model" in OpenRouter settings is gpt4-32k.
And now I finally get gpt4-32k
working without an error!
So, the problem discussed here comes down to how OpenAI manages model access for its users: certain models are enabled for some accounts and disabled for others, without any clear logic. OpenRouter is a proxy that works around this.
As mentioned (in other issues), if you have a ChatGPT Plus subscription you also need to buy credit here: https://platform.openai.com/account/billing/overview,
THEN wait a few minutes!
While the default model
gpt-4-1106-preview
works normally, I get the following error when trying to use the gpt-4-32k-0613
model. What could be the issue here?