TheR1D / shell_gpt

A command-line productivity tool powered by AI large language models like GPT-4 that helps you accomplish your tasks faster and more efficiently.

HTTPError "Bad request" when I try to pipe a json file to sgpt #271

Closed: trbedwards closed this issue 10 months ago

trbedwards commented 1 year ago

I wanted to use sgpt to list the OpenAI models available to me (a bit meta, I know!). I queried the OpenAI API directly for the list of models and saved the response to a JSON file. I then attempted to pipe this JSON file to sgpt, asking it to summarise it by printing the list of model IDs. However, when I try this, I get an HTTPError.

Here's how to reproduce the problem:

curl https://api.openai.com/v1/models -H "Authorization: Bearer $OPENAI_API_KEY" > output.json
cat output.json | sgpt "Please print the list of model ids"

I get the following error:

HTTPError: 400 Client Error: Bad Request for url: https://api.openai.com/v1/chat/completions

I tried the sgpt command with and without the -s flag, but I get the same error either way. I've confirmed that sgpt itself is still working normally by running, for example, sgpt "mass of sun", which returns a response as expected.

lalon commented 1 year ago

How many tokens are in output.json? I've run into the same issue with a large JSON file (over 4k tokens).
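
If you're not sure of the count, a rough estimate is enough to tell whether the file blows past the 4k-token window. A minimal sketch, assuming the common heuristic of roughly 4 characters per token for English/JSON text (the tiktoken Python package gives an exact count if needed):

# Rough token estimate: byte count of the file divided by 4 (heuristic, not exact)
echo $(( $(wc -c < output.json) / 4 ))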

00-Python commented 1 year ago

Run cat output.json | sgpt --model gpt-3.5-turbo-16k "Please print the list of model ids" and it should work. That JSON file has approximately 8,940 tokens, which is too much for the gpt-3.5-turbo model, whose context window only takes about 4k tokens.
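
Another option, not suggested in the thread, is to shrink the input so it fits the default model's 4k window instead of switching models. A sketch assuming jq is installed and that the /v1/models response exposes the ids under .data[].id:

# Extract only the model ids before piping, keeping the prompt well under 4k tokens
jq -r '.data[].id' output.json | sgpt "Please print the list of model ids"

With only the id strings in the prompt, the default gpt-3.5-turbo model should handle it without the 400 error.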

TheR1D commented 10 months ago

Closing this issue due to its age and lack of similar reports/requests from other users.