emacs-openai / chatgpt

Use ChatGPT inside Emacs
GNU General Public License v3.0

400 - bad request #20

Closed r2evans closed 1 year ago

r2evans commented 1 year ago

I've set up my OpenAI API key, but responses fail with

400 - Bad request.  Please check error message and your parameters

This is all I get even after setting (setq openai--show-log t) per https://github.com/emacs-openai/openai#-debugging.

I've tried with a "normal" openai-user user ID as well as after (setq openai-user nil); no change.

chatgpt-info gives something like

session: *ChatGPT: <0>*
history size: 0

prompt_tokens: 0 | completion_tokens: 0 | total_tokens: 0

model: gpt-3.5-turbo
max_tokens: 2000
temperature: 1.0
top-p: 1.0
user: nil

(I had to transcribe that by hand ... I don't know how to capture that minibuffer text directly; it's not in *Messages* and I can't find another buffer containing it.)

emacs-29.1, ubuntu-23.04, chatgpt-20230623.658

jcs090218 commented 1 year ago

Normally, the error message will be shown with prefix [ERROR]. See https://github.com/emacs-openai/openai/blob/7413b73993deb8c60730057f0219d4aec681c119/openai.el#L154.

The error message should be in JSON format, and it should be in the *Messages* buffer. 🤔
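For anyone else hunting for it, a throwaway helper along these lines can jump straight to the most recent [ERROR] entry. (This is a convenience sketch, not part of the package; the function name is made up.)

```elisp
(defun my/openai-goto-last-error ()
  "Jump to the most recent [ERROR] entry in the *Messages* buffer."
  (interactive)
  (pop-to-buffer "*Messages*")
  ;; Search backwards from the end for the literal [ERROR] prefix
  ;; that openai.el uses when logging failed requests.
  (goto-char (point-max))
  (search-backward "[ERROR]" nil t))
```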

r2evans commented 1 year ago

okay, facepalm, thank you.

Here's that output:

[ENCODED]: {"model":"gpt-3.5-turbo","messages":[{"role":null,"content":"say this is a test!"},{"role":"user","content":"say this is a test!"}],"temperature":1.0,"top_p":1.0,"max_tokens":2000,"user":"user"}
[error] request--callback: peculiar error: 400
[ERROR]: #s(request-response 400 nil ((error (message . None is not of type 'string' - 'messages.0.role') (type . invalid_request_error) (param) (code))) (error http 400) error https://api.openai.com/v1/chat/completions nil (:error #<subr F616e6f6e796d6f75732d6c616d626461_anonymous_lambda_11> :type POST :params nil :headers ((Content-Type . application/json) (Authorization . Bearer sk-[REDACTED])) :data {"model":"gpt-3.5-turbo","messages":[{"role":null,"content":"say this is a test!"},{"role":"user","content":"say this is a test!"}],"temperature":1.0,"top_p":1.0,"max_tokens":2000,"user":"user"} :parser json-read :complete #[128 \301\302"A@\300!\207 [#[257 \300\205\0\305\300A!\211\205\0\306!\211\205/\0rq\210\307\310\311!\210?\205.\0\312!\210\313 \210\314!*\207 [(0 . *ChatGPT: <0>*) inhibit-read-only chatgpt-requesting-p chatgpt-spinner openai-error get-buffer buffer-live-p t nil spinner-stop chatgpt--add-response-messages chatgpt--display-messages chatgpt--add-tokens] 5 

(fn DATA)] plist-member :data] 4 

(fn &key DATA &allow-other-keys)] :url https://api.openai.com/v1/chat/completions :response #0 :encoding utf-8) #<killed buffer> HTTP/2 400 
date: Fri, 01 Sep 2023 17:41:32 GMT
content-type: application/json
content-length: 161
access-control-allow-origin: *
openai-organization: user-femlzh42qrdevajkebsfa6ly
openai-processing-ms: 5
openai-version: 2020-10-01
strict-transport-security: max-age=15724800; includeSubDomains
x-ratelimit-limit-requests: 3500
x-ratelimit-limit-tokens: 90000
x-ratelimit-remaining-requests: 3499
x-ratelimit-remaining-tokens: 87988
x-ratelimit-reset-requests: 17ms
x-ratelimit-reset-tokens: 1.341s
x-request-id: f5a8b4349bfc74dcecebfa4a499c3e33
cf-cache-status: DYNAMIC
server: cloudflare
cf-ray: 7fff4dffdaa81756-IAD
alt-svc: h3=":443"; ma=86400
 nil curl)
400 - Bad request.  Please check error message and your parameters

r2evans commented 1 year ago

(... and now that API key has been revoked. Sigh. I'll clean the output better next time. I'm really glad they scan GitHub comments for that, though ... that's kinda creepy ... it is AI, after all.) ;-)

r2evans commented 1 year ago

Might I suggest that, for inattentive/distracted dolts like me, you add two notes to the debugging section, to the effect of:

  • check *Messages* for lines that start with [ERROR]; and
  • verify, then scrub, your API key from that output before sharing; it is shown in plaintext

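On the second point, a tiny helper like the following could scrub keys from a buffer before pasting logs publicly. (This is hypothetical: the function name is mine, and the "sk-" prefix pattern is an assumption about current OpenAI key formats.)

```elisp
(defun my/redact-openai-keys ()
  "Replace anything that looks like an OpenAI API key with sk-[REDACTED]."
  (interactive)
  (save-excursion
    (goto-char (point-min))
    ;; Assumes keys look like "sk-" followed by a long alphanumeric run.
    (while (re-search-forward "sk-[A-Za-z0-9]\\{20,\\}" nil t)
      (replace-match "sk-[REDACTED]"))))
```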
jcs090218 commented 1 year ago

[ERROR]: #s(request-response 400 nil ((error (message . None is not of type 'string' - 'messages.0.role')

This part is what you want to look for! ;)

(message . None is not of type 'string' - 'messages.0.role')

Might I suggest that, for inattentive/distracted dolts like me, you add two notes to the debugging section, to the effect of:

Good idea! Would you like to open a PR for this? Thank you! 🚀

r2evans commented 1 year ago

I saw the "None" part, but what should role be? I don't see mention of it in the docs.

jcs090218 commented 1 year ago

I saw the "None" part, but what should role be? I don't see mention of it in the docs.

It should simply be something like:

[ENCODED]: {"model":"gpt-3.5-turbo","messages":[{"role":"user","content":"Hello!"}],"temperature":1.0,"top_p":1.0,"max_tokens":2000,"user":"user"}

I don't know why you sent two messages in the array (it should be just one). My suggestion is to check your configuration; it seems like something strange is going on. 🤔
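As a sketch of the shape the encoder should produce, using Emacs' built-in json.el (the alist below is illustrative, not the package's internal representation):

```elisp
(require 'json)

;; A well-formed chat payload: the messages vector holds one entry,
;; and its "role" is a string ("system", "user", or "assistant").
(json-encode
 `((model . "gpt-3.5-turbo")
   (messages . [((role . "user")
                 (content . "say this is a test!"))])))
```

With json.el's defaults, a nil role would serialize as JSON null, which is exactly the `None is not of type 'string' - 'messages.0.role'` complaint in the log above.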

r2evans commented 1 year ago

I did not explicitly or intentionally send two messages, though that window did have consecutive attempts (with the same test question), so that might be it; I really don't know. I haven't mucked around with it enough to know how to do something like that deliberately.

It now works, and I'm not sure how. I tried again, got a 401 as expected (the old key was revoked), put in my new API key (sigh), tried the same thing again, and it worked.

Thanks for the conversation!

jcs090218 commented 1 year ago

Awesome! I'm glad you resolved it! 🚀 😄