Closed 0rtz closed 6 months ago
This error means the request parameters are malformed; try printing the request to inspect it:
bs, _ := json.Marshal(req)
fmt.Println(string(bs)) // Println, not Printf: the JSON body may contain % verbs
https://github.com/zhu327/gemini-openai-proxy/blob/main/api/handler.go#L68
{"model":"","messages":null,"max_tokens":0,"temperature":0,"top_p":0,"n":0,"stream":false}2024/01/01 17:04:05 genai get stream message error googleapi: Error 400:
[GIN] 2024/01/01 - 17:04:05 | 200 | 649.504851ms | ::1 | POST "/v1/chat/completions"
The messages field in the request data cannot be empty. Please check whether the client has a bug; this issue is not caused by gemini-openai-proxy.
Trying to use gemini-openai-proxy with gpt-engineer. I invoke gpt-engineer as follows:
OPENAI_API_KEY=1234 OPENAI_API_BASE=http://localhost:8080/v1 gpt-engineer Tic-tac-toe gpt-3.5-turbo
and get the error shown above on the proxy side. Not sure whether the problem is on the proxy side or the gpt-engineer side, though.