zhu327 / gemini-openai-proxy

A proxy for converting the OpenAI API protocol to the Google Gemini Pro protocol.
MIT License

How do I use this with chatgpt-nextweb? #1

Closed hallfay0 closed 6 months ago

hallfay0 commented 6 months ago

I have finished installing it with Docker. Using addresses such as http://192.168.1.1:8080/v1 together with the Gemini Pro API key, it does not work.

zhu327 commented 6 months ago

I tested it myself in Chatbox and it worked by directly entering http://127.0.0.1:8080 along with the Google AI Studio API key. Please ensure that generativelanguage.googleapis.com is accessible.
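For anyone else debugging this, here is a minimal sketch (an illustration, not part of the project docs) of a request against the proxy's OpenAI-style chat endpoint. The base URL, the model name (gpt-3.5-turbo, which I assume the proxy maps to Gemini Pro), and the key are placeholders; substitute your own deployment address and Google AI Studio key. A connect timeout here points at the proxy host/port being unreachable, while an error returned by Google points at generativelanguage.googleapis.com being blocked.

```go
// Minimal sketch: send one chat completion request through the proxy.
// All addresses, model names, and keys below are placeholders.
package main

import (
	"bytes"
	"fmt"
	"io"
	"net/http"
)

func main() {
	body := []byte(`{
		"model": "gpt-3.5-turbo",
		"messages": [{"role": "user", "content": "Hello"}]
	}`)

	req, err := http.NewRequest("POST", "http://127.0.0.1:8080/v1/chat/completions", bytes.NewReader(body))
	if err != nil {
		panic(err)
	}
	req.Header.Set("Content-Type", "application/json")
	req.Header.Set("Authorization", "Bearer YOUR_GOOGLE_AI_STUDIO_KEY")

	resp, err := http.DefaultClient.Do(req)
	if err != nil {
		// A connect timeout at this point means the proxy itself is unreachable.
		panic(err)
	}
	defer resp.Body.Close()

	out, _ := io.ReadAll(resp.Body)
	fmt.Println(resp.Status)
	fmt.Println(string(out))
}
```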

dhlsam commented 6 months ago

After deploying it, I got it working in Lobe, but when I ask the Gemini API which model it is, it says it is the PaLM-2 model.

zhu327 commented 6 months ago

> After deploying it, I got it working in Lobe, but when I ask the Gemini API which model it is, it says it is the PaLM-2 model.

That's just how Gemini Pro behaves. You can look on V2EX; some people have even gotten it to answer that it is Wenxin Yiyan (ERNIE Bot).

hallfay0 commented 6 months ago

Strange, mine just doesn't work: "endpoint": "http://192.168.3.25:17777", "error": { "cause": { "name": "ConnectTimeoutError", "code": "UND_ERR_CONNECT_TIMEOUT", "message": "Connect Timeout Error" } } }

zhu327 commented 6 months ago

I have tested Chatbox and OpenAI Translator, and also tried the Pal Chat app on iOS; all of them work well. The project itself does not support the legacy OpenAI API, so any application that uses the legacy OpenAI API will not be compatible.
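To illustrate what "legacy" means here (assumed request shapes, not taken from the project docs): the legacy OpenAI endpoint is /v1/completions with a plain prompt string, while the supported endpoint is /v1/chat/completions with a messages array. The GIN route list later in this thread shows that only the chat endpoint and /v1/models are registered.

```go
// Illustration only: the two OpenAI-style request shapes. Per the GIN route
// list further down in this thread, the proxy registers only the chat form.
package main

import "fmt"

// Legacy completions request (NOT supported): would go to POST /v1/completions.
const legacyRequest = `{
  "model": "gpt-3.5-turbo-instruct",
  "prompt": "Hello"
}`

// Chat completions request (supported): goes to POST /v1/chat/completions.
const chatRequest = `{
  "model": "gpt-3.5-turbo",
  "messages": [{"role": "user", "content": "Hello"}]
}`

func main() {
	fmt.Println("legacy:", legacyRequest)
	fmt.Println("chat:", chatRequest)
}
```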

zhu327 commented 6 months ago

> Strange, mine just doesn't work: "endpoint": "http://192.168.3.25:17777", "error": { "cause": { "name": "ConnectTimeoutError", "code": "UND_ERR_CONNECT_TIMEOUT", "message": "Connect Timeout Error" } } }

Perhaps you need a VPN to access generativelanguage.googleapis.com.

hallfay0 commented 6 months ago

> Strange, mine just doesn't work at all: "endpoint": "http://192.168.3.25:17777", "error": { "cause": { "name": "ConnectTimeoutError", "code": "UND_ERR_CONNECT_TIMEOUT", "message": "Connect Timeout Error" } } }
>
> Perhaps you need a VPN to access generativelanguage.googleapis.com.

I am in Los Angeles; there is no firewall here...

zhu327 commented 6 months ago

> Strange, mine just doesn't work at all: "endpoint": "http://192.168.3.25:17777", "error": { "cause": { "name": "ConnectTimeoutError", "code": "UND_ERR_CONNECT_TIMEOUT", "message": "Connect Timeout Error" } } }
>
> Perhaps you need a VPN to access generativelanguage.googleapis.com.
>
> I am in Los Angeles; there is no firewall here...

You can build it locally from source, run it, and check what the incoming request content looks like.

zhu327 commented 6 months ago

Alternatively, you can check the standard output content of the Docker container. I will reopen the issue.

hallfay0 commented 6 months ago

I am running into this because I am just beginning to learn Linux, so I am not sure what the problem is. I will keep investigating, and if I manage to resolve it on my own I will let you know. The Docker log is below.
[GIN-debug] [WARNING] Creating an Engine instance with the Logger and Recovery middleware already attached.

[GIN-debug] [WARNING] Running in "debug" mode. Switch to "release" mode in production.

[GIN-debug] GET /v1/models --> github.com/zhu327/gemini-openai-proxy/api.ModelsHandler (4 handlers)

[GIN-debug] POST /v1/chat/completions --> github.com/zhu327/gemini-openai-proxy/api.ChatProxyHandler (4 handlers)

[GIN-debug] [WARNING] You trusted all proxies, this is NOT safe. We recommend you to set a value. Please check https://pkg.go.dev/github.com/gin-gonic/gin#readme-don-t-trust-all-proxies for details.

[GIN-debug] Listening and serving HTTP on :8080

[GIN] 2023/12/17 - 13:47:15 | 404 | 1.103µs | 192.168.3.239 | GET "/"

zhu327 commented 6 months ago

@hallfay0 I tried adding the route for the homepage and handling CORS errors. You can give it another try.
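This is not the actual commit, but a minimal sketch of what such a change could look like in Gin, assuming a permissive CORS policy and a plain-text homepage; the handler body and message text are placeholders. It addresses the 404 on GET "/" in the log above and the cross-origin requests that browser-based clients such as ChatGPT NextWeb send.

```go
// Sketch only: register a root route and permissive CORS headers in Gin so
// browser-based front ends can reach the proxy without a 404 on "/" or CORS errors.
package main

import (
	"net/http"

	"github.com/gin-gonic/gin"
)

func main() {
	r := gin.Default()

	// Allow cross-origin requests from web front ends.
	r.Use(func(c *gin.Context) {
		c.Writer.Header().Set("Access-Control-Allow-Origin", "*")
		c.Writer.Header().Set("Access-Control-Allow-Headers", "Authorization, Content-Type")
		c.Writer.Header().Set("Access-Control-Allow-Methods", "GET, POST, OPTIONS")
		if c.Request.Method == http.MethodOptions {
			c.AbortWithStatus(http.StatusNoContent)
			return
		}
		c.Next()
	})

	// Simple homepage so GET "/" no longer returns 404.
	r.GET("/", func(c *gin.Context) {
		c.String(http.StatusOK, "gemini-openai-proxy is running")
	})

	r.Run(":8080")
}
```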

zhu327 commented 6 months ago

Confirmed that ChatGPT NextWeb is compatible.