ChatGPTNextWeb / ChatGPT-Next-Web

A cross-platform ChatGPT/Gemini UI (Web / PWA / Linux / Win / MacOS). One click to get your own cross-platform ChatGPT/Gemini application.
https://app.nextchat.dev/
MIT License
73.18k stars · 58.1k forks

[Bug] cannot link Ollama local serve #4219

Open lucksufe opened 4 months ago

lucksufe commented 4 months ago

Bug Description

Cannot connect to the local Ollama server. Ollama and NextChat are both on the latest version. I can get a response from Ollama with a Python script, so the server is OK.

Steps to Reproduce

(screenshot attached: 微信图片_20240305174858)

Expected Behavior

(screenshot attached: 微信截图_20240305174824)

Screenshots

No response

Deployment Method

Desktop OS

win10

Desktop Browser

No response

Desktop Browser Version

No response

Smartphone Device

No response

Smartphone OS

No response

Smartphone Browser

No response

Smartphone Browser Version

No response

Additional Logs

No response

lucksufe commented 4 months ago

[GIN] 2024/03/05 - 21:34:14 | 403 | 0s | 192.168.31.22 | OPTIONS "/v1/chat/completions"
[GIN] 2024/03/05 - 21:34:18 | 403 | 0s | 192.168.31.22 | OPTIONS "/v1/chat/completions"
[GIN] 2024/03/05 - 21:36:58 | 403 | 0s | 192.168.31.22 | OPTIONS "/dashboard/billing/usage?start_date=2024-03-01&end_date=2024-03-06"
[GIN] 2024/03/05 - 21:36:58 | 403 | 0s | 192.168.31.22 | OPTIONS "/dashboard/billing/subscription"

The Ollama log shows 403 for NextChat's requests.

Alias4D commented 4 months ago

Error connecting to the local Ollama server.

(screenshots attached)

Any fix, please?

H0llyW00dzZ commented 4 months ago

This is the reason why Ollama is still not stable or fully compatible with this repository, particularly for desktop use. The owner released it without comprehensive testing across various operating systems.

lucksufe commented 4 months ago

Referrer Policy: strict-origin-when-cross-origin

Maybe it is caused by this policy, but I have already created my user variables (OLLAMA_ORIGINS=*://localhost, OLLAMA_HOST=0.0.0.0) following the instructions below.

> Setting environment variables on Windows
>
> On Windows, Ollama inherits your user and system environment variables.
>
> 1. First quit Ollama by clicking on it in the task bar
> 2. Edit system environment variables from the Control Panel
> 3. Edit or create new variable(s) for your user account for OLLAMA_HOST, OLLAMA_MODELS, etc.
> 4. Click OK/Apply to save
> 5. Run ollama from a new terminal window
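The quoted steps can also be done from a terminal. A sketch: `setx` persists user-level variables on Windows (the reporter's platform), while the `export` form below only lasts for the current Linux/macOS shell session; either way, Ollama must be fully quit from the task bar and relaunched from a new terminal before the variables take effect.

```shell
# Windows (PowerShell) equivalent of the Control Panel steps:
#   setx OLLAMA_ORIGINS "*://localhost"
#   setx OLLAMA_HOST "0.0.0.0"
# then quit Ollama from the task bar and relaunch it from a NEW terminal.

# Linux/macOS, for the current shell session only:
export OLLAMA_ORIGINS='*://localhost'
export OLLAMA_HOST=0.0.0.0
```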

H0llyW00dzZ commented 4 months ago

> Referrer Policy: strict-origin-when-cross-origin
>
> Maybe it is caused by this policy, but I have already created my user variables (OLLAMA_ORIGINS=*://localhost, OLLAMA_HOST=0.0.0.0) following the Windows environment-variable instructions.

It still doesn't work now?

H0llyW00dzZ commented 4 months ago

Also, I don't think it is because of strict-origin-when-cross-origin. That policy strikes a balance between security/privacy and functionality.

If Ollama really broke on strict-origin-when-cross-origin, that would be a serious flaw on Ollama's side.

lucksufe commented 4 months ago

> Maybe it is caused by this policy, but I have already created my user variables (OLLAMA_ORIGINS=*://localhost, OLLAMA_HOST=0.0.0.0) following the Windows environment-variable instructions.
>
> It still doesn't work now?

Still 403 Forbidden. I also copied the POST body and headers from ChatGPT-Next-Web into Python, and it works there. The only difference I can see is "Referrer Policy: strict-origin-when-cross-origin" on the ChatGPT-Next-Web POST request.

I have switched to llama.cpp to run a server, and deleted Ollama.

fred-bf commented 4 months ago

@lucksufe According to your Ollama logs, it seems NextChat's requests are being blocked by CORS policies. It looks like the env vars you set haven't taken effect in your Ollama instance.

kaikanertan commented 4 months ago

I have created the env var OLLAMA_ORIGINS=*://localhost but I still get 403; my system is Windows 10.

Jackxwb commented 4 months ago

The Ollama API may have been modified (my Ollama version is 0.1.28). I copied the request from the Chrome browser into a third-party tool in curl format, and Ollama returned 404:

curl 'http://localhost:11434/api/v1/chat/completions' \
  -H 'sec-ch-ua: "Chromium";v="122", "Not(A:Brand";v="24", "Microsoft Edge";v="122"' \
  -H 'DNT: 1' \
  -H 'sec-ch-ua-mobile: ?0' \
  -H 'User-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/122.0.0.0 Safari/537.36 Edg/122.0.0.0' \
  -H 'Content-Type: application/json' \
  -H 'Accept: application/json, text/event-stream' \
  -H 'Referer;' \
  -H 'sec-ch-ua-platform: "Windows"' \
  --data-raw '{"messages":[{"role":"user","content":"你好呀"}],"stream":true,"model":"llava:latest","temperature":0.5,"presence_penalty":0,"frequency_penalty":0,"top_p":1}'

(screenshot attached: PixPin_2024-03-07_23-00-19)

Referring to another webui (ollama-webui-lite), it uses the following APIs for communication:

http://localhost:11434/api/tags
http://localhost:11434/api/version

http://localhost:11434/api/chat
http://localhost:11434/api/generate
curl 'http://localhost:11434/api/chat' \
  -H 'Accept: */*' \
  -H 'Accept-Language: zh-CN,zh;q=0.9,en;q=0.8,en-GB;q=0.7,en-US;q=0.6' \
  -H 'Connection: keep-alive' \
  -H 'Content-Type: text/event-stream' \
  -H 'DNT: 1' \
  -H 'Origin: http://localhost:3001' \
  -H 'Referer: http://localhost:3001/' \
  -H 'Sec-Fetch-Dest: empty' \
  -H 'Sec-Fetch-Mode: cors' \
  -H 'Sec-Fetch-Site: same-site' \
  -H 'User-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/122.0.0.0 Safari/537.36 Edg/122.0.0.0' \
  -H 'sec-ch-ua: "Chromium";v="122", "Not(A:Brand";v="24", "Microsoft Edge";v="122"' \
  -H 'sec-ch-ua-mobile: ?0' \
  -H 'sec-ch-ua-platform: "Windows"' \
  --data-raw '{"model":"llava:latest","messages":[{"role":"user","content":"你好呀"},{"role":"assistant","content":""}],"options":{}}'

fred-bf commented 4 months ago

@Jackxwb Please ensure your Ollama version is greater than v0.1.24 (https://docs.nextchat.dev/models/ollama) and that the endpoint you configured is http://localhost:11434/; it seems you added an additional /api/ path.

Jackxwb commented 4 months ago

> @Jackxwb Please ensure your Ollama version is greater than v0.1.24 (https://docs.nextchat.dev/models/ollama) and that the endpoint you configured is http://localhost:11434/; it seems you added an additional /api/ path.

Thank you for the reminder. After the modification, I copied the requests from the Chrome browser and they now work in third-party debugging tools. But there is still an error in the browser. I applied this configuration to the Windows program, but it still cannot be used. (screenshot attached)

(screenshot attached) In the Windows program, I cannot see the error log.

Edit, 2024-03-08 16:32 (UTC+8): I am using the Edge browser; after adding --disable-web-security to its shortcut, Ollama can be accessed in the browser, but the exe program still reports an error. Additionally, I found that images can be sent in the exe program, but there is no button for sending images on the web side.

Edit, 2024-03-08 21:47 (UTC+8): After adding OLLAMA_ORIGINS=* to the system environment and restarting the Ollama service, I can now access Ollama in both Edge and the exe on my computer. On my Android phone, some browsers work while others still do not. (screenshots attached)

z2n commented 4 months ago

Set OLLAMA_ORIGINS first. If it still doesn't work after setting it, the problem may be in the request headers. When you use a custom endpoint and only set the endpoint address without an API key, the access code is put into the Authorization header on every request (this may itself be a security issue?). You can work around it temporarily by clearing the access code when using Ollama, or wait for this PR: https://github.com/ollama/ollama/pull/2506

aaa930811 commented 4 months ago

Ollama still can't be used. I tried other chatbox projects and they work, so it shouldn't be an Ollama configuration problem.

aaa930811 commented 4 months ago

(screenshots attached)

mintisan commented 4 months ago

I ran into this too. With LobeChat, the same address settings work.

Alias4D commented 3 months ago

Problem solved for NextChat. Just set these:

- OLLAMA_HOST to 0.0.0.0
- OLLAMA_ORIGINS to *
- Set the OpenAI endpoint to http://127.0.0.1:11434
- Set the model name to the same name shown by ollama list
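As a sketch, the recipe above maps to the following commands. The `export` lines are the Linux/macOS session form; on Windows use `setx` instead, and restart Ollama afterwards so the variables take effect.

```shell
# 1) Let Ollama listen on all interfaces and accept any browser origin
export OLLAMA_HOST=0.0.0.0
export OLLAMA_ORIGINS='*'
# (Windows: setx OLLAMA_HOST 0.0.0.0 and setx OLLAMA_ORIGINS *)

# 2) After restarting Ollama, list the installed models; the model name
#    configured in NextChat must match one of these names exactly:
#      ollama list

# 3) In NextChat, set the OpenAI endpoint to http://127.0.0.1:11434
```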

mcthesw commented 3 months ago

OLLAMA_ORIGINS=* works for me.

1101728133 commented 2 months ago

None of your methods worked

daiaji commented 2 months ago

> Set OLLAMA_ORIGINS first. If it still doesn't work after setting it, the problem may be in the request headers. When you use a custom endpoint and only set the endpoint address without an API key, the access code is put into the Authorization header on every request (this may itself be a security issue?). You can work around it temporarily by clearing the access code when using Ollama, or wait for this PR: ollama/ollama#2506

Clearing NextWeb's access code works; the model name is the one output by the ollama list command. It does not work unless NextWeb's access code is cleared. Is this a bug?

playertk commented 2 months ago

I tried monitoring with Postman and compared POST with OPTIONS. The Ollama server only responded to POST and rejected the OPTIONS requests:

[GIN] 2024/05/10-10:16:25 | 200 | 8.5950196s | 127.0.0.1 | POST "/v1/chat/completion"
[GIN] 2024/05/10-10:16:02 | 404 | 0s | 127.0.0.1 | Options "/v1/chat/completion"

Can ChatGPTNextWeb be configured to use POST as the default access mode?
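For context: those OPTIONS requests are CORS preflights that the browser sends automatically before a cross-origin POST carrying a JSON body and an Authorization header, so this is not an access mode the app chooses; curl, Postman, and Python skip the preflight, which is why direct POSTs work. A preflight can be simulated from the command line to check Ollama's CORS configuration (a sketch, assuming Ollama is listening on localhost:11434; the Origin value is illustrative):

```shell
# Send the same kind of preflight the browser would. A 403 means the
# Origin is not covered by OLLAMA_ORIGINS; 200/204 means it is allowed.
status=$(curl -s -o /dev/null -w '%{http_code}' -X OPTIONS \
  'http://localhost:11434/v1/chat/completions' \
  -H 'Origin: http://localhost:3000' \
  -H 'Access-Control-Request-Method: POST' \
  -H 'Access-Control-Request-Headers: content-type,authorization' || true)
echo "preflight status: $status"
```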