Closed alaisgood closed 6 months ago
What model were you using on the chat page and knowledge base chat respectively? I will try and see if I can reproduce.
Not sure if it's the same issue or not.
(base) xxx@p163:~/OneDrive/Productive/LLM/chat-ollama$ sudo docker compose up
WARN[0000] /home/coty/OneDrive/Productive/LLM/chat-ollama/docker-compose.yaml: version is obsolete
[+] Running 1/1
✔ chatollama Pulled 2.9s
[+] Running 2/0
✔ Container chat-ollama-chromadb-1 Running 0.0s
✔ Container chat-ollama-chatollama-1 Running 0.0s
Attaching to chatollama-1, chromadb-1
chatollama-1 | Ollama: {
chatollama-1 | host: 'http://host.docker.internal:11434',
chatollama-1 | username: null,
chatollama-1 | password: null
chatollama-1 | }
chatollama-1 | Authorization: dW5kZWZpbmVkOnVuZGVmaW5lZA==
chatollama-1 | [nuxt] [request error] [unhandled] [500] fetch failed
chatollama-1 | at Object.fetch (node:internal/deps/undici/undici:11576:11)
chatollama-1 | at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
chatollama-1 | Ollama: {
chatollama-1 | host: 'http://host.docker.internal:11434',
chatollama-1 | username: null,
chatollama-1 | password: null
chatollama-1 | }
chatollama-1 | Authorization: dW5kZWZpbmVkOnVuZGVmaW5lZA==
chatollama-1 | [nuxt] [request error] [unhandled] [500] fetch failed
chatollama-1 | at Object.fetch (node:internal/deps/undici/undici:11576:11)
chatollama-1 | at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
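(Side note: the `Authorization` value in the log above is simply the base64 encoding of `undefined:undefined`, i.e. no Ollama username/password was configured, which can be checked with:)

```shell
# Decode the Authorization header value from the ChatOllama log; it is base64
# for "undefined:undefined", meaning username/password were never set.
echo 'dW5kZWZpbmVkOnVuZGVmaW5lZA==' | base64 -d
# → undefined:undefined
```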
![Uploading 截图 2024-03-22 07-19-06.png…]()
@lesong36 Could you re-upload the screenshot? The image doesn't seem to display; I'm not sure whether the problem is on my end.
Thanks for your response. I selected "llama2:13b" and "gemma:7b" as the LLM, but got the same failure with both.
Did Ollama server restart help?
I can't find the image any more; it just showed that Ollama could be run successfully in the terminal. I've tried "llama2:13b", "gemma:7b", and qianwen14; restarting Ollama didn't help either.
Are you using Redis?
I think I ran into the same problem once yesterday; after restarting the ChatOllama dev server I could no longer reproduce it. I'll keep an eye on this. You could try pulling the latest code, restarting ChatOllama, and seeing if that helps.
It still doesn't work.
It looks like a performance issue: when I trim the uploaded document to around 200+ words (only 1 PageContent), I do receive a reply from the LLM assistant.
Yes, Redis is in use. Re-pulling and restarting ChatOllama didn't help either.
@sugarforever
(base) coty@p16:~/OneDrive/Productive/LLM/chat-ollama$ sudo docker compose up
WARN[0000] /home/coty/OneDrive/Productive/LLM/chat-ollama/docker-compose.yaml: version is obsolete
[+] Running 1/1
✔ chatollama Pulled 4.8s
[+] Running 2/0
✔ Container chat-ollama-chromadb-1 Created 0.0s
✔ Container chat-ollama-chatollama-1 Created 0.0s
Attaching to chatollama-1, chromadb-1
chromadb-1 | Starting 'uvicorn chromadb.app:app' with args: --workers 1 --host 0.0.0.0 --port 8000 --proxy-headers --log-config chromadb/log_config.yml --timeout-keep-alive 30
chatollama-1 | Listening on http://[::]:3000
chromadb-1 | DEBUG: [22-03-2024 11:28:22] Registering provider: token_config
chromadb-1 | DEBUG: [22-03-2024 11:28:22] Registering provider: user_token_config
chromadb-1 | DEBUG: [22-03-2024 11:28:22] Registering provider: token
chromadb-1 | DEBUG: [22-03-2024 11:28:22] Registering provider: token
chromadb-1 | WARNING: [22-03-2024 11:28:22] chroma_server_nofile is set to 65535, but this is less than current soft limit of 1048576. chroma_server_nofile will not be set.
chromadb-1 | INFO: [22-03-2024 11:28:22] Anonymized telemetry enabled. See https://docs.trychroma.com/telemetry for more information.
chromadb-1 | DEBUG: [22-03-2024 11:28:22] Starting component System
chromadb-1 | DEBUG: [22-03-2024 11:28:22] Starting component OpenTelemetryClient
chromadb-1 | DEBUG: [22-03-2024 11:28:22] Starting component SimpleAssignmentPolicy
chromadb-1 | DEBUG: [22-03-2024 11:28:22] Starting component SqliteDB
chromadb-1 | DEBUG: [22-03-2024 11:28:22] Starting component QuotaEnforcer
chromadb-1 | DEBUG: [22-03-2024 11:28:22] Starting component Posthog
chromadb-1 | DEBUG: [22-03-2024 11:28:22] Starting component LocalSegmentManager
chromadb-1 | DEBUG: [22-03-2024 11:28:22] Starting component SegmentAPI
chromadb-1 | INFO: [22-03-2024 11:28:22] Started server process [1]
chromadb-1 | INFO: [22-03-2024 11:28:22] Waiting for application startup.
chromadb-1 | INFO: [22-03-2024 11:28:22] Application startup complete.
chromadb-1 | INFO: [22-03-2024 11:28:22] Uvicorn running on http://0.0.0.0:8000 (Press CTRL+C to quit)
chatollama-1 | Ollama: { host: 'http://127.0.0.1:11434', username: null, password: null }
chatollama-1 | Ollama: {
chatollama-1 | host: 'http://host.docker.internal:11434',
chatollama-1 | username: null,
chatollama-1 | password: null
chatollama-1 | }
chatollama-1 | Authorization: dW5kZWZpbmVkOnVuZGVmaW5lZA==
chatollama-1 | [nuxt] [request error] [unhandled] [500] fetch failed
chatollama-1 | at Object.fetch (node:internal/deps/undici/undici:11576:11)
chatollama-1 | at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
chatollama-1 | Ollama: {
chatollama-1 | host: 'http://host.docker.internal:11434',
chatollama-1 | username: null,
chatollama-1 | password: null
chatollama-1 | }
chatollama-1 | Authorization: dW5kZWZpbmVkOnVuZGVmaW5lZA==
chatollama-1 | [nuxt] [request error] [unhandled] [500] fetch failed
chatollama-1 | at Object.fetch (node:internal/deps/undici/undici:11576:11)
chatollama-1 | at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
chatollama-1 | Ollama: {
chatollama-1 | host: 'http://host.docker.internal:11434',
chatollama-1 | username: null,
chatollama-1 | password: null
chatollama-1 | }
Please try this inside the ChatOllama container, to check whether the network connection works:
curl http://host.docker.internal:11434/api/chat -d '{
"model": "llama2:13b",
"messages": [
{ "role": "user", "content": "why is the sky blue?" }
]
}'
@sugarforever curl doesn't get through from inside Docker, but hitting localhost directly works. I've seen people say this is a Linux issue; I'm looking at https://stackoverflow.com/questions/48546124/what-is-the-linux-equivalent-of-host-docker-internal
(base) coty@p16:~/OneDrive/Productive/LLM/chat-ollama$ curl http://host.docker.internal:11434/api/chat -d '{ "model": "llama2:13b", "messages": [ { "role": "user", "content": "why is the sky blue?" } ] }'
(base) coty@p16:~/OneDrive/Productive/LLM/chat-ollama$ curl http://localhost:11434/api/chat -d '{ "model": "llama2:13b", "messages": [ { "role": "user", "content": "why is the sky blue?" } ] }'
{"model":"llama2:13b","created_at":"2024-03-22T12:55:07.680349321Z","message":{"role":"assistant","content":"\n"},"done":false}
{"model":"llama2:13b","created_at":"2024-03-22T12:55:07.697893942Z","message":{"role":"assistant","content":"The"},"done":false}
{"model":"llama2:13b","created_at":"2024-03-22T12:55:07.715469577Z","message":{"role":"assistant","content":" sky"},"done":false}
It should be working now; you need to use http://172.17.0.1:11434/
So the root cause was simply that the Ollama service was unreachable.
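(On Linux, `host.docker.internal` is not defined by default, which is why the in-container curl above failed. Besides pointing ChatOllama at the docker0 bridge gateway (typically 172.17.0.1), a possible alternative is mapping the name to the host gateway in the compose file; a sketch, with the service name assumed from the logs above:)

```yaml
# docker-compose.yaml (fragment) — hypothetical Linux workaround: map
# host.docker.internal to the host's gateway so the container can reach an
# Ollama server listening on the host (requires Docker 20.10+).
services:
  chatollama:
    extra_hosts:
      - "host.docker.internal:host-gateway"
```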
Yeah, it must have been failing to connect to Ollama.
chatollama Pulled 7.8s
[+] Running 2/0
✔ Container chat-ollama-chatollama-1 Created 0.0s
✔ Container chat-ollama-chromadb-1 Created 0.0s
Attaching to chatollama-1, chromadb-1
chromadb-1 | Starting 'uvicorn chromadb.app:app' with args: --workers 1 --host 0.0.0.0 --port 8000 --proxy-headers --log-config chromadb/log_config.yml --timeout-keep-alive 30
chatollama-1 | Listening on http://[::]:3000
chromadb-1 | DEBUG: [22-03-2024 13:21:12] Registering provider: token_config
chromadb-1 | DEBUG: [22-03-2024 13:21:12] Registering provider: user_token_config
chromadb-1 | DEBUG: [22-03-2024 13:21:12] Registering provider: token
chromadb-1 | DEBUG: [22-03-2024 13:21:12] Registering provider: token
chromadb-1 | WARNING: [22-03-2024 13:21:12] chroma_server_nofile is set to 65535, but this is less than current soft limit of 1048576. chroma_server_nofile will not be set.
chromadb-1 | INFO: [22-03-2024 13:21:12] Anonymized telemetry enabled. See https://docs.trychroma.com/telemetry for more information.
chromadb-1 | DEBUG: [22-03-2024 13:21:12] Starting component System
chromadb-1 | DEBUG: [22-03-2024 13:21:12] Starting component OpenTelemetryClient
chromadb-1 | DEBUG: [22-03-2024 13:21:12] Starting component SimpleAssignmentPolicy
chromadb-1 | DEBUG: [22-03-2024 13:21:12] Starting component SqliteDB
chromadb-1 | DEBUG: [22-03-2024 13:21:12] Starting component QuotaEnforcer
chromadb-1 | DEBUG: [22-03-2024 13:21:12] Starting component Posthog
chromadb-1 | DEBUG: [22-03-2024 13:21:12] Starting component LocalSegmentManager
chromadb-1 | DEBUG: [22-03-2024 13:21:12] Starting component SegmentAPI
chromadb-1 | INFO: [22-03-2024 13:21:12] Started server process [1]
chromadb-1 | INFO: [22-03-2024 13:21:12] Waiting for application startup.
chromadb-1 | INFO: [22-03-2024 13:21:12] Application startup complete.
chromadb-1 | INFO: [22-03-2024 13:21:12] Uvicorn running on http://0.0.0.0:8000 (Press CTRL+C to quit)
chatollama-1 | Ollama: { host: 'http://172.17.0.1:11434', username: null, password: null }
chatollama-1 | Authorization: dW5kZWZpbmVkOnVuZGVmaW5lZA==
chatollama-1 | Ollama: { host: 'http://172.17.0.1:11434', username: null, password: null }
chatollama-1 | Authorization: dW5kZWZpbmVkOnVuZGVmaW5lZA==
chatollama-1 | Ollama: { host: 'http://127.0.0.1:11434', username: null, password: null }
chatollama-1 | Ollama: { host: 'http://172.17.0.1:11434', username: null, password: null }
chatollama-1 | Chat with Ollama, host: http://172.17.0.1:11434
chatollama-1 | Ollama: { host: 'http://172.17.0.1:11434', username: null, password: null }
chatollama-1 | Chat with Ollama, host: http://172.17.0.1:11434
Thanks for sharing.
Chatting in the knowledge base, the system raises the issue below (current Ollama version: 0.1.29; Node.js: v20.11.1):
ERROR [nuxt] [request error] [unhandled] [500] fetch failed
at node:internal/deps/undici/undici:12345:11
at async createOllamaStream (/C:/Users/uatshap1033/Desktop/Dev/chat-ollama/node_modules/@langchain/community/dist/utils/ollama.js:9:22)
at async createOllamaChatStream (/C:/Users/uatshap1033/Desktop/Dev/chat-ollama/node_modules/@langchain/community/dist/utils/ollama.js:57:5)
Has anyone run into the same issue? How can I fix it? Thanks.
BTW: there are no issues on the Chat page.