Open · meokey opened 7 months ago
Are you running chatollama in a container? chatollama inside a container can't reach Ollama running in a separate container; you need to set up a Docker network between the containers.
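For reference, a minimal sketch of that setup with the Docker CLI (the container names ollama and chatollama are assumptions; substitute your own):

# create a user-defined bridge network; containers on it can resolve each other by name
docker network create chatollama-net
# attach both running containers to it
docker network connect chatollama-net ollama
docker network connect chatollama-net chatollama
# inside the chatollama container, Ollama is then reachable at http://ollama:11434

With docker compose, defining both services in the same compose file gives the same result, since compose puts them on a shared network by default.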
Yes, chatollama and ollama run in separate containers. I couldn't connect using the internal Docker 172.x IP address. Today I tried mapping ollama's port to the host, and connecting with the host's IP and port works. Oddly, Ollama Open WebUI can reach the very same 172.x Docker IP address, and I don't know why. It may just be that my Docker setup is unusual (it's Synology's Docker, which may be a modified build)?
If you think this error isn't a common case, feel free to close the issue. I've found the cause, fixed it, and it works now.
I'm hitting the same error message, but I don't know how to handle it. Both my ollama and chat-ollama run in Docker.
Error message:

Chat with Ollama, host: http://host.docker.internal:11434
chatollama-1  | Ollama: { host: 'http://127.0.0.1:11434', username: null, password: null }
chatollama-1  | Ollama: {
chatollama-1  |   host: 'http://host.docker.internal:11434',
chatollama-1  |   username: null,
chatollama-1  |   password: null
chatollama-1  | }
chatollama-1  | Authorization: dW5kZWZpbmVkOnVuZGVmaW5lZA==
chatollama-1  | Created knowledge base HVAC1: 6
chatollama-1  | Creating embeddings for Ollama served model: nomic-embed-text
chatollama-1  | [nuxt] [request error] [unhandled] [500] Chroma getOrCreateCollection error: Error: TypeError: fetch failed
chatollama-1  |     at Chroma.ensureCollection (./.output/server/node_modules/@langchain/community/dist/vectorstores/chroma.js:99:23)
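One thing worth ruling out here: on Linux, host.docker.internal does not exist inside containers by default and has to be mapped explicitly (Docker Desktop on Mac/Windows provides it automatically). A sketch of the check, assuming the container name chatollama-1 from the log above and that curl is available in the image:

# docker run needs: --add-host=host.docker.internal:host-gateway
# the docker-compose.yml equivalent is:
#   extra_hosts:
#     - "host.docker.internal:host-gateway"
# then verify from inside the container:
docker exec -it chatollama-1 curl -s http://host.docker.internal:11434
# a reachable Ollama answers with: Ollama is running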
My workaround was to map the container's port to the host, and then have ChatOllama connect via the host's IP and port. It's less secure, but it works. :) At some point I should still clean up my Docker network settings so the containers connect internally.
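A sketch of that workaround for anyone who wants to reproduce it (the image name follows the stock ollama/ollama image; the host IP is an example):

# publish the container's 11434 on the host
docker run -d --name ollama -p 11434:11434 ollama/ollama
# then set ChatOllama's Ollama host to the host machine's address,
# e.g. http://192.168.1.100:11434

As noted, this exposes Ollama to everything that can reach the host, so a shared container network remains the cleaner fix.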
@githubjason-18 Has the problem been solved?
I have a similar problem. I configured Azure OpenAI in Settings; after a short wait it becomes selectable in Chat. But in the knowledge base, using the deployed embedding model with the same Azure OpenAI App Key keeps returning a 500 internal server error.
Checking with docker logs, it kept saying it couldn't find ollama. After changing the port mapping in docker-compose.yml, it then said ollama's port 11434 was already occupied by ollama. So I installed and started ollama directly inside the chatollama container, but I still can't use the embedding model deployed on Azure OpenAI.
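If that message really means the host port is already taken, only the host half of the mapping needs to change; the container side can stay 11434. A sketch (the alternative host port 11435 is arbitrary):

# find what is already listening on 11434 on the host
sudo lsof -i :11434
# map the container's 11434 to a different host port
docker run -d --name ollama -p 11435:11434 ollama/ollama
# or in docker-compose.yml:  ports: ["11435:11434"]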
[20240415] I ran `ollama pull mxbai-embed-large:latest` to test the knowledge base, but the final write to Chroma failed again with an error:
Authorization: bnVsbDpudWxs
Created knowledge base Aileen: 6
Creating embeddings for Ollama served model: mxbai-embed-large:latest
Creating Chroma vector store
Initializing ParentDocumentRetriever with RedisDocstore
Redis client options: { host: 'redis', port: 6379, username: undefined, password: undefined }
[nuxt] [request error] [unhandled] [500] Chroma getOrCreateCollection error: Error: TypeError: fetch failed
    at Chroma.ensureCollection (./.output/server/node_modules/@langchain/community/dist/vectorstores/chroma.js:99:23)
    at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
    at async Chroma.addVectors (./.output/server/node_modules/@langchain/community/dist/vectorstores/chroma.js:126:28)
    at async ParentDocumentRetriever.addDocuments (./.output/server/node_modules/langchain/dist/retrievers/parentdocument.js:167:13)
    at async ingestDocument (./.output/server/chunks//http.mjs:43:3)
    at async Object.handler (./.output/server/chunks/routes/api/index.post2.mjs:73:5)
    at async Object.handler (./.output/server/chunks/runtime.mjs:2963:19)
    at async Server.toNodeHandle (./.output/server/chunks/runtime.mjs:3229:7)
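Since the trace fails inside Chroma rather than at the embedding step, it may be worth confirming that the chromadb service is actually up in the same compose project. A sketch, assuming the default service name chromadb from ChatOllama's docker-compose.yml:

# chromadb should be listed with state Up
docker compose ps
# look for startup errors
docker compose logs --tail=20 chromadb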
Same error here: [nuxt] [request error] [unhandled] [500] Chroma getOrCreateCollection error: Error: TypeError: fetch failed
chatollama was installed via docker compose; ollama was downloaded and installed from the official site.
Visiting http://chromadb:8000/ fails to load.
Visiting http://localhost:8000/ returns:
{
"detail": "Not Found"
}
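That part is expected: the chromadb hostname only resolves on the compose network, so a browser on the host must use localhost:8000, and {"detail":"Not Found"} on the root path just means the Chroma server is running (its endpoints live under /api/...). The real question is whether the chatollama container can reach it. A sketch, assuming the compose service is named chatollama, curl exists in the image, and the v1 heartbeat path of Chroma's REST API (adjust if your Chroma version differs):

# run from the directory containing docker-compose.yml
docker compose exec chatollama curl -s http://chromadb:8000/api/v1/heartbeat
# a healthy server replies with a nanosecond heartbeat value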
When using Create a New Knowledge Base, the detailed error is as follows:
Ollama: {
host: 'http://host.docker.internal:11434',
username: null,
password: null
}
Authorization: bnVsbDpudWxs
Created knowledge base party: 13
Creating embeddings for Ollama served model: nomic-embed-text
Creating Chroma vector store
Initializing ParentDocumentRetriever with RedisDocstore
Redis client options: { host: 'redis', port: 6379, username: undefined, password: undefined }
[nuxt] [request error] [unhandled] [500] Chroma getOrCreateCollection error: Error: TypeError: fetch failed
at Chroma.ensureCollection (./.output/server/node_modules/@langchain/community/dist/vectorstores/chroma.js:99:23)
at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
at async Chroma.addVectors (./.output/server/node_modules/@langchain/community/dist/vectorstores/chroma.js:126:28)
at async ParentDocumentRetriever.addDocuments (./.output/server/node_modules/langchain/dist/retrievers/parentdocument.js:167:13)
at async ingestDocument (./.output/server/chunks//rag.mjs:106:3)
at async Object.handler (./.output/server/chunks/routes/api/index.post2.mjs:74:5)
at async Object.handler (./.output/server/chunks/runtime.mjs:2963:19)
at async Server.toNodeHandle (./.output/server/chunks/runtime.mjs:3229:7)
I run the Ollama and Chroma containers independently on the same host. Chroma returns
{"detail":"Not Found"}
which indicates it is running normally. Ollama Open WebUI connects to Ollama without issue and shows the 3 models I already have. In chatollama I configured the Ollama and Chroma connections, and also the Anthropic API, but no models are displayed at all, neither Ollama's nor Claude's. Saving an Instruction works. Creating a Knowledge Base fails with fetch failed (presumably it can't call the embedding model).
The docker logs show the following:

Ollama: { host: 'http://172.17.0.11:11434', username: null, password: null }
Authorization: dW5kZWZpbmVkOnVuZGVmaW5lZA==
[nuxt] [request error] [unhandled] [500] fetch failed
    at Object.fetch (node:internal/deps/undici/undici:11576:11)
    at process.processTicksAndRejections (node:internal/process/task_queues:95:5)

With the same connection string, Ollama Open WebUI connects without any problem.
ChatOllama, Ollama, and Chroma are all on the latest versions.
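A sketch for narrowing this down (the container name chatollama is an assumption): 172.17.0.x addresses belong to Docker's default bridge, and a container attached only to a compose-created network cannot reach them directly, which would explain why Open WebUI connects (if it sits on the default bridge) while ChatOllama does not:

# can the chatollama container reach that address at all?
docker exec -it chatollama curl -s --max-time 5 http://172.17.0.11:11434
# which containers sit on the default bridge?
docker network inspect bridge --format '{{range .Containers}}{{.Name}} {{end}}'
# which networks is the chatollama container attached to?
docker inspect chatollama --format '{{range $k, $v := .NetworkSettings.Networks}}{{$k}} {{end}}'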