khoj-ai / khoj

Your AI second brain. Get answers to your questions, whether they be online or in your own notes. Use online AI models (e.g gpt4) or private, local LLMs (e.g llama3). Self-host locally or use our cloud instance. Access from Obsidian, Emacs, Desktop app, Web or Whatsapp.
https://khoj.dev
GNU Affero General Public License v3.0

[FIX] MacBook M2 Ollama / Khoj error #813

Closed moffefei closed 3 months ago

moffefei commented 3 months ago

Docker khoj-server log (attached):

khoj ollama server error log.md

To Reproduce

Follow the steps at https://docs.khoj.dev/miscellaneous/ollama/

Screenshots

[screenshot]

Platform

Self-hosted

sabaimran commented 3 months ago

Hey @moffefei , can you share a screenshot of your ollama chat config with me?

moffefei commented 3 months ago

> Hey @moffefei , can you share a screenshot of your ollama chat config with me?

Here's the ollama chat config:

[screenshots]
moffefei commented 3 months ago

> Hey @moffefei , can you share a screenshot of your ollama chat config with me?

I've already upgraded Docker and Ollama.

sabaimran commented 3 months ago

No, I mean inside khoj. I'd like to see the open ai processor conversation settings in the admin page.

moffefei commented 3 months ago

> No, I mean inside khoj. I'd like to see the open ai processor conversation settings in the admin page.

Sorry, do you mean this page?

Screenshot 2024-06-12 12 11 06
moffefei commented 3 months ago

I've tried a lot, but it's still not fixed.

sabaimran commented 3 months ago

No, I mean in the admin panel, which should be here: http://localhost:42110/server/admin/database/openaiprocessorconversationconfig/

Screenshot 2024-06-12 at 16 46 47

Make sure the URL you're using is http://localhost:11434/v1/. The /v1 is important.
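To see why the trailing `/v1` segment matters: many OpenAI-compatible clients resolve endpoint paths against the configured base URL, and Ollama's OpenAI-compatible API is served under `/v1/`. A minimal, illustrative sketch of the path resolution (the actual client in Khoj may join paths differently, so treat this as an analogy, not Khoj's implementation):

```python
from urllib.parse import urljoin

# Illustrative only: Ollama's OpenAI-compatible API lives under /v1/,
# so the configured base URL must keep that segment intact.
base_ok = "http://localhost:11434/v1/"   # /v1/ with trailing slash
base_bad = "http://localhost:11434/v1"   # no trailing slash: /v1 gets dropped

print(urljoin(base_ok, "chat/completions"))
# http://localhost:11434/v1/chat/completions
print(urljoin(base_bad, "chat/completions"))
# http://localhost:11434/chat/completions  (misses the /v1 prefix)
```

The same kind of mistake (a base URL without `/v1`) sends requests to paths Ollama does not serve, which shows up as errors like the ones in this thread.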

mominfaruk commented 3 months ago

The same issue is happening on a MacBook M1.

Screenshot 2024-06-12 at 6 07 59 PM
adityanemali commented 3 months ago

Upgrading ollama worked for me.

curl -fsSL https://ollama.com/install.sh | sh

moffefei commented 3 months ago

> No, I mean in the admin panel, which should be here: http://localhost:42110/server/admin/database/openaiprocessorconversationconfig/ Screenshot 2024-06-12 at 16 46 47
>
> Make sure the URL you're using is http://localhost:11434/v1/. The /v1 is important.

Fixed it! Because I had deployed Ollama both in Docker and locally, there was an environment conflict. After replacing 'http://localhost:11434/v1/' with 'http://host.docker.internal:11434/v1/', the issue was resolved.

[screenshot]
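The root cause above is standard Docker networking: inside a container, localhost refers to the container itself, so a containerized Khoj cannot reach an Ollama server listening on the host at localhost:11434. On Docker Desktop (macOS/Windows) host.docker.internal resolves to the host automatically; on Linux it must be mapped explicitly. A minimal, hypothetical compose fragment (the service name and image are illustrative, not copied from Khoj's actual compose file):

```yaml
services:
  server:
    image: ghcr.io/khoj-ai/khoj:latest   # illustrative image reference
    # host.docker.internal works out of the box on Docker Desktop;
    # on Linux, map it to the Docker host gateway explicitly:
    extra_hosts:
      - "host.docker.internal:host-gateway"
```

With a mapping like this in place, the admin-panel base URL from the thread, http://host.docker.internal:11434/v1/, reaches the host's Ollama from inside the container.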