Open yangboz opened 1 week ago
RAGFlow already supports Ollama. Configure it in Settings >> Model providers.
> RAGFlow already supports Ollama. Configure it in Settings >> Model providers.

Please give a live example picture — better than hundreds of words.
On macOS, after running Ollama, I even checked it with `curl localhost:11434` and it returns "Ollama is running".
I don't think localhost is the right address, since it isn't reachable from inside the Docker container. Try the real machine IP instead.
I don't think so, because I installed Ollama from the dmg, and `curl localhost:11434` returns "Ollama is running". If localhost doesn't work, how do I integrate it in the RAGFlow model configuration window? Or does only Xinference work?
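On the localhost point: inside the RAGFlow Docker container, `localhost` refers to the container itself, not the Mac. On Docker Desktop for Mac the host can also be reached via the special name `host.docker.internal`; a quick sketch of checking both (assuming Ollama's default port 11434):

```shell
# Run these from inside the RAGFlow container (e.g. via `docker exec`).
# localhost here is the container itself, so this is expected to fail:
curl http://localhost:11434

# host.docker.internal is Docker Desktop's alias for the Mac host,
# so this should print "Ollama is running" if Ollama is reachable:
curl http://host.docker.internal:11434
```

If the second command works, `http://host.docker.internal:11434` is the base URL to enter in RAGFlow's model-provider settings.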
After `export OLLAMA_HOST=0.0.0.0:11434` and `ollama serve`, I still got `Fail to access model(qwen).**ERROR**: [Errno 111] Connection refused`.
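One likely cause, assuming Ollama is running as the macOS app installed from the dmg rather than via `ollama serve` in the same shell: the app does not see an `export` from your terminal. Per Ollama's FAQ, environment variables for the macOS app should be set with `launchctl`; a sketch:

```shell
# Make the Ollama macOS app listen on all interfaces, not just loopback.
# (An `export` in a terminal only affects processes started from that shell,
# not the app launched from Finder.)
launchctl setenv OLLAMA_HOST "0.0.0.0:11434"

# Quit and restart the Ollama app, then verify from the container or
# another machine using your Mac's LAN IP (placeholder below):
curl http://<your-mac-ip>:11434
```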
Describe your problem
I'd like to try the Ollama model integration but don't know how; maybe it needs more documentation. Related to #
Can't wait to use Ollama locally.
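As a sanity check before pasting a base URL into RAGFlow's model settings, a small script like this (a hypothetical helper, not part of RAGFlow) can confirm the Ollama endpoint is reachable from wherever RAGFlow runs. A bare GET on Ollama's root returns the plain-text banner "Ollama is running":

```python
import urllib.request


def check_ollama(base_url: str, timeout: float = 5.0) -> bool:
    """Return True if an Ollama server answers at base_url with its banner."""
    try:
        with urllib.request.urlopen(base_url, timeout=timeout) as resp:
            return resp.read().decode().strip() == "Ollama is running"
    except OSError:  # covers URLError, connection refused, DNS failure, timeout
        return False


# Example: test the URL you plan to give RAGFlow. "host.docker.internal"
# is how a Docker Desktop container reaches the Mac host (an assumption
# here; substitute your machine's IP if needed).
print(check_ollama("http://host.docker.internal:11434"))
```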