amd1890 opened 1 month ago
docker run -it ...
I am trying to get Perplexica (https://github.com/ItzCrazyKns/Perplexica), a local open-source alternative to Perplexity.ai, running against an Ollama instance that uses ROCm. I have gotten Perplexica working with a regular Ollama container, but it's painfully slow. My system has ROCm installed and supports Vulkan, and my GPU is a Ryzen 780M (integrated). What I'm trying to do now is set up Ollama in Docker so that it uses ROCm.
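For reference, this is the kind of command I've been trying, based on what I could piece together from the Ollama Docker instructions for AMD GPUs. The `HSA_OVERRIDE_GFX_VERSION` value is something I saw suggested for the 780M (gfx1103), so treat it as an assumption, not something I've confirmed:

```shell
# Run the official ROCm build of Ollama, passing the AMD GPU device
# nodes through to the container.
docker run -d \
  --device /dev/kfd \
  --device /dev/dri \
  -v ollama:/root/.ollama \
  -p 11434:11434 \
  -e HSA_OVERRIDE_GFX_VERSION=11.0.0 \
  --name ollama \
  ollama/ollama:rocm
```

Note that `--device` here is an argument to `docker run`, before the image name; I'm not sure whether the repos below are placing it somewhere else.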
When I wrote this issue before, I don't think I understood that this tracker is for validating programs as working correctly with ROCm. Perhaps I should have titled the issue "Please help me."
https://github.com/avnigashi/ollama-rocm-docker is one of the repos that doesn't seem to work for me.
https://github.com/hqnicolas/OllamaDockerCasaOs is another repo I don't entirely understand; I'm not sure why CasaOS is part of the equation at all. This version runs, but the Docker terminal prints `Error: unknown flag: --device`, and I can't tell whether it's actually working correctly.
My experience level is very low, so it's harder for me to understand things than it would be for a more experienced user.
I've been trying to get this working in Docker with Ollama, but it's painfully slow. The container doesn't recognize the GPU, and I don't know how to run the necessary commands inside the Docker image or get the image to see the GPU.
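In case it helps diagnose anything, here are the checks I've been running, based on general Docker and ROCm usage I found online (the container name `ollama` is just what I used above; adjust if yours differs):

```shell
# Confirm the host actually exposes the ROCm device nodes
ls -l /dev/kfd /dev/dri

# Search the container's logs for signs that Ollama detected a GPU
docker logs ollama 2>&1 | grep -i -E 'rocm|gpu|amdgpu'

# Run the ollama CLI inside the running container
docker exec -it ollama ollama list
```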
Does anyone have suggestions?