alexhegit / Playing-with-ROCm

See how to play with ROCm, run it with AMD GPUs!
MIT License

Not Sure How to Apply This to Docker ollama/ollama:rocm #2

Open amd1890 opened 1 month ago

amd1890 commented 1 month ago

I've been trying to get this to work in Docker with Ollama, and it's painfully slow.

It doesn't recognize the GPU, and I don't know how to run the commands inside the Docker image or get the container to recognize the GPU.

Anyone have suggestions?

alexhegit commented 1 month ago
  1. What is the URL of this Docker image?
  2. What is the full command you used to run the Docker image? Something like `docker run -it ...`
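For reference, running the official `ollama/ollama:rocm` image generally requires passing the AMD GPU device nodes through to the container. A minimal sketch, assuming the stock image and default port; the `HSA_OVERRIDE_GFX_VERSION=11.0.2` override is an assumption often suggested for the Ryzen 780M (gfx1103), which is not an officially supported ROCm target:

```shell
# Pull the ROCm build of the Ollama image
docker pull ollama/ollama:rocm

# Pass the ROCm device nodes (/dev/kfd, /dev/dri) into the container.
# Note: --device is a `docker run` flag and must come BEFORE the image name.
docker run -d \
  --device /dev/kfd \
  --device /dev/dri \
  -e HSA_OVERRIDE_GFX_VERSION=11.0.2 \
  -v ollama:/root/.ollama \
  -p 11434:11434 \
  --name ollama \
  ollama/ollama:rocm
```

You can then check whether the GPU was picked up with `docker logs ollama`, which should report a discovered ROCm/amdgpu device rather than falling back to CPU.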
amd1890 commented 1 week ago

I am trying to get Perplexica (https://github.com/ItzCrazyKns/Perplexica), a local open-source version of Perplexity.ai, running against an Ollama instance that uses ROCm. I have gotten Perplexica working with a regular Ollama container, but it's painfully slow. My system has ROCm installed and supports Vulkan, and I have a Ryzen 780M. I am trying to set up Ollama in Docker so that it uses ROCm.

I think when I wrote this issue before, I didn't understand that you were validating programs as working correctly with ROCm. Perhaps I should have titled the issue "Please help me."

https://github.com/avnigashi/ollama-rocm-docker is one of the repos that doesn't seem to work for me.

https://github.com/hqnicolas/OllamaDockerCasaOs is another repo I am not entirely understanding; I don't see why CasaOS is part of this equation. This version runs, but it prints "Error: unknown flag: --device" in the Docker terminal, and I'm not sure whether it's running correctly.

My experience level is very low, so it's harder for me to understand things than it is for a more experienced user.
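One likely explanation for the "Error: unknown flag: --device" message: `--device` belongs to `docker run`, not to the `ollama` CLI. Everything after the image name is handed to the container's entrypoint (the `ollama` binary), which rejects flags it doesn't know. A sketch of the distinction, using the official image name (the exact failing command from that repo is not shown here, so this is an assumed reconstruction of the error):

```shell
# Likely WRONG: flags placed after the image name are passed to the
# ollama binary inside the container, which does not accept --device:
#   docker run ollama/ollama:rocm --device /dev/kfd
#   (produces: Error: unknown flag: --device)

# RIGHT: docker-level flags go before the image name:
docker run -d \
  --device /dev/kfd --device /dev/dri \
  -p 11434:11434 \
  ollama/ollama:rocm
```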