Closed: Aspirinkb closed this 2 months ago
I found the same situation reported in an issue on the Ollama repo.
I have resolved the issue; it was a silly mistake on my part. In short, I had already started the Ollama service on the host system before launching the container. So when I executed ollama run phi3 inside the container, the request was actually handled by the Ollama service outside the container, not the one inside.
Once I stopped the Ollama service outside the container, started the one inside the container, and ran the model again, it worked successfully.
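For anyone else hitting this, here is a minimal sketch of the steps that worked for me (assuming the host Ollama was installed via the official install script and is managed by systemd; the container name below is a placeholder):

```
# Stop the Ollama service on the host (the official install script
# registers a systemd service named "ollama").
sudo systemctl stop ollama

# Confirm nothing on the host is still listening on Ollama's default port 11434.
ss -ltnp | grep 11434 || echo "no host ollama listening"

# Start the Ollama server inside the container, then run the model there.
# "ollama_container" is a placeholder for the actual container name.
docker exec -d ollama_container ollama serve
docker exec -it ollama_container ollama run phi3
```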
Original report: when running the dustynv/ollama:r36.2.0 container on an AGX Orin 64GB, OS version JetPack 6.0 DP [L4T 36.2.0], ollama cannot run an LLM: