dusty-nv / jetson-containers

Machine Learning Containers for NVIDIA Jetson and JetPack-L4T

Ollama tutorial example error #669

Open ta5946 opened 1 month ago

ta5946 commented 1 month ago

I am trying to serve models with Ollama on the Jetson AGX Orin 64 GB developer kit.

The first example, jetson-containers run --name ollama $(autotag ollama), works and the server responds on 127.0.0.1:11434. The second example, docker run --runtime nvidia --rm --network=host -v ~/ollama:/ollama -e OLLAMA_MODELS=/ollama dustynv/ollama:r36.2.0, returns no error, but the server does not respond. I am using ollama:r36.2.0 in both cases.

In either case, I want to override the OLLAMA_HOST variable.
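
For reference, a quick way to tell whether the container's server is actually reachable is to query it from the host. This is a minimal sketch assuming the default port 11434 and the standard Ollama HTTP endpoints:

```sh
# Basic liveness check: the root endpoint should return "Ollama is running"
curl http://127.0.0.1:11434

# List the models the server can see (should reflect OLLAMA_MODELS=/ollama)
curl http://127.0.0.1:11434/api/tags
```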

ToeiRei commented 1 month ago

You may want to add -e OLLAMA_HOST=0.0.0.0 to your docker command. (Feel free to replace 0.0.0.0 with whatever address you want it to bind to.)
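
For example, combining the command from the report above with that flag (a sketch, not a verified fix; 0.0.0.0 makes the server listen on all interfaces, and OLLAMA_HOST also accepts a specific address or host:port):

```sh
docker run --runtime nvidia --rm --network=host \
  -v ~/ollama:/ollama \
  -e OLLAMA_MODELS=/ollama \
  -e OLLAMA_HOST=0.0.0.0 \
  dustynv/ollama:r36.2.0
```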