dusty-nv / jetson-containers

Machine Learning Containers for NVIDIA Jetson and JetPack-L4T

No Jupyter Server and Ollama Server when running llama-index #554

Open Haruyukis opened 3 weeks ago

Haruyukis commented 3 weeks ago

Hello @dusty-nv

Here is the command that I typed, as written in the README of Llama-Index. However, I can't understand why the Jupyter server and the Ollama server aren't launching when I run the container...

I can't find any similar situation online; sorry if someone has already asked about this. I'm a total beginner when it comes to Jetson.

alex@ubuntu:~/jetson-containers$ jetson-containers run $(autotag llama-index)
Namespace(disable=[''], output='/tmp/autotag', packages=['llama-index'], prefer=['local', 'registry', 'build'], quiet=False, user='dustynv', verbose=False)
-- L4T_VERSION=35.4.1  JETPACK_VERSION=5.1.2  CUDA_VERSION=11.4
-- Finding compatible container image for ['llama-index']
dustynv/llama-index:r35.4.1
localuser:root being added to access control list
+ docker run --runtime nvidia -it --rm --network host --volume /tmp/argus_socket:/tmp/argus_socket --volume /etc/enctune.conf:/etc/enctune.conf --volume /etc/nv_tegra_release:/etc/nv_tegra_release --volume /tmp/nv_jetson_model:/tmp/nv_jetson_model --volume /var/run/dbus:/var/run/dbus --volume /var/run/avahi-daemon/socket:/var/run/avahi-daemon/socket --volume /var/run/docker.sock:/var/run/docker.sock --volume /home/alex/jetson-containers/data:/data --device /dev/snd --device /dev/bus/usb -e DISPLAY=:1 -v /tmp/.X11-unix/:/tmp/.X11-unix -v /tmp/.docker.xauth:/tmp/.docker.xauth -e XAUTHORITY=/tmp/.docker.xauth --device /dev/i2c-0 --device /dev/i2c-1 --device /dev/i2c-2 --device /dev/i2c-3 --device /dev/i2c-4 --device /dev/i2c-5 --device /dev/i2c-6 --device /dev/i2c-7 --device /dev/i2c-8 --device /dev/i2c-9 dustynv/llama-index:r35.4.1
root@ubuntu:/# 
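
One quick sanity check at this point, since the container drops into a plain root shell, is to look for the expected servers inside the image. This is only a diagnostic sketch, not a step from the repo's docs:

# Diagnostic sketch: check whether the servers are present in this image
which jupyter || echo "jupyter: not found"
which ollama  || echo "ollama: not found"
pip3 list 2>/dev/null | grep -i jupyter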
Haruyukis commented 1 week ago

Hi! I'm still stuck >.<

It feels like Jupyter and Ollama aren't installed inside the Docker image. Indeed, when I try the following command:

ollama start

I receive the following error:

bash: ollama: command not found

I can install it manually inside the container, but aren't the packages supposed to already be inside the Docker image? Am I missing something?
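
As a stopgap while this is sorted out, Ollama's generic installer can be run inside the container. This is only a sketch and is not a documented jetson-containers step; whether the resulting build uses the GPU correctly on JetPack 5 is an assumption, not something verified here:

# Stopgap inside the running container (assumes internet access;
# uses Ollama's generic arm64 install script, not a jetson-containers package)
curl -fsSL https://ollama.com/install.sh | sh
ollama serve &        # start the server in the background
ollama --version      # confirm the CLI is now on PATH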

dusty-nv commented 1 week ago

Hi @Haruyukis, sorry for the delay - Jupyter and Ollama are built into the llama-index:samples container. It looks like the page on Jetson AI Lab refers to the samples container, but the docs in this repo don't - I will correct that. I will also build images of those variants and push them to DockerHub for JP5/JP6.
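
For reference, a sketch of running the samples variant, assuming autotag resolves the llama-index:samples package the same way it resolves llama-index and that the image has been pushed for your JetPack version:

# Run the samples variant, which bundles Jupyter and Ollama
jetson-containers run $(autotag llama-index:samples)

Since the container runs with --network host (as in the docker run command above), any servers it starts should be reachable on the Jetson's own ports - 8888 for Jupyter and 11434 for Ollama by default, assuming the samples image starts them automatically.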