dusty-nv / jetson-containers

Machine Learning Containers for NVIDIA Jetson and JetPack-L4T
MIT License

Models not downloading to jetson-containers folders #514

Open DR-DAT4 opened 1 month ago

DR-DAT4 commented 1 month ago

Hi, apologies, I can't seem to figure this out.

When using the WebUI to download a model, the list loads correctly and the download completes with the confirmation "Model successfully saved to models/Maykeye_TinyLLama-v0/.", but in jetson-containers/data/models not even a text-generation-webui folder is created. Creating the folder manually makes no difference; the model still doesn't get saved there.

Disk usage analysis shows the downloads are being stored under /var/lib/docker/overlay2.
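
(For reference, this is roughly how the mismatch shows up on the host; the paths below are from this setup and /var/lib/docker/overlay2 is just Docker's default layer storage, so treat them as illustrative.)

ls ~/jetson-containers/data/models/       # expected bind-mounted location - stays empty
sudo du -sh /var/lib/docker/overlay2      # Docker layer storage - grows after each download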

I'm using a Jetson AGX Orin 64GB devkit. Everything was working fine on a 128GB SSD running JetPack 5.1.2 a few days ago. I put in a 1TB SSD and flashed it with 5.1.3 onto the SSD (not eMMC), and now after downloading, the models are not shown in the dropdown or in the file explorer.

I don't get what I did differently this time.

orin2@ubuntu:~$ jetson-containers run $(autotag text-generation-webui)
Namespace(disable=[''], output='/tmp/autotag', packages=['text-generation-webui'], prefer=['local', 'registry', 'build'], quiet=False, user='dustynv', verbose=False)
-- L4T_VERSION=35.5.0  JETPACK_VERSION=5.1  CUDA_VERSION=11.4
-- Finding compatible container image for ['text-generation-webui']
dustynv/text-generation-webui:r35.4.1-cp310
localuser:root being added to access control list
+ docker run --runtime nvidia -it --rm --network host --volume /tmp/argus_socket:/tmp/argus_socket --volume /etc/enctune.conf:/etc/enctune.conf --volume /etc/nv_tegra_release:/etc/nv_tegra_release --volume /tmp/nv_jetson_model:/tmp/nv_jetson_model --volume /var/run/dbus:/var/run/dbus --volume /var/run/avahi-daemon/socket:/var/run/avahi-daemon/socket --volume /var/run/docker.sock:/var/run/docker.sock --volume /home/orin2/jetson-containers/data:/data --device /dev/snd --device /dev/bus/usb -e DISPLAY=:0 -v /tmp/.X11-unix/:/tmp/.X11-unix -v /tmp/.docker.xauth:/tmp/.docker.xauth -e XAUTHORITY=/tmp/.docker.xauth dustynv/text-generation-webui:r35.4.1-cp310
/usr/local/lib/python3.10/dist-packages/transformers/utils/hub.py:124: FutureWarning: Using `TRANSFORMERS_CACHE` is deprecated and will be removed in v5 of Transformers. Use `HF_HOME` instead.
  warnings.warn(
12:22:11-999751 INFO     Starting Text generation web UI                        
12:22:12-005047 WARNING                                                         
                         You are potentially exposing the web UI to the entire  
                         internet without any access password.                  
                         You can create one with the "--gradio-auth" flag like  
                         this:                                                  

                         --gradio-auth username:password                        

                         Make sure to replace username:password with your own.  
12:22:12-007377 INFO     Loading settings from "settings.yaml"                  

Running on local URL:  http://0.0.0.0:7860

Downloading the model to models/Maykeye_TinyLLama-v0
100%|██████████████████████████████████████████████████| 498   /498    1.54MiB/s
100%|███████████████████████████████████████████████████| 132   /132    265kiB/s
100%|██████████████████████████████████████████████████| 1.64k /1.64k  5.99MiB/s
100%|██████████████████████████████████████████████████| 503   /503    1.73MiB/s
100%|███████████████████████████████████████████████████| 411   /411    906kiB/s
100%|██████████████████████████████████████████████████| 649   /649    1.56MiB/s
100%|██████████████████████████████████████████████████| 534k  /534k   1.52MiB/s
100%|██████████████████████████████████████████████████| 1.52k /1.52k  3.55MiB/s
100%|██████████████████████████████████████████████████| 1.98M /1.98M  1.50MiB/s
100%|██████████████████████████████████████████████████| 9.25M /9.25M  5.25MiB/s
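
In the docker run command above, only /home/orin2/jetson-containers/data is bind-mounted into the container (as /data), and the WebUI reports saving to the relative path models/Maykeye_TinyLLama-v0, which resolves inside the container filesystem rather than under /data. A quick in-container check along these lines would confirm it; /opt/text-generation-webui is an assumed install location, so adjust if the image uses a different path:

ls /data/models/                        # bind-mounted to ~/jetson-containers/data/models on the host
ls /opt/text-generation-webui/models/   # container-only path; contents end up in /var/lib/docker/overlay2
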
dusty-nv commented 1 month ago

Hi @DR-DAT4, I wonder if the way oobabooga handles the model directories has changed. What I would do is start the container manually into a shell:

jetson-containers run $(autotag text-generation-webui) /bin/bash

and then play around with launching oobabooga's server.py with different options, like is done here:

https://github.com/dusty-nv/jetson-containers/blob/aef3d09267c0b8564c6f3c18dc59873ce0a179d4/packages/llm/text-generation-webui/Dockerfile#L28

That should allow you to change the model dir and see what is going on.
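
For example, a rough sketch of launching it by hand with the model directory pointed at the mounted volume; the working directory and flags below are assumptions based on upstream oobabooga options (--model-dir, --listen) rather than the exact Dockerfile command, so verify them against the line linked above:

cd /opt/text-generation-webui
python3 server.py --listen --verbose --model-dir=/data/models/text-generation-webui

Since /data is bind-mounted from the host's jetson-containers/data directory, downloads made with that setting should persist outside the container.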