dusty-nv / jetson-containers

Machine Learning Containers for NVIDIA Jetson and JetPack-L4T
MIT License

Text chat container example returns No module named local_llm.chat.__main__ #461

Open GJSea opened 6 months ago

GJSea commented 6 months ago

I'm trying to run the first Text Chat example using the published command line. The container seems to have downloaded and built properly, but it returns "No module named local_llm.chat.__main__; 'local_llm.chat' is a package and cannot be directly executed".
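For context (this is general Python behavior, not specific to this repo): `python3 -m some.package` only works when the package contains a `__main__.py` entry point; otherwise Python raises exactly this "cannot be directly executed" error. A minimal sketch reproducing the mechanism with a throwaway, hypothetical package name:

```python
import subprocess, sys, tempfile
from pathlib import Path

# Build a throwaway package "demo.chat" in a temp dir (hypothetical name,
# standing in for local_llm.chat).
root = Path(tempfile.mkdtemp())
pkg = root / "demo" / "chat"
pkg.mkdir(parents=True)
(root / "demo" / "__init__.py").write_text("")
(pkg / "__init__.py").write_text("")

# Without __main__.py, `python -m demo.chat` fails like the reported error.
r1 = subprocess.run([sys.executable, "-m", "demo.chat"],
                    cwd=root, capture_output=True, text=True)
print(r1.returncode, r1.stderr.strip())

# Adding __main__.py makes the package directly executable with -m.
(pkg / "__main__.py").write_text("print('chat entrypoint')")
r2 = subprocess.run([sys.executable, "-m", "demo.chat"],
                    cwd=root, capture_output=True, text=True)
print(r2.returncode, r2.stdout.strip())
```

So an error like this usually means the `local_llm.chat` package visible inside the container has no `__main__.py` (e.g. a stale or mismatched container image), rather than a problem with the command itself.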

I'm also getting the same error with the Multimodal Chat example.

Any idea?

Thanks!

./run.sh --env HUGGINGFACE_TOKEN= $(./autotag local_llm) python3 -m local_llm.chat --api=mlc --model=meta-llama/Llama-2-7b-chat-hf

Namespace(disable=[''], output='/tmp/autotag', packages=['local_llm'], prefer=['local', 'registry', 'build'], quiet=False, user='dustynv', verbose=False)
-- L4T_VERSION=35.2.1  JETPACK_VERSION=5.1  CUDA_VERSION=11.4
-- Finding compatible container image for ['local_llm']
dustynv/local_llm:r35.3.1
localuser:root being added to access control list