oglok opened 1 month ago
BTW, I tried using dustynv's image as the base instead of the l4t-base one, and I'm getting the same results :-(
Hi @oglok, it seems you were using an older version that is now deprecated.
Hey @tqchen, is there a container image I can use?
I see Dockerfiles in the mlc-ai/packages and mlc-ai/env repos, but nothing ready to use...
As of now, unfortunately, we don't have a container file, so building from source for Jetson may be needed.
@tqchen I realize there is no wheel package with CUDA for ARM devices. Am I the only one interested in running this on a Jetson?
@oglok jetson-containers builds MLC from source; there are some patches I apply (mostly to MLC/TVM third-party submodules), so it's not on the latest. However, there is a version using the mlc_chat builder (what I have tagged as version 0.1.1).
Also, IIRC `AttributeError: Module has no function '_metadata'` is just a warning, and the program typically continues on without issue after that. Does it halt for you, or do you have some other problem?
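As a generic illustration of why a failed lookup like this can surface as a warning rather than a crash (the `Module` class and `read_metadata` helper below are illustrative stand-ins, not actual MLC/TVM APIs):

```python
import warnings

class Module:
    """Stand-in for a runtime module that only exposes some functions."""
    def __init__(self, funcs):
        self._funcs = funcs

    def get_function(self, name):
        if name not in self._funcs:
            raise AttributeError(f"Module has no function '{name}'")
        return self._funcs[name]

def read_metadata(mod):
    # Optional lookup: a missing '_metadata' is downgraded to a warning,
    # and the caller continues on without the metadata.
    try:
        return mod.get_function("_metadata")()
    except AttributeError as exc:
        warnings.warn(str(exc))
        return None

mod = Module({"decode": lambda: "token"})
assert read_metadata(mod) is None            # warned, did not crash
assert mod.get_function("decode")() == "token"  # normal path still works
```

If the caller treats the metadata as optional like this, the warning is cosmetic and generation proceeds normally.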
Actually, you are right @dusty-nv. The container does not halt; I thought it wasn't behaving properly, but apparently it is.
What's the problem with generating wheel packages for CUDA on ARM, or for Jetson?
@oglok the MLC/TVM wheels that jetson-containers builds are here: http://jetson.webredirect.org/jp6/cu122
It is a non-trivial build process with all the bells & whistles enabled, and as you have found there are extra dependencies and files you need to install.
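For anyone who wants to consume those wheels rather than rebuild, a Containerfile fragment along these lines may work (a sketch: the base image tag and the package names pulled from the index are assumptions, not taken from this thread; check the index listing for the wheels matching your JetPack/CUDA combo):

```dockerfile
# Hypothetical fragment: the base image tag and package names are guesses.
FROM dustynv/l4t-pytorch:r36.2.0

# Install prebuilt MLC/TVM wheels from the jetson-containers package index
# instead of compiling them from source.
RUN pip3 install --extra-index-url http://jetson.webredirect.org/jp6/cu122 \
        tvm mlc-llm
```

You may still need to set LD_LIBRARY_PATH (e.g. via an ENV line) if the runtime libraries land under the tvm package directory, as described later in this thread.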
❓ General Questions
I'm trying to build a containerized application with vicuna-7b and mlc-llm for Jetsons with JP6. This is my multi-stage Containerfile:
When I run the container, I get the following error:
I see that the library is located in:
So if I add `export LD_LIBRARY_PATH=/usr/local/lib/python3.10/dist-packages/tvm/:$LD_LIBRARY_PATH`, my vicuna server starts:
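A slightly more robust variant derives that directory from the installed tvm package instead of hard-coding the dist-packages path (a sketch; the fallback is simply the path observed above and may differ per install):

```shell
# Resolve the directory that contains the tvm shared libraries; fall back to
# the dist-packages path seen in this thread if tvm itself cannot be imported.
TVM_DIR=$(python3 -c "import tvm, os; print(os.path.dirname(tvm.__file__))" 2>/dev/null \
    || echo /usr/local/lib/python3.10/dist-packages/tvm)
export LD_LIBRARY_PATH="${TVM_DIR}:${LD_LIBRARY_PATH}"
echo "LD_LIBRARY_PATH=${LD_LIBRARY_PATH}"
```

Inside a container you can make this permanent with an ENV line in the Containerfile, or by dropping the path into a file under /etc/ld.so.conf.d/ and running ldconfig.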
However, when I make use of it (the demo is basically a webserver, YOLOv8, and this vicuna server), I get an error that seems to be about a dependency mismatch:
Any clue?
It seems to be complaining about this function: https://github.com/mlc-ai/mlc-llm/blob/main/python/mlc_llm/cli/model_metadata.py#L20
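One quick sanity check for this kind of mismatch is to confirm which MLC/TVM packages are actually importable in the container, and at what versions (a sketch; `mlc_chat` is the older package name mentioned earlier in the thread, and a given install will normally have only one of the two MLC names):

```python
import importlib

# Report the version of each relevant package, or note that it isn't
# importable; a tvm wheel paired with an mlc wheel from a different build
# is a common cause of missing-symbol errors like the '_metadata' one.
for pkg in ("tvm", "mlc_llm", "mlc_chat"):
    try:
        mod = importlib.import_module(pkg)
        print(pkg, getattr(mod, "__version__", "unknown"))
    except ImportError as exc:
        print(pkg, "not importable:", exc)
```

If the tvm wheel and the mlc wheel come from different builds (e.g. one from PyPI and one from the jetson-containers index), reinstalling both from the same index is the first thing to try.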