jedld opened 1 month ago
I am also having a similar problem: I am trying to load additional models, from Gemma to Starcoder, and none of the models from the Ollama library will load; they all give the same error.
We are having the same issue. Is there a specific version we should use? We are using JetPack 6.1 [L4T 36.4.0]. Thanks.
Hi @andrebaumgartfht, can you try dustynv/jetson-copilot:r36.4.0?
Thank you for the build.
Did the following: created a documents folder locally in my jetson-containers repo (mkdir -p ./data/documents/jetson) and added a PDF file to it, since an empty folder causes another exception (potentially the container defaults to RAG only).
Then started the container using jetson-containers run dustynv/jetson-copilot:r36.4.0 bash -c '/start_ollama && streamlit run app.py'.
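The setup steps above can be sketched as follows. The placeholder PDF and the guard around the launch command are my additions: use a real PDF (an empty folder reportedly raises its own exception), and the launch step is skipped on machines without jetson-containers installed.

```shell
# Create the documents folder inside the local jetson-containers checkout
mkdir -p ./data/documents/jetson

# Placeholder only -- replace with a real PDF document to index
touch ./data/documents/jetson/placeholder.pdf

# Launch the suggested container build (only if jetson-containers is available)
if command -v jetson-containers >/dev/null 2>&1; then
  jetson-containers run dustynv/jetson-copilot:r36.4.0 \
    bash -c '/start_ollama && streamlit run app.py'
fi
```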
Loading and indexing the Jetson docs started ...
It then raised the following exception:
Traceback (most recent call last):
File "/usr/local/lib/python3.10/dist-packages/streamlit/runtime/scriptrunner/exec_code.py", line 88, in exec_func_with_error_handling
result = func()
File "/usr/local/lib/python3.10/dist-packages/streamlit/runtime/scriptrunner/script_runner.py", line 579, in code_to_exec
exec(code, module.__dict__)
File "/opt/jetson-copilot/app.py", line 55, in
When attempting to download "llama3.1" via the download new model UI, I'm getting:
It looks like "llama3.1" is not the right name.
This error does not happen for the other llama models.
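As a diagnostic sketch, you can ask the Ollama server directly which models it has installed via its documented REST endpoint GET /api/tags, which sidesteps the app's name check. The base URL below assumes Ollama's default port 11434; adjust if your container maps it differently.

```python
import json
import urllib.request
import urllib.error

def list_ollama_models(base_url="http://localhost:11434"):
    """Return installed Ollama model names, or None if the server is unreachable."""
    try:
        with urllib.request.urlopen(f"{base_url}/api/tags", timeout=5) as resp:
            data = json.load(resp)
        return [m["name"] for m in data.get("models", [])]
    except (urllib.error.URLError, OSError):
        return None

if __name__ == "__main__":
    models = list_ollama_models()
    if models is None:
        print("Ollama server not reachable")
    elif any(name.startswith("llama3.1") for name in models):
        print("llama3.1 is installed")
    else:
        print("llama3.1 not found; try: ollama pull llama3.1")
```

If the model is missing from that list, pulling it from inside the container with `ollama pull llama3.1` (the standard Ollama CLI command) may work even when the download UI rejects the name.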