When using `tabbyml/tabby:latest`, we get:

WARN llama_cpp_server::supervisor: crates/llama-cpp-server/src/supervisor.rs:88: llama-server exited with status code 127, restarting...

Running `llama-server` inside the container fails with:

llama-server: error while loading shared libraries: libcuda.so.1: cannot open shared object file: No such file or directory

So we set the `LD_LIBRARY_PATH` environment variable to work around this base-image bug.
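As a sketch of the workaround (the `/usr/local/nvidia/lib64` path is an assumption; use whichever directory actually contains `libcuda.so.1` in your container):

```shell
# Locate the driver library that the container runtime makes available
find / -name 'libcuda.so.1' 2>/dev/null

# Point the dynamic linker at that directory before llama-server starts
# (the directory below is an example; substitute the path found above)
export LD_LIBRARY_PATH=/usr/local/nvidia/lib64:${LD_LIBRARY_PATH}
```

The same variable can be passed at `docker run` time with `-e LD_LIBRARY_PATH=...`, or baked into the image via an `ENV` instruction in the Dockerfile.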