yurivict opened this issue 4 days ago

Describe the bug
Building tabby fails on FreeBSD because llama-server is not built successfully.

Information about your version
0.20.0

Additional context
FreeBSD 14.1
Hi @yurivict, it looks like llama-server is not being built successfully. Can you provide some more details, like the complete build log, so that we can dig deeper into it?
Here is a complete log.
Also: we already have a llama-cpp package available, so there should be no need to bundle it in tabby. Is it possible to use the external llama-cpp package?
> Also: we already have a llama-cpp package available, so there should be no need to bundle it in tabby.
Yes - you can turn off building the llama-cpp binary by disabling this feature: https://github.com/TabbyML/tabby/blob/39c5b8de0f9d17d8b6867cc3eec305bf73b304b1/crates/llama-cpp-server/Cargo.toml#L9
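For packagers, here is a minimal sketch of what that looks like when building from source. It assumes the bundled llama-server build is gated behind a default Cargo feature at the linked line; check that Cargo.toml for the actual feature name and for any other default features you still want to keep:

```sh
# Build tabby with default features disabled, which (per the linked
# Cargo.toml) skips compiling the bundled llama-cpp/llama-server.
# Assumption: the gate is a default feature; re-enable any other
# defaults you need with --features <name,...>.
cargo build --release --no-default-features

# At runtime, use the llama-server binary provided by the system's
# llama-cpp package instead of the bundled one.
```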