TabbyML / tabby

Self-hosted AI coding assistant
https://tabbyml.com

Build fails: Failed to copy server binary to output directory: No such file or directory (os error 2) #3399

Open yurivict opened 4 days ago

yurivict commented 4 days ago

Describe the bug

 -- Installing: /wrkdirs/usr/ports/devel/tabby/work/target/release/build/llama-cpp-server-8837603d1835d022/out/bin/llama-tokenize
  cargo:root=/wrkdirs/usr/ports/devel/tabby/work/target/release/build/llama-cpp-server-8837603d1835d022/out

  --- stderr
  CMake Warning at cmake/build-info.cmake:14 (message):
    Git not found.  Build info will not be accurate.
  Call Stack (most recent call first):
    CMakeLists.txt:77 (include)

  CMake Warning at ggml/src/CMakeLists.txt:274 (message):
    AMX requires gcc version > 11.0.  Turning off GGML_AMX.

  CMake Warning at common/CMakeLists.txt:30 (message):
    Git repository not found; to enable automatic generation of build info,
    make sure Git is installed and the project is a Git repository.

  CMake Warning:
    Manually-specified variables were not used by the project:

      CMAKE_ASM_COMPILER
      CMAKE_ASM_FLAGS

  thread 'main' panicked at crates/llama-cpp-server/build.rs:66:36:
  Failed to copy server binary to output directory: No such file or directory (os error 2)

Information about your version

0.20.0

Additional context

FreeBSD 14.1
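
For context, the panic comes from the copy step in the crate's Cargo build script: after the CMake build finishes, the script copies the resulting llama-server binary into Cargo's output directory, and `std::fs::copy` fails with "No such file or directory (os error 2)" when the CMake step did not actually produce the binary at the expected path. A minimal sketch of that pattern (paths and names are illustrative, not Tabby's actual build.rs):

```rust
// build.rs -- illustrative sketch of the failing pattern.
use std::{env, fs, path::PathBuf};

fn main() {
    // OUT_DIR is set by cargo for every build script.
    let out_dir = PathBuf::from(env::var("OUT_DIR").expect("OUT_DIR is set by cargo"));

    // Hypothetical location where the CMake step is expected to
    // leave the built server binary. If the CMake build silently
    // produced nothing here (e.g. on an unusual platform), the
    // copy below panics with "No such file or directory (os error 2)".
    let built = out_dir.join("build/bin/llama-server");

    fs::copy(&built, out_dir.join("llama-server"))
        .expect("Failed to copy server binary to output directory");
}
```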

zwpaper commented 4 days ago

Hi @yurivict, it looks like llama-server was not built successfully. Can you provide some more details, such as:

  1. the command you used
  2. the full build log output

so that we can dig deeper into it?

yurivict commented 4 days ago

Here is a complete log.

Also: we already have a llama-cpp package available, so there should be no need to bundle it in tabby. Is it possible to use the external llama-cpp package?
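
(For context: recent Tabby versions can also be pointed at an externally running llama-server over HTTP instead of the bundled one. A sketch of that configuration, where the endpoint and prompt template are placeholders to be adjusted for whatever model the external server is serving:)

```toml
# ~/.tabby/config.toml -- sketch: use an external llama-server
# for completion; endpoint and template are placeholder values.
[model.completion.http]
kind = "llama.cpp/completion"
api_endpoint = "http://localhost:8080"
prompt_template = "<PRE> {prefix} <SUF>{suffix} <MID>"
```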

wsxiaoys commented 4 days ago

Also: we already have a llama-cpp package available, so there should be no need to bundle it in tabby.

Yes - you can turn off building the llama-cpp binary by disabling this feature: https://github.com/TabbyML/tabby/blob/39c5b8de0f9d17d8b6867cc3eec305bf73b304b1/crates/llama-cpp-server/Cargo.toml#L9
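
For instance, a sketch of opting out, assuming the linked line declares a default Cargo feature that gates the bundled binary build (the exact feature name is at that line, and the dependency path is a placeholder):

```toml
# Sketch: depend on llama-cpp-server without its default features,
# so the bundled llama-server binary build is skipped.
[dependencies]
llama-cpp-server = { path = "crates/llama-cpp-server", default-features = false }
```

When building the crate directly, `cargo build --no-default-features` has the same effect.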