TabbyML / tabby

Self-hosted AI coding assistant
https://tabby.tabbyml.com/

bug: out of source build fails #2479

Closed: Kamilcuk closed this issue 6 days ago

Kamilcuk commented 6 days ago

Describe the bug

  1. Create the following Dockerfile:
FROM rust AS build
RUN set -x && \
        apt-get update -y && \
        apt-get install --no-install-recommends -y cmake git git-lfs && \
        rm -rf /var/lib/apt/lists/*
ENV RUSTFLAGS='-C target-cpu=native'
ARG VERSION=v0.13.0-rc.0
RUN \
        --mount=type=cache,target=/cache \
        set -x && \
        cd /cache && \
        ( ls -la ./* || true ) && \
        ( du -hs ./* || true ) && \
        if [ -e ./tabby ]; then \
                cd ./tabby && \
                git fetch --all && \
                git checkout --recurse-submodules $VERSION ; \
        else \
                git clone --branch $VERSION --depth 1 --recurse-submodules https://github.com/TabbyML/tabby ./tabby && \
                cd ./tabby ; \
        fi && \
        git status && \
        git log -1 && \
        CARGO_HOME=/cache/cargo cargo build --release --target-dir /cache/build && \
        cd ../build/tabby/target/release/ && \
        cp -va ./tabby ./llama-server /usr/local/bin
  2. Execute docker build . (see the BuildKit note after the error output below).
  3. Observe the following error:
------
 > [build 3/4] RUN --mount=type=cache,target=/cache set -x && cd /cache && ( ls -la ./* || true ) && ( du -hs ./* || true ) && if [ -e ./tabby ]; then cd ./tabby && git fetch --all && git checkout --recurse-submodules v0.13.0-rc.0 ; else git clone --branch v0.13.0-rc.0 --depth 1 --recurse-submodules https://github.com/TabbyML/tabby ./tabby && cd ./tabby ; fi && git status && git log -1 && CARGO_HOME=/cache/cargo cargo build --release --target-dir /cache/build && cd ../build/tabby/target/release/ && cp -va ./tabby ./llama-server /usr/local/bin:
304.7   -- Up-to-date: /cache/build/release/build/llama-cpp-server-a1fc25be26f26043/out/bin/imatrix
304.7   -- Installing: /cache/build/release/build/llama-cpp-server-a1fc25be26f26043/out/bin/server
304.7   -- Up-to-date: /cache/build/release/build/llama-cpp-server-a1fc25be26f26043/out/bin/export-lora
304.7   cargo:root=/cache/build/release/build/llama-cpp-server-a1fc25be26f26043/out
304.7
304.7   --- stderr
304.7   thread 'main' panicked at crates/llama-cpp-server/build.rs:68:10:
304.7   Failed to copy server binary to output directory: No such file or directory (os error 2)
304.7   note: run with `RUST_BACKTRACE=1` environment variable to display a backtrace
304.7 warning: build failed, waiting for other jobs to finish...
------
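
A note on step 2, referenced above: the --mount=type=cache flag in the RUN instruction is a BuildKit feature, so on hosts where BuildKit is not already the default builder the image would be built roughly like this (the tag is purely illustrative):

DOCKER_BUILDKIT=1 docker build -t tabby-build .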

Information about your version: v0.13.0-rc.0

Information about your GPU: No GPU

Additional context: I am trying to get better caching of the build in a Dockerfile.

Thanks.

Kamilcuk commented 6 days ago

Solution: use export CARGO_HOME=/cache/cargo CARGO_INCREMENTAL=1 CARGO_CACHE_DIR=/cache/build. Thanks, ignore me, closing.
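
For anyone hitting the same panic: below is a minimal sketch of how those exports might slot into the RUN step from the report, assuming the clone already sits at /cache/tabby and cargo's default in-tree target directory is used instead of --target-dir (the final copy path follows from that layout and is not confirmed by this thread):

RUN \
        --mount=type=cache,target=/cache \
        set -x && \
        cd /cache/tabby && \
        # keep cargo state on the cache mount instead of redirecting the target
        # directory out of the source tree (exports as given in the comment above)
        export CARGO_HOME=/cache/cargo CARGO_INCREMENTAL=1 CARGO_CACHE_DIR=/cache/build && \
        cargo build --release && \
        cp -va ./target/release/tabby ./target/release/llama-server /usr/local/bin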