TabbyML / tabby

Self-hosted AI coding assistant
https://tabby.tabbyml.com/

Mac M1 cannot start tabby #2413

Open Xiaomingpapapa opened 2 weeks ago

Xiaomingpapapa commented 2 weeks ago

Describe the bug

I followed the guide in this doc: https://tabby.tabbyml.com/docs/quick-start/installation/apple/. After installing tabby with brew, I ran the command below:

tabby serve --device metal --model StarCoder-1B

The terminal just freezes and I can't see any log output.

Information about your version

tabby 0.12.0

Information about your GPU

macOS, Apple M1

wsxiaoys commented 2 weeks ago

Hi, can you run the command again with the following env vars?

RUST_LOG=debug RUST_BACKTRACE=1 tabby serve ...
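With the flags from the original report, the full command would be:

RUST_LOG=debug RUST_BACKTRACE=1 tabby serve --device metal --model StarCoder-1B

RUST_LOG and RUST_BACKTRACE are standard Rust env vars: the first raises log verbosity to debug, the second prints a stack trace if the process panics.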
Xiaomingpapapa commented 2 weeks ago

@wsxiaoys I ran the command again with those env vars. It produces a lot of logs, and I can see it retrying many times to start llama-server:

2024-06-14T08:09:54.319865Z WARN llama_cpp_server::supervisor: crates/llama-cpp-server/src/supervisor.rs:88: llama-server exited with status code -1, restarting...
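Status code -1 means the llama-server child process is dying as soon as the supervisor spawns it, before it ever serves a request. A hypothetical way to surface the underlying error is to run the binary directly so its stderr is visible; the binary name comes from the log line above, but the exact model path depends on your install (tabby typically downloads models under ~/.tabby/models):

# Run the spawned binary by hand to see why it exits immediately.
# <path-to-model.gguf> is a placeholder for wherever tabby put the model;
# the flag follows llama.cpp server conventions and may vary by version.
llama-server -m <path-to-model.gguf>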

Xiaomingpapapa commented 2 weeks ago

@wsxiaoys do you have any idea what might be causing this?

coljiang commented 1 week ago

I have the same problem when running this command:

RUST_LOG=debug RUST_BACKTRACE=1  tabby serve --device metal  --model StarCoder-1B --port 9823

Output:

2024-06-20T09:55:00.350736Z DEBUG hyper_util::client::legacy::connect::http: /Users/runner/.cargo/registry/src/index.crates.io-6f17d22bba15001f/hyper-util-0.1.5/src/client/legacy/connect/http.rs:631: connecting to 127.0.0.1:7890
2024-06-20T09:55:00.351082Z DEBUG hyper_util::client::legacy::connect::http: /Users/runner/.cargo/registry/src/index.crates.io-6f17d22bba15001f/hyper-util-0.1.5/src/client/legacy/connect/http.rs:634: connected to 127.0.0.1:7890
2024-06-20T09:55:01.187690Z  WARN llama_cpp_server::supervisor: crates/llama-cpp-server/src/supervisor.rs:88: llama-server exited with status code -1, restarting...
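One detail worth flagging in this log: the client is connecting to 127.0.0.1:7890, a port commonly used by local HTTP proxies, rather than to the llama-server port. If a proxy is configured through environment variables, it could be intercepting the localhost health check, so it is worth ruling out (a hedged suggestion, not a confirmed cause):

# Check for proxy env vars that could redirect localhost traffic
env | grep -i proxy
# If any are set, clear them in this shell and retry tabby serve
unset http_proxy https_proxy all_proxy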
Madd0g commented 1 week ago

Happens to me too with 0.12.0. I see a single log line with "Waiting for llama-server to start"

and then a ton of these:

connecting to 127.0.0.1:30888
starting new connection: http://127.0.0.1:30888/
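Those lines suggest tabby is polling the port where it expects the spawned llama-server to come up, and the server never answers. A quick manual probe of that port (assuming the /health endpoint that recent llama.cpp server builds expose) can confirm whether anything is listening:

# Probe the port tabby keeps retrying; "connection refused" means
# the llama-server process never started at all
curl -v http://127.0.0.1:30888/health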
Xiaomingpapapa commented 1 week ago

> Happens to me too with 0.12.0. I see a single log line with "Waiting for llama-server to start"
>
> and then a ton of these:
>
> connecting to 127.0.0.1:30888
> starting new connection: http://127.0.0.1:30888/

I encountered this issue too. Have you made any progress on it?

Madd0g commented 1 day ago

So for me this was the solution:

  1. upgrade to a recent version of tabby (I saw release notes about better logging)
  2. realize there are indeed new logs, and see new errors from llama.cpp
  3. google the errors for a while: Symbol not found: (_cblas_sgemm$NEWLAPACK$ILP64) (see the version check below)
  4. see someone recommending upgrading macOS
  5. upgrade to 14.5
  6. problem solved!
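For anyone else landing here: the missing symbol _cblas_sgemm$NEWLAPACK$ILP64 appears to come from the newer ILP64 BLAS/LAPACK interface in Apple's Accelerate framework, which only ships with recent macOS releases, so checking the OS version is the cheap first step:

# Print the installed macOS version; upgrading (to 14.5 in this case)
# pulls in the Accelerate build that provides the NEWLAPACK/ILP64 symbols
sw_vers -productVersion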
wsxiaoys commented 1 day ago

Hi @Madd0g - thanks for sharing the process, glad that the improved logging actually helped you locate the issue :)