TabbyML / tabby

Self-hosted AI coding assistant
https://tabby.tabbyml.com/

Clashing between llama.cpp and tabby formula #2429

Open remyleone opened 3 months ago

remyleone commented 3 months ago

Describe the bug

I think there is a conflict between the llama.cpp and tabby Homebrew formulas: both install a `llama-server` binary, so linking one clashes with the other.

$ brew link tabby 
Linking /opt/homebrew/Cellar/tabby/0.12.0... 
Error: Could not symlink bin/llama-server
Target /opt/homebrew/bin/llama-server
is a symlink belonging to llama.cpp. You can unlink it:
  brew unlink llama.cpp

To force the link and overwrite all conflicting files:
  brew link --overwrite tabby

To list all files that would be deleted:
  brew link --overwrite tabby --dry-run
$ brew info tabby              
Warning: Treating tabby as a formula. For the cask, use homebrew/cask/tabby or specify the `--cask` flag.
==> tabbyml/tabby/tabby: stable 0.12.0, HEAD
Tabby: AI Coding Assistant
https://github.com/TabbyML/tabby
Installed
/opt/homebrew/Cellar/tabby/0.12.0 (7 files, 86.3MB)
  Built from source on 2024-06-06 at 09:34:50
From: https://github.com/tabbyml/homebrew-tabby/blob/HEAD/Formula/tabby.rb
==> Requirements
Required: macOS ✔, arm architecture ✔
==> Options
--HEAD
    Install HEAD version
==> Caveats
To start tabbyml/tabby/tabby now and restart at login:
  brew services start tabbyml/tabby/tabby
Or, if you don't want/need a background service you can just run:
  /opt/homebrew/opt/tabby/bin/tabby serve --device metal --model StarCoder-1B

Information about your version

0.12.0

Information about your GPU

macOS M1