-
Tabby terminal
-
**Describe the bug**
Getting a segmentation fault while running tabby
```
~ tabby serve --device metal --model StarCoder-1B --chat-model Qwen2-1.5B-Instruct
🔑 Parsed .netrc from: .netrc
Writing …
-
**Please describe the feature you want**
As a configurable field of `HttpModelConfig` or `LocalModelConfig`.
The configured stop words shall be used to construct stop words at:
1. https://github…
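A minimal sketch of what such a field might look like in `~/.tabby/config.toml`. The `stop_words` key and its values are assumptions for illustration, not an existing Tabby option; the `[model.completion.local]` section mirrors Tabby's local model configuration:

```toml
# Hypothetical: stop_words as a field of the local model config.
[model.completion.local]
model_id = "StarCoder-1B"
# Assumed field name; generation would halt when any of these
# strings is emitted by the model.
stop_words = ["\n\n", "<|endoftext|>"]
```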
-
**Describe the bug**
I was looking for a way to index my `local` repository on an M1/M2 Mac.
I have a `~/.tabby/config.toml` file, which reads as follows:
```
[[repositories]]
name = "Test_repo"
git_…
-
**Describe the bug**
Chat window is empty with "Nothing to show"
**Information about your version**
I built tabby from v0.17.0 with the command `cargo build --features rocm`
**Information abo…
-
**Describe the bug**
DeepSeek-Coder-V2-Lite is stuck at starting.
The web server does not even start after 10,000 s of "Starting". Other models work fine. If within the container I kill the llama-…
-
### Check for existing issues
- [X] Completed
### Describe the feature
[Tabby](https://tabby.tabbyml.com/) is an open-source, self-hosted AI coding assistant.
I think Tabby can improve the coding…
-
There is currently no first-class local code assistant plugin for Sublime-LSP. [Github Copilot](https://github.com/TerminalFi/LSP-copilot) and [Codeium](https://codeium.com/sublime_tutorial) plugins e…
-
### OS
Linux
### GPU Library
CUDA 12.x
### Python version
3.12
### Describe the bug
If you start the server without a model, or unload the model first before stopping the server, it will throw …
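One hedged guess at the underlying pattern (the class and method names below are illustrative, not the project's actual code): a shutdown path that unconditionally touches the model object will fail when no model was ever loaded, or when it was already unloaded, so the stop path should guard on it:

```python
# Illustrative sketch only; ModelServer and its methods are
# assumptions, not the project's real API.
class ModelServer:
    def __init__(self, model=None):
        # model stays None when the server starts without one
        self.model = model

    def unload(self):
        """Release the model; safe to call repeatedly."""
        self.model = None

    def stop(self):
        # Guard: only release model resources if a model is
        # actually loaded, so stopping a model-less server (or
        # one whose model was unloaded first) does not raise.
        if self.model is not None:
            self.unload()
        return "stopped"
```

With this guard, `ModelServer().stop()` and an unload-then-stop sequence both return cleanly instead of throwing.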
-
The tabby-rocm Docker image is outdated by about 3 months.
See:
https://hub.docker.com/r/tabbyml/tabby-rocm/tags
Output of `tabby --version`:
`tabby 0.11.1`