TabbyML / tabby

Self-hosted AI coding assistant
https://tabbyml.com

Run chat in command line #2083

Open Zibri opened 6 months ago

Zibri commented 6 months ago

I wish to chat with Tabby like on your demo page, but from a bash command line. At the moment I just use the Docker container, so I have /opt/tabby/bin/tabby and tabby-cpu.

How do I get a prompt?

Zibri commented 6 months ago

In other words, I wish to set up locally the same (or a similar) model you have on your demo page, and test it out without an active internet connection.

Zibri commented 6 months ago
/opt/tabby/bin/tabby-cpu serve --model StarCoder-1B --chat-model StarCoder-3B
Writing to new file.
🎯 Downloaded https://huggingface.co/TabbyML/models/resolve/main/starcoderbase-3B.Q8_0.gguf to /data/models/TabbyML/StarCoder-3B/ggml/q8_0.v2.gguf.tmp
   00:01:00 ▕████████████████████▏ 3.15 GiB/3.15 GiB  53.01 MiB/s  ETA 0s.
2024-05-09T16:19:26.760612Z  INFO tabby::serve: crates/tabby/src/serve.rs:123: Starting server, this might take a few minutes...
2024-05-09T16:19:28.837563Z ERROR tabby::services::model: crates/tabby/src/services/model/mod.rs:32: Chat model requires specifying prompt template

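The error above means StarCoder-3B is not registered as a chat model: it ships without a chat prompt template, so it cannot be passed to --chat-model. A minimal sketch of the likely fix, assuming a chat-capable model such as Mistral-7B is available in the TabbyML model registry (model availability is an assumption, not confirmed in this thread):

```shell
# --chat-model must name a model that includes a chat prompt template;
# Mistral-7B is assumed here to be such a model in the TabbyML registry.
/opt/tabby/bin/tabby-cpu serve --model StarCoder-1B --chat-model Mistral-7B
```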
Zibri commented 6 months ago

Or alternatively, I wish to have the same thing you have at https://demo.tabbyml.com/playground, but locally.

ge-hall commented 5 months ago

@Zibri

Or in alternative I wish to have the same you have at https://demo.tabbyml.com/playground but locally.

The Chat Playground is available at http://localhost:8080/playground on the machine where you have Tabby running.
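For the original "from a bash command line" part of the question: once the server is running with a chat model loaded, the chat backend can also be called directly from the shell. A minimal sketch, assuming Tabby exposes an OpenAI-compatible /v1/chat/completions endpoint on the default port 8080 (endpoint path and port are assumptions based on a default local setup):

```shell
# Send one chat message to a local Tabby server and print the raw JSON reply.
# Assumes the server was started with a valid --chat-model.
curl -s http://localhost:8080/v1/chat/completions \
  -H 'Content-Type: application/json' \
  -d '{
        "messages": [
          {"role": "user", "content": "Write a hello world program in C"}
        ]
      }'
```

The response follows the OpenAI chat-completions shape, so the reply text can be extracted with a JSON tool such as jq (e.g. `.choices[0].message.content`).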