Zibri opened this issue 6 months ago
In other words, I wish to set up locally the same (or a similar) model you have on your demo page, and test it out without an active internet connection.
/opt/tabby/bin/tabby-cpu serve --model StarCoder-1B --chat-model StarCoder-3B
Writing to new file.
🎯 Downloaded https://huggingface.co/TabbyML/models/resolve/main/starcoderbase-3B.Q8_0.gguf to /data/models/TabbyML/StarCoder-3B/ggml/q8_0.v2.gguf.tmp
00:01:00 ▕████████████████████▏ 3.15 GiB/3.15 GiB 53.01 MiB/s ETA 0s.
2024-05-09T16:19:26.760612Z  INFO tabby::serve: crates/tabby/src/serve.rs:123: Starting server, this might take a few minutes...
2024-05-09T16:19:28.837563Z ERROR tabby::services::model: crates/tabby/src/services/model/mod.rs:32: Chat model requires specifying prompt template
Alternatively, I wish to have the same thing you have at https://demo.tabbyml.com/playground, but locally.
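The ERROR line in the log above means that StarCoder-3B is registered in Tabby's model registry as a completion model only, without a chat prompt template, so it cannot be passed as --chat-model. A minimal sketch of an invocation that avoids the error, assuming your Tabby version's registry includes a chat-capable model such as Mistral-7B (registered model names vary between releases, so check the registry shipped with your version):

    # Sketch: keep StarCoder-1B for completion, use a registry model that
    # ships a chat prompt template (assumed here: Mistral-7B) as the chat model
    /opt/tabby/bin/tabby-cpu serve --model StarCoder-1B --chat-model Mistral-7B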
@Zibri
> Alternatively, I wish to have the same thing you have at https://demo.tabbyml.com/playground, but locally.
The Chat Playground is available at http://localhost:8080/playground on the machine where you have Tabby running.
I wish to chat with Tabby like on your demo page, but from a bash command line. At the moment I just use the Docker container, so I have /opt/tabby/bin/tabby and tabby-cpu.
How do I get a prompt?
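For chatting from a bash shell rather than the browser, one option is to call the HTTP chat API that backs the playground directly with curl. A minimal sketch follows; the endpoint path shown (/v1/chat/completions, OpenAI-style) is an assumption, since older Tabby builds exposed it under a different prefix, and an Authorization header may be required if auth is enabled on your instance, so verify both against the API documentation served by your own server:

    # Sketch (assumed endpoint): OpenAI-style chat completion against the local Tabby server
    curl -s http://localhost:8080/v1/chat/completions \
      -H "Content-Type: application/json" \
      -d '{"messages":[{"role":"user","content":"How do I reverse a list in Python?"}]}'

If your instance requires authentication, add a bearer token header (e.g. -H "Authorization: Bearer <your-token>") using a token generated from the Tabby web UI.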