-
I'd like to chat with Tabby from a bash command line, like on your demo page.
At the moment I'm just using the Docker container, so I have /opt/tabby/bin/tabby and tabby-cpu.
How do I get a prompt?
Zibri updated 1 month ago
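One way to get a prompt from bash is to call the server's chat endpoint with `curl`. This is a minimal sketch, assuming the container exposes Tabby's OpenAI-compatible `/v1/chat/completions` endpoint on the default port 8080; the `tabby_chat` helper name is hypothetical:

```shell
#!/usr/bin/env bash
# Send one chat message to a locally running Tabby server and print
# the raw JSON response.  Assumes the OpenAI-compatible chat endpoint
# at http://localhost:8080/v1/chat/completions (adjust host/port to
# match your docker run flags).
tabby_chat() {
  local prompt="$1"
  curl -s http://localhost:8080/v1/chat/completions \
    -H 'Content-Type: application/json' \
    -d "{\"messages\":[{\"role\":\"user\",\"content\":\"${prompt}\"}]}"
}

# Usage: tabby_chat "Write a hello world in C"
```

Wrapping that in a `while read` loop would give a simple interactive prompt.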
-
When downloading the model "bigcode/starcoder" and the embedder "bert-nli-mean-tokens", although I have deleted the HuggingFace folder and everything from assignment 1 from the device, the disk still runs ou…
-
@b4rtaz Hey, thank you for your wonderful work. Could you please offer some details about how to add a supported model? For example, how to convert some ollama models like command+r or starcoder or l…
-
Hello!
I want to convert starcoder to the FasterTransformer format for inference. Here is the link: https://huggingface.co/bigcode/starcoder
This model belongs to GPTBigCode: [https://huggingface.co…
-
### System Info
nvidia A100 80G
centos7 x86_64
### Who can help?
@ncomly-nvidia @kaiyux @juney-nvidia
### Information
- [X] The official example scripts
- [ ] My own modified scripts
### Tasks
…
-
Hi. Thank you for your great work. Your approach is helpful to me. I am trying to fine-tune starcoder to enhance its C code performance, so the cost of your starcoder fine-tuning would be helpful to me. Could you …
-
(vicuna) ahnlab@ahnlab-desktop:~/GPT/StarCoder/GPTQ-for-SantaCoder$ python -m santacoder_inference bigcode/starcoderbase --wbits 4 --groupsize 128 --load starcoderbase-GPTQ-4bit-128g/model.pt
Loading…
-
```yaml
services:
  tabby:
    restart: always
    image: tabbyml/tabby
    entrypoint: /opt/tabby/bin/tabby-cpu
    command: serve --model StarCoder-1B --chat-model Qwen2-1.5B-Instruct
    vol…
```
-
```
python3 convert-hf-to-ggml.py bigcode/starcoderplus
```
is failing with the error: `RuntimeError: PytorchStreamReader failed reading zip archive: failed finding central directory`

% python3 convert-…
-
Currently only the following are enabled by default:
- baichat
- catgpt
- huggingchat
- openaigpt35turbo

But there are also:
- alpacalora
- baichat
- bard (disabled for now)
- catgpt
- hfdialogpt
- hfg…