-
Currently, building the StarCoder GPT variant fails if smooth quant is applied.
Is it planned to be supported? Any advice on how to build it?
-
Hey, I'm running into some problems with cleaning the generated code. It would be appreciated if you could help me out.
After I used the clean_generations.py script to clean the generated translations of StarCo…
-
`python3 convert-hf-to-ggml.py bigcode/starcoderplus`
is failing with the error: `RuntimeError: PytorchStreamReader failed reading zip archive: failed finding central directory`
```
% python3 convert-…
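# Note (my guess, not confirmed by the traceback): "failed finding central directory"
# usually means the pytorch_model.bin in the HF cache is incomplete or corrupted.
# Re-fetching the weights before re-running the converter often helps:
% python3 -c "from huggingface_hub import snapshot_download; snapshot_download('bigcode/starcoderplus', force_download=True)"
% python3 convert-hf-to-ggml.py bigcode/starcoderplus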
-
I am facing some issues when using DeepSpeed for fine-tuning the StarCoder model. I am following exactly the steps described in this article: [Creating a Coding Assistant with StarCoder](https://huggingfac…
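A minimal sketch of how DeepSpeed is typically wired into a `transformers` fine-tuning run (not the article's exact script; the ZeRO stage, dataset, and paths below are placeholders):

```python
# Minimal sketch: transformers Trainer + DeepSpeed (ZeRO stage 2).
# "ds_config.json", the hyperparameters, and the output path are placeholders,
# not the exact values from the blog post.
import json
from transformers import AutoModelForCausalLM, AutoTokenizer, Trainer, TrainingArguments

ds_config = {
    "train_micro_batch_size_per_gpu": "auto",
    "gradient_accumulation_steps": "auto",
    "zero_optimization": {"stage": 2},
    "bf16": {"enabled": "auto"},
}
with open("ds_config.json", "w") as f:
    json.dump(ds_config, f)

tokenizer = AutoTokenizer.from_pretrained("bigcode/starcoder")
model = AutoModelForCausalLM.from_pretrained("bigcode/starcoder")

args = TrainingArguments(
    output_dir="starcoder-finetuned",   # placeholder output path
    per_device_train_batch_size=1,
    gradient_accumulation_steps=8,
    bf16=True,
    deepspeed="ds_config.json",         # hands the ZeRO config to DeepSpeed
)
# train_dataset should be your tokenized dataset; omitted here.
trainer = Trainer(model=model, args=args, train_dataset=None)
# trainer.train()   # launch the script with: deepspeed train.py
```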
-
Hi, I have a question about the WizardCoder model, especially the 1B size. I read the WizardCoder paper and can see that WizardCoder-15B is fine-tuned from StarCoder 15B. I want to ask whether th…
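A quick way to check the lineage is to compare the two checkpoints' configs on the Hub; a small sketch follows (the WizardCoder 1B repo id is an assumption, substitute the checkpoint in question):

```python
# Sketch: compare architecture hyperparameters of the two checkpoints.
# "WizardLM/WizardCoder-1B-V1.0" is assumed to be the 1B checkpoint being asked about.
from transformers import AutoConfig

wizard = AutoConfig.from_pretrained("WizardLM/WizardCoder-1B-V1.0")
starcoder = AutoConfig.from_pretrained("bigcode/starcoderbase-1b")

for name in ("model_type", "n_layer", "n_embd", "n_head", "vocab_size"):
    print(name, getattr(wizard, name, None), getattr(starcoder, name, None))
# Identical shapes are consistent with fine-tuning from StarCoderBase-1B,
# but only the authors can confirm the actual training recipe.
```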
-
Hello!
I want to convert StarCoder to the FasterTransformer format for inference. Here is the link: https://huggingface.co/bigcode/starcoder
This model belongs to GPTBigCode:[https://huggingface.co…
-
Hi,
First of all, thank you for your excellent work on this project. I tried to run your code on my machine, but I encountered an error that seems to indicate an issue with loading the model from a n…
-
**Describe the bug**
Getting a segmentation fault while running tabby
```
~ tabby serve --device metal --model StarCoder-1B --chat-model Qwen2-1.5B-Instruct
🔑 Parsed .netrc from: .netrc
Writing …
-
Hi, I am trying to run the fine-tuning code on my computer, but I got `KeyError: 'response'`. The environment was installed according to the README.
Traceback (most recent call last):
File "/home/star…
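For context, a `KeyError: 'response'` during prompt formatting typically means the loaded dataset has no column named `response`; here is a sketch of how one might check and remap the columns (the dataset id and the `output` column name are placeholders):

```python
# Sketch: inspect the dataset columns and rename one to what the fine-tuning
# script expects. "your/dataset" and "output" are placeholders; use the dataset
# configured in the repo's README.
from datasets import load_dataset

ds = load_dataset("your/dataset", split="train")
print(ds.column_names)  # e.g. ['instruction', 'output'] rather than 'response'

if "response" not in ds.column_names and "output" in ds.column_names:
    ds = ds.rename_column("output", "response")
```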
-
I installed all the dependencies by following the instructions from the repo.
After that, I run the following code:
```
import torch
from transformers import AutoModelForCausalLM, A…
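# (Sketch of what typically follows, since the snippet above is cut off; the model id
#  and generation settings are assumptions, not necessarily the repo's exact code.)
tokenizer = AutoTokenizer.from_pretrained("bigcode/starcoder")
model = AutoModelForCausalLM.from_pretrained(
    "bigcode/starcoder",
    torch_dtype=torch.float16,   # assumes a GPU; drop this for CPU-only runs
    device_map="auto",           # requires the accelerate package
)
inputs = tokenizer("def fibonacci(n):", return_tensors="pt").to(model.device)
print(tokenizer.decode(model.generate(**inputs, max_new_tokens=32)[0]))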