-
https://github.com/huggingface/transformers/blob/c2820c94916e34baf4486accae74760972183a2f/src/transformers/integrations/peft.py#L71
When you load a trained adapter using `load_adapter`, it takes the…
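For reference, a minimal sketch of the call path being discussed, assuming a causal-LM base model; the adapter directory and adapter name below are placeholders, not values from the issue:

```python
# Minimal sketch of loading a trained PEFT adapter through the transformers
# integration linked above. "path/to/my-lora-adapter" is a placeholder for a
# locally trained adapter directory.
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained("facebook/opt-350m")
model.load_adapter("path/to/my-lora-adapter", adapter_name="my_adapter")
model.set_adapter("my_adapter")
```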
-
### Describe the bug
I installed text-generation-webui and downloaded the model (TheBloke_Yarn-Mistral-7B-128k-AWQ), but I can't run it. I chose Transformers as the model loader. I tried installing autoawq b…
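For context, a hedged sketch of loading the same AWQ checkpoint directly through transformers; this assumes `autoawq` is installed and that the Hub repo id matches the folder name above:

```python
# Sketch: loading the AWQ model via the Transformers loader path.
# Assumes autoawq installed correctly; the repo id is inferred from the
# model name mentioned above.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "TheBloke/Yarn-Mistral-7B-128k-AWQ"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")
```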
-
Traceback (most recent call last):
  File "/root/autodl-tmp/img2img-turbo-main/src/train_cyclegan_turbo.py", line 390, in <module>
    main(args)
  File "/root/autodl-tmp/img2img-turbo-main/src/train_cycle…
-
Hello,
It would be helpful to include documentation on how to trace a decoder-only transformer model for hosting on Inferentia. Currently, the only documentation that exists is for Encoder-Decoder …
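Until such documentation exists, here is a hedged sketch of what it might cover: tracing a small decoder-only model with `torch_neuronx.trace`. The model id, sequence length, and wrapper are illustrative placeholders, not an official AWS recipe:

```python
# Illustrative sketch (not official documentation): tracing a decoder-only
# model for Inferentia with torch-neuronx. The checkpoint and max_length are
# placeholders; the wrapper just gives the tracer fixed positional inputs.
import torch
import torch_neuronx
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "gpt2"  # placeholder decoder-only checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_id)
tokenizer.pad_token = tokenizer.eos_token
model = AutoModelForCausalLM.from_pretrained(model_id)
model.eval()

class DecoderWrapper(torch.nn.Module):
    """Return logits only, so the traced graph has a single tensor output."""
    def __init__(self, model):
        super().__init__()
        self.model = model

    def forward(self, input_ids, attention_mask):
        return self.model(input_ids=input_ids, attention_mask=attention_mask).logits

enc = tokenizer("Hello", return_tensors="pt", padding="max_length", max_length=128)
example_inputs = (enc["input_ids"], enc["attention_mask"])
traced = torch_neuronx.trace(DecoderWrapper(model), example_inputs)
torch.jit.save(traced, "decoder_only.neuron.pt")
```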
-
Running demo.py with the default parameters works correctly, but the following command produces an error:
python .\demo.py --image_path .\demo.png --ckpt_path U4R/StructTable-InternVL2-1B --output_format latex
T…
-
### 🐛 Describe the bug
I'm experiencing an issue exporting the `Mixtral-8x7B` model.
I originally discussed the problem in #130274, but this is more about `torch.export`. My ultimate goal is to g…
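For context, a hedged sketch of the `torch.export` flow in question, run on a small stand-in instead of Mixtral-8x7B; the checkpoint id and input here are placeholders:

```python
# Sketch of the torch.export flow; "tiny-model" is a placeholder for a small
# causal-LM checkpoint standing in for Mixtral-8x7B.
import torch
from torch.export import export
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "tiny-model"  # placeholder checkpoint id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id).eval()

enc = tokenizer("hello world", return_tensors="pt")
with torch.no_grad():
    exported = export(
        model,
        args=(),
        kwargs={"input_ids": enc["input_ids"], "attention_mask": enc["attention_mask"]},
    )
print(exported)  # prints the exported graph signature and module
```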
-
### System Info / 系統信息
I'm getting the following error when installing the dependencies for GLM-4V-9B. What could be the reason?
ERROR: pip's dependency resolver does not currently take into accou…
-
Hey,
thanks for the great work! Would you be interested in adding the necessary adjustments/configs to Hugging Face, so that the model can be loaded with the ```Auto``` functions from Huggingface `…
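For context, a hedged sketch of the registration that normally makes a custom architecture usable through the Auto classes; `MyConfig` and `MyModel` are placeholders for the repository's own classes, not code from this repo:

```python
# Illustrative registration pattern; MyConfig and MyModel stand in for the
# repository's actual configuration and model classes.
import torch
from transformers import AutoConfig, AutoModel, PretrainedConfig, PreTrainedModel

class MyConfig(PretrainedConfig):
    model_type = "my-model"  # unique model_type string used by AutoConfig

    def __init__(self, hidden_size=64, **kwargs):
        self.hidden_size = hidden_size
        super().__init__(**kwargs)

class MyModel(PreTrainedModel):
    config_class = MyConfig

    def __init__(self, config):
        super().__init__(config)
        self.proj = torch.nn.Linear(config.hidden_size, config.hidden_size)

    def forward(self, hidden_states):
        return self.proj(hidden_states)

# After registering, AutoConfig/AutoModel.from_pretrained can resolve
# checkpoints whose config.json declares model_type == "my-model".
AutoConfig.register("my-model", MyConfig)
AutoModel.register(MyConfig, MyModel)
```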
-
Hello,
I'm trying to set up a local SentenceTransformerEmbeddingModel:
```
sentence_transformer = SentenceTransformerEmbeddingModel(name='my-embedding-model',
…
-
### System Info
torch 2.4.1
transformers 4.46.0.dev0
trl 0.11.2
peft 0.13.1
GPU V100
CUDA …