-
There is a growing need within the ML community to perform efficient model inference using high-performance Llama models, such as Llama 3.2 (1B or 3B parameters), on Google TPU v4 hardware. However, s…
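A rough sketch of the workflow in question, assuming a TPU VM with `torch_xla` installed; the checkpoint id, prompt, and generation settings are illustrative only:

```python
# Rough sketch: run Llama 3.2 1B on a TPU v4 core through PyTorch/XLA.
# Assumes a TPU VM with `torch`, `torch_xla`, and `transformers` installed.
import torch_xla.core.xla_model as xm
from transformers import AutoModelForCausalLM, AutoTokenizer

device = xm.xla_device()  # first available TPU core

model_id = "meta-llama/Llama-3.2-1B"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id).to(device)

inputs = tokenizer("TPUs are", return_tensors="pt").to(device)
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Eager generation like this tends to recompile on XLA as sequence lengths change, so throughput is far from what the hardware can deliver.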
-
### Describe the bug
I installed text generation webui and downloaded the model (TheBloke_Yarn-Mistral-7B-128k-AWQ), but I can't run it. I chose Transformers as the Model loader. I tried installing autoawq b…
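For reference, plain Transformers can load AWQ checkpoints directly once `autoawq` is installed on a CUDA machine; a minimal sketch outside the webui, assuming the corresponding Hub repo id:

```python
# pip install autoawq
# Sketch: load an AWQ-quantized checkpoint directly with Transformers (CUDA GPU required).
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "TheBloke/Yarn-Mistral-7B-128k-AWQ"  # Hub repo id (assumed)
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

inputs = tokenizer("Hello", return_tensors="pt").to(model.device)
print(tokenizer.decode(model.generate(**inputs, max_new_tokens=20)[0]))
```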
-
Would you consider supporting [Transformers.js](https://github.com/huggingface/transformers.js) as an additional provider? This library provides access to text generation and content classification fu…
-
### Feature request
Optimize Transformers' image_processors to decrease image processing time and reduce inference latency for vision models and VLMs.
### Motivation
The Transformers library relie…
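One related knob that already exists is the opt-in fast (torchvision-backed) image processor classes; a minimal sketch, with the checkpoint name as a placeholder:

```python
# Sketch: opt into the torchvision-backed fast image processor for checkpoints that ship one.
from PIL import Image
from transformers import AutoImageProcessor

processor = AutoImageProcessor.from_pretrained(
    "openai/clip-vit-base-patch32",  # placeholder checkpoint
    use_fast=True,                   # select the *Fast image processor class
)
image = Image.new("RGB", (640, 480))  # stand-in for a real image
pixel_values = processor(images=image, return_tensors="pt").pixel_values
print(pixel_values.shape)
```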
-
### System Info
transformers version: 4.44.2
python version: 3.11.6
system OS: Linux
### Who can help?
_No response_
### Information
- [ ] The official example scripts
- [X] My own modified scrip…
-
Molmo7BDbnb
all() received an invalid combination of arguments - got (Tensor, dim=tuple, keepdim=bool), but expected one of: * (Tensor input, *, Tensor out) didn't match because some of the keywords …
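The traceback points at `torch.all` being called with a tuple for `dim`, which older PyTorch builds reject; a small reproduction and version-agnostic workaround, assuming that is the trigger:

```python
import torch

x = torch.zeros(2, 3, 4, dtype=torch.bool)

# On older PyTorch releases this raises the TypeError above, because `dim`
# only accepts a single int there; newer releases also accept a tuple of ints.
# mask = torch.all(x, dim=(1, 2), keepdim=True)

# Version-agnostic workaround: reduce one dimension at a time.
mask = x.all(dim=2, keepdim=True).all(dim=1, keepdim=True)
print(mask.shape)  # torch.Size([2, 1, 1])
```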
-
### What behavior of the library made you think about the improvement?
I need to install torch, transformers, accelerate, etc. even if I only want to use outlines with the llamacpp backend.
Are these d…
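A minimal sketch of the llama.cpp-only usage in question, assuming the `models.llamacpp` entry point from pre-1.0 outlines releases; repo id and GGUF filename are illustrative:

```python
# pip install outlines llama-cpp-python
# Sketch: generate text through the llama.cpp backend only
# (entry-point name assumed from pre-1.0 outlines releases).
from outlines import generate, models

model = models.llamacpp(
    "TheBloke/Mistral-7B-Instruct-v0.2-GGUF",  # Hub repo id (illustrative)
    "mistral-7b-instruct-v0.2.Q4_K_M.gguf",    # GGUF filename (illustrative)
)
generator = generate.text(model)
print(generator("Outlines works with llama.cpp by", max_tokens=32))
```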
-
### Feature request
https://github.com/huggingface/transformers/pull/28556
The core transformers library is moving to allow an extra parameter on WhisperModel to bring initial_prompt in.
A…
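For comparison, current releases already let a prompt be injected at generation time via `prompt_ids`; a minimal sketch, with a silent dummy waveform standing in for real audio:

```python
# Sketch: prime Whisper decoding with a prompt through the existing prompt_ids path.
import numpy as np
from transformers import WhisperForConditionalGeneration, WhisperProcessor

processor = WhisperProcessor.from_pretrained("openai/whisper-tiny")
model = WhisperForConditionalGeneration.from_pretrained("openai/whisper-tiny")

audio = np.zeros(16_000, dtype=np.float32)  # 1 s of silence, placeholder for real 16 kHz audio
input_features = processor(audio, sampling_rate=16_000, return_tensors="pt").input_features

prompt_ids = processor.get_prompt_ids("Glossary: Transformers, PyTorch", return_tensors="pt")
predicted_ids = model.generate(input_features, prompt_ids=prompt_ids)
print(processor.batch_decode(predicted_ids, skip_special_tokens=True))
```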
-
_Filing this question on behalf of a user from a private thread._
The auto-generated plan sees that a column is PII, but it is an SSN without the dashes …. Will sdtype = ssn still work given no da…
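A minimal sketch of forcing the PII treatment in the metadata, assuming the SDV 1.x single-table API and a column literally named `ssn`; the data and synthesizer choice are illustrative:

```python
# Sketch: declare a dash-less SSN column as PII so the synthesizer anonymizes it.
import pandas as pd
from sdv.metadata import SingleTableMetadata
from sdv.single_table import GaussianCopulaSynthesizer

data = pd.DataFrame({
    "ssn": ["123456789", "987654321", "555123456"],  # no dashes, as in the question
    "age": [34, 51, 28],
})

metadata = SingleTableMetadata()
metadata.detect_from_dataframe(data)
metadata.update_column(column_name="ssn", sdtype="ssn", pii=True)

synthesizer = GaussianCopulaSynthesizer(metadata)
synthesizer.fit(data)
print(synthesizer.sample(num_rows=3))
```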
-
### System Info
Python version: 3.11.0
PyTorch version: 2.4.1 or 2.5.0
Transformers version: 4.46.0
TRL version: 0.11.4
PEFT version: 0.13.2
### In…