-
### Feature request
The LLaMA 3 implementation should generate default `position_ids` that take the `attention_mask` into account.
@ArthurZucker @younesbelkada
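A minimal sketch of what such a default could look like, assuming left padding: positions are counted over non-padding tokens only, so padding does not shift the real token positions (the helper name `position_ids_from_mask` is hypothetical, not an existing `transformers` API):

```python
import torch

def position_ids_from_mask(attention_mask: torch.Tensor) -> torch.Tensor:
    # Cumulative count of real tokens, shifted so the first real token is position 0.
    position_ids = attention_mask.long().cumsum(-1) - 1
    # Positions under padding are never attended to; clamp them to a valid index.
    return position_ids.masked_fill(attention_mask == 0, 0)

mask = torch.tensor([[0, 0, 1, 1, 1]])  # left padding of length 2
print(position_ids_from_mask(mask))  # tensor([[0, 0, 0, 1, 2]])
```

With a plain `arange` default, the same left-padded batch would assign the first real token position 2 instead of 0, which is what this request is about.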
### Motivation
Is there a s…
-
```python
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM
@torch.compile(backend="turbine_cpu")
def test_gpt2_demo():
    tokenizer = AutoTokenizer.from_pretrained("gp…
-
**Is your feature request related to a problem? Please describe.**
Transformers that use LLM calls suffer from overloading their endpoints, resulting in errors like these:
```
2024-07-10T17:47:10…
```
tinco updated
2 months ago
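Until the endpoints themselves shed load gracefully, the usual client-side mitigation is exponential backoff with jitter around each LLM call. A minimal sketch, assuming a generic `call_llm` callable as a placeholder for whatever request is being rate-limited:

```python
import random
import time

def call_with_backoff(call_llm, max_retries=5, base_delay=1.0):
    """Retry a flaky call with exponential backoff and full jitter."""
    for attempt in range(max_retries):
        try:
            return call_llm()
        except Exception:
            if attempt == max_retries - 1:
                raise  # out of retries; surface the original error
            # Sleep a random amount up to base_delay * 2**attempt.
            time.sleep(random.uniform(0, base_delay * (2 ** attempt)))
```

Jitter matters here: if many transformers retry on a fixed schedule, they re-overload the endpoint in synchronized waves.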
-
A PMML representation of basic scikit-learn transformers and transformer pipelines / FeatureUnions is needed for this to be more broadly useful.
-
I think that the current API is not very user-friendly. Using Transformers would be a better approach. What do you think?
http://blog.danlew.net/2015/03/02/dont-break-the-chain/
-
**Describe the feature request**
It would be nice to be able to include bindings in transformers. Should match the format of existing JavaScript bindings: `$("binding_name")`
-
- [X] I have checked the [documentation](https://docs.ragas.io/) and related resources and couldn't resolve my bug.
**Describe the bug**
I have a locally hosted LLM which I am intending to use as a jud…
-
### System Info
- `transformers` version: 4.45.1
- Platform: Linux-5.4.0-153-generic-x86_64-with-glibc2.35
- Python version: 3.10.12
- Huggingface_hub version: 0.25.1
- Safetensors version: 0.4.5…
-
Hi, I am getting the following error during inference after training completes.
File "v2_main.py", line 156, in
generated_ids = model.generate(**inputs, max_new_tokens=40)
File "/home/u…
-
Hi team,
I'm trying to evaluate VideoLLaMA2 on MVBench. When I run inference_video_mcqa_mvbench.py, the following traceback occurs:
```
Traceback (most recent call last):
File "/***/Video…