-
I had an issue when queuing a prompt from ComfyUI. It shows:
ValueError: Phi3Transformer does not support an attention implementation through torch.nn.functional.scaled_dot_product_attention yet. Please …
-
Phi3Transformer does not support an attention implementation through torch.nn.functional.scaled_dot_product_attention yet. Please request the support for this architecture: https://github.com/huggin…
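A common workaround (a sketch, not an official fix) is to force the eager attention implementation when loading the model, so the unsupported SDPA path is never selected; the checkpoint name below is a placeholder:

```python
from transformers import AutoModelForCausalLM

# Placeholder checkpoint; substitute whatever model triggers the error.
model = AutoModelForCausalLM.from_pretrained(
    "your/phi3-based-checkpoint",
    attn_implementation="eager",  # avoid the SDPA code path the class does not support
    trust_remote_code=True,       # only needed if the checkpoint ships custom modeling code
)
```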
-
### Feature request
Add RemBERT to supported architectures for ONNX export.
### Motivation
The support for [RemBert](https://huggingface.co/docs/transformers/model_doc/rembert) was previously avail…
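For reference, exports of already-supported architectures go through Optimum's ONNX exporter; a sketch of what a RemBERT export could look like once the architecture is added (the output directory and task are assumptions):

```python
from optimum.exporters.onnx import main_export

# Hypothetical call; this only works once RemBERT is added to the supported architectures.
main_export(
    "google/rembert",        # Hub checkpoint
    output="rembert_onnx/",  # arbitrary output directory
    task="fill-mask",
)
```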
-
# Title
Introduction to the Transformer and its various descendants, like BERT, GPT-2, XLNet, etc.
## Description
I will be introducing the Transformer architecture in full, from its base (encoder, deco…
-
### Feature request
Recently, we have added the ability to load `gguf` files within [transformers](https://huggingface.co/docs/hub/en/gguf).
The goal was to offer the possibility to users …
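For context, loading a GGUF checkpoint currently looks roughly like this (a sketch; the repo ID and file name are placeholders):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder repo and GGUF file name; substitute a real quantized checkpoint.
model_id = "some-org/some-model-GGUF"
gguf_file = "some-model.Q4_K_M.gguf"

tokenizer = AutoTokenizer.from_pretrained(model_id, gguf_file=gguf_file)
model = AutoModelForCausalLM.from_pretrained(model_id, gguf_file=gguf_file)
```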
-
https://github.com/huggingface/transformers/blob/a06a0d12636756352494b99b5b264ac9955bc735/src/transformers/generation/utils.py#L2022
The error message says:
> A decoder-only architecture is being …
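The usual fix for that warning is to pad on the left when batching prompts for a decoder-only model; a minimal sketch, assuming a GPT-2-style tokenizer with no pad token:

```python
from transformers import AutoTokenizer

# Placeholder checkpoint; the key point is padding_side="left" for decoder-only generation.
tokenizer = AutoTokenizer.from_pretrained("gpt2", padding_side="left")
if tokenizer.pad_token is None:
    tokenizer.pad_token = tokenizer.eos_token  # GPT-2 defines no pad token by default

inputs = tokenizer(["Hello", "A much longer prompt"], padding=True, return_tensors="pt")
```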
-
![image](https://github.com/user-attachments/assets/3bc230bc-5029-4657-b107-0f1a1b54be15)
Error:
`Phi3Transformer does not support an attention implementation through torch.nn.functional.scaled_do…
-
Hi
Do you have code examples showcasing Section 5.3 of the _Revisiting Deep Learning Models for Tabular Data_ paper, "Obtaining feature importances from attention maps"?
I'm implementing the F…
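Not an official example, but a minimal sketch of how one might compute those importances, assuming the model returns per-layer attention tensors and that the [CLS] token sits at index 0 (some implementations append it last, so adjust the slicing):

```python
import torch

def attention_feature_importance(attention_maps: list[torch.Tensor]) -> torch.Tensor:
    """Average the [CLS] token's attention to each feature over layers, heads, and samples."""
    per_layer = []
    for attn in attention_maps:                              # each: (batch, heads, tokens, tokens)
        cls_to_features = attn[:, :, 0, 1:]                  # attention from [CLS] to every feature token
        per_layer.append(cls_to_features.mean(dim=(0, 1)))   # average over samples and heads
    importance = torch.stack(per_layer).mean(dim=0)          # average over layers
    return importance / importance.sum()                     # normalize to a distribution over features
```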
-
I do not know how to make sense of the error, so I am pasting it here:
Fetching 10 files: 100%|███████████████████████████████████████████████████████████████| 10/10 [04:55
-
Does hls4ml currently support transformer architectures?
I saw in the paper _Ultra Fast Transformers on FPGAs for Particle Physics Experiments_ that MHA support will be made available in the near fut…