-
# ❓ Questions and Help
I am new to xformers, and I want to use it to speed up my Transformer models. But I found that `xformers` gives no speedup compared with `scaled_dot_product_attention` from PyTorch. Here …
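For context on the comparison in the question: `torch.nn.functional.scaled_dot_product_attention` and `xformers.ops.memory_efficient_attention` compute the same mathematical quantity and differ only in kernel fusion, so any speedup depends on shapes, dtype, and hardware rather than on the result. A minimal NumPy reference sketch of what both APIs compute (shapes and scaling are assumptions based on the usual `(batch, heads, seq, head_dim)` layout):

```python
# Reference (non-fused) scaled dot-product attention in NumPy.
# Fused kernels (PyTorch SDPA, xformers) compute this same output;
# benchmarks should therefore compare wall time, not values.
import numpy as np

def sdpa_reference(q, k, v):
    """q, k, v: arrays of shape (batch, heads, seq, head_dim)."""
    d = q.shape[-1]
    scores = q @ k.transpose(0, 1, 3, 2) / np.sqrt(d)  # (B, H, S, S)
    # numerically stable softmax over the key axis
    scores -= scores.max(axis=-1, keepdims=True)
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v  # (B, H, S, head_dim)

rng = np.random.default_rng(0)
q = rng.standard_normal((1, 2, 4, 8))
k = rng.standard_normal((1, 2, 4, 8))
v = rng.standard_normal((1, 2, 4, 8))
out = sdpa_reference(q, k, v)
print(out.shape)  # (1, 2, 4, 8)
```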
-
Hi all,
I found that the position embeddings are concatenated with the word embeddings in the embedding layer.
https://github.com/openai/finetune-transformer-lm/blob/bd1cf7d678926041e6d19193ca…
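A note on the terminology in the question: in that repository the position embeddings are concatenated to the *token embedding matrix* as extra rows (along the vocabulary axis), and at lookup time the token vector and position vector are summed, not concatenated along the feature dimension. A NumPy sketch of that lookup, with dimensions chosen for illustration:

```python
# Sketch of the embedding lookup in finetune-transformer-lm:
# one table holds token rows followed by learned position rows,
# inputs carry (token_id, position_index) pairs, and the two
# gathered vectors are summed.
import numpy as np

n_vocab, n_ctx, n_embd = 10, 4, 8
rng = np.random.default_rng(0)
# rows [0, n_vocab) are token embeddings;
# rows [n_vocab, n_vocab + n_ctx) are position embeddings
we = rng.standard_normal((n_vocab + n_ctx, n_embd))

token_ids = np.array([3, 1, 4, 1])             # (seq,)
pos_ids = n_vocab + np.arange(len(token_ids))  # offsets into the same table
x = np.stack([token_ids, pos_ids], axis=-1)    # (seq, 2) index pairs

h = we[x].sum(axis=-2)                         # (seq, n_embd): token + position
print(h.shape)  # (4, 8)
```

So the final hidden input is additive; the "concatenation" only describes how the two tables are stored in one matrix.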
-
### System Info
Copy-and-paste the text below in your GitHub issue and FILL OUT the two last points.
- `transformers` version: 4.43.1
- Platform: Linux-5.15.0-112-generic-x86_64-with-glibc2.35
-…
-
The BERT and ALBERT models I use are the PyTorch versions downloaded from the official links (3 files each).
The transformers version is 2.4.1, which matches the requirement.
Running fails with the error:
"Unable to load weights from pytorch checkpoint file. "
OSError: Unable to load weights from pytorch checkpoint fil…
-
I'm running my ComfyUI as a server. I'm using the exact same workflow, and the only thing that changes is the model. Even though I'm using --highvram, the model is still partially reloading when I swa…
-
**System information**
- Alpa version:
- Are you willing to contribute it (Yes/No): Maybe
**Describe the new feature and the current behavior/state**
Most models in the transformers library, l…
-
### 🐛 Describe the bug
When I load a model with AutoLigerKernelForCausalLM, I get `ValueError: Pointer argument (at 0) cannot be accessed from Triton (cpu tensor?)`.
When loading the model Apply Model…
-
### Reminder
- [X] I have read the README and searched the existing issues.
### System Info
llamafactory version: 0.9.0
Platform: Linux
Python version: 3.10.14
PyTorch version: 2.4.1
Transforme…
-
Here is the full log trace:
> Enter a name for your Llama Stack (e.g. my-local-stack): test
> Enter the image type you want your Llama Stack to be built as (docker or conda): docker
Llama S…
-
Does Pyodide support PyTorch?
Or does WebAssembly (wasm) support PyTorch?