-
When FA2 is enabled ("FA2=True" appears in the tuning output),
"Unsloth 2024.8: Fast Llama patching. Transformers = 4.44.2.
\\ /| GPU: NVIDIA GeForce RTX 4090. Max memory: 23.617 GB. Platform = Li…
-
I have a conda env where "FA2=True" and another env where "FA2=False" (as displayed in the terminal when running the finetuning script); the usable VRAM for tuning the same Gemma 2 model (2b or 9b) are the…
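For what it's worth, banners like "FA2=True" typically reflect whether the FlashAttention-2 package is importable in the active env. A minimal, hedged check (detecting the `flash_attn` package name is an assumption about how such banners are produced, not Unsloth's exact logic):

```python
import importlib.util

def has_flash_attn() -> bool:
    # True if the flash_attn package (FlashAttention-2) is importable
    # in the current environment, without actually importing it.
    return importlib.util.find_spec("flash_attn") is not None

print(f"FA2 available: {has_flash_attn()}")
```

Running this in each conda env should tell you which one actually has FlashAttention-2 installed, independent of what the finetuning script prints.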
-
# Bug Report
## Description
The problem was noticed in https://github.com/status-im/status-desktop/pull/16011. In a specific combination of SFPM with filters, a sorter, and an Aggregator, the chan…
-
currently on the main branch, we get ~70%.
-
This repo is not maintained anymore, so I've created a repo called [fa2_modified](https://github.com/AminAlam/fa2_modified) which contains the maintained version of fa2. Using this repo, you can insta…
-
Hi, thanks for your work. I'm trying your fa2-cm but it raises error because of the following assertion:
```python
assert do.is_contiguous()
assert q.stride() == k.stride() == v.stride() == o.strid…
```
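Assertions like these fail when the inputs are non-contiguous views (e.g. after a transpose). As a side note, here is a minimal sketch of what contiguity means and how a dense copy restores it; this uses a numpy analogue, since the original assertion is on PyTorch tensors, where the equivalent fix is calling `.contiguous()` before the kernel:

```python
import numpy as np

# A transposed array is a view with swapped strides, so it is no longer
# laid out in C (row-major) order.
q = np.random.rand(128, 64).T
assert not q.flags['C_CONTIGUOUS']

# ascontiguousarray copies the data into a dense row-major buffer,
# analogous to tensor.contiguous() in PyTorch.
q = np.ascontiguousarray(q)
assert q.flags['C_CONTIGUOUS']
```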
-
Are there plans to support FA2.1 contracts in the future? https://github.com/ligolang/generic-fa2.1/tree/main
-
Hello, I'm studying the fused_multi_head_attention example in CUTLASS.
The CUTLASS 3.5.1 README.md says the FlashAttention-2 kernel is in CUTLASS.
But fused_multi_head_attention is based on Meta/xFor…
-
It seems that fa2 is not compatible with Python 3.9:
`ERROR: Command errored out with exit status 1:
command: /usr/local/opt/python@3.9/bin/python3.9 -u -c 'import io, os, sys, setuptools, toke…
-
I ran `pip install harmonyTS` and got the following error:
```
Building wheels for collected packages: fa2
Building wheel for fa2 (setup.py) ... error
error: subprocess-exited-with-error
× p…
```