-
Does the NVIDIA L40 support FlashAttention-2?
-
![image](https://github.com/rgnarok/Flutterwebsite/assets/121479721/69a5fda3-c201-47fd-a0c1-df82ff909ace)
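
The usual way to confirm this is to check the GPU's compute capability. A minimal sketch, assuming FlashAttention-2's documented requirement of an Ampere-or-newer GPU (compute capability 8.0+); the L40 is an Ada Lovelace card with compute capability 8.9, so it should pass:

```python
import torch

# Query the current CUDA device's compute capability and compare it
# against FlashAttention-2's minimum (assumed >= 8.0 per its README).
major, minor = torch.cuda.get_device_capability()
print(f"Compute capability: {major}.{minor}")
print("FlashAttention-2 supported:", (major, minor) >= (8, 0))
```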
-
### System Info
- `transformers` version: 4.40.1
- Platform: Windows-10-10.0.22631-SP0
- Python version: 3.11.9
- Huggingface_hub version: 0.22.2
- Safetensors version: 0.4.3
- Accelerate vers…
-
Hi. First of all, thank you so much for this awesome work. I really want to try it out, but the problem is that I use a few V100 cards, and unfortunately they don't support Flash Attention 2, so I was w…
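
One possible fallback (a sketch, not the project's official answer): recent `transformers` releases accept `attn_implementation="sdpa"`, which uses PyTorch's built-in scaled-dot-product attention and runs on pre-Ampere cards like the V100. The model name below is a hypothetical placeholder:

```python
import torch
from transformers import AutoModelForCausalLM

# "sdpa" selects torch.nn.functional.scaled_dot_product_attention instead
# of FlashAttention-2, which requires compute capability >= 8.0.
model = AutoModelForCausalLM.from_pretrained(
    "meta-llama/Llama-2-7b-hf",  # placeholder, swap in your model
    torch_dtype=torch.float16,
    attn_implementation="sdpa",
    device_map="auto",
)
```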
-
**Describe the bug**
I need to grab a retail video of his full attack, but I'm pretty sure after the camera follows him up to the enemy, the angle then stays at Albert's back the entire time instea…
-
[Flashing by OTA](https://github.com/devbis/z03mmc?tab=readme-ov-file#flashing-over-the-air-easy-way) successfully finished, but the device is not discoverable by zigbee coordinator. Reinstalling the …
-
### Proposal to improve performance
My GPU is too old to install the flash_attn package.
So I want to use vllm.attention.ops.triton_flash_attention to replace the flash_attn package.
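
As a stopgap sketch while the Triton kernel proposal is considered: recent vLLM releases (an assumption, check your version) read the `VLLM_ATTENTION_BACKEND` environment variable, so a backend that does not need the `flash_attn` wheel can be selected without code changes:

```python
import os

# Assumption: VLLM_ATTENTION_BACKEND is honored by this vLLM version;
# "XFORMERS" avoids the flash_attn dependency on GPUs that cannot use it.
# It must be set before vllm is imported.
os.environ["VLLM_ATTENTION_BACKEND"] = "XFORMERS"

from vllm import LLM

llm = LLM(model="facebook/opt-125m")  # small model, for illustration only
out = llm.generate("Hello, my name is")
print(out[0].outputs[0].text)
```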
### Report of pe…
-
### Describe the Bug
Paddle fails to build on the [release/3.0-beta](https://github.com/PaddlePaddle/Paddle/tree/release/3.0-beta) branch. According to the following error message, the version of …