NetEase-FuXi / EETQ
Easy and Efficient Quantization for Transformers
Apache License 2.0 · 180 stars · 14 forks

Issues
#34 Bug: Import of shard_checkpoint from transformers fails · opened by BenjaminBossan · 1 day ago · 1 comment
#33 torch version issue · opened by sasakiyori · 1 week ago · 1 comment
#32 aarch64/arm64 support · opened by khayamgondal · 1 month ago · 1 comment
#31 My system just updated CUDA to 12.6 and I can no longer compile EETQ. (Python 3.12) · opened by michael-newsrx · 3 months ago · 0 comments
#30 Unsupported Arch Assertion fail · opened by rahul3161 · 3 months ago · 2 comments
#29 Error after installation · opened by LChuanwMz · 3 months ago · 1 comment
#28 EETQ-quantized TrOCR gives nonsense output · by donjuanpond · closed 3 months ago · 2 comments
#27 EETQ wheel not building · opened by donjuanpond · 3 months ago · 3 comments
#26 Does it support Whisper model? · by kadirnar · closed 3 months ago · 2 comments
#25 add qwen2 · opened by ehartford · 4 months ago · 2 comments
#24 add Qwen2 · opened by ehartford · 4 months ago · 6 comments
#23 ImportError: cannot import name 'EetqConfig' from 'transformers', despite using 4.38.2 which satisfies >=4.27.0 · by moruga123 · closed 4 months ago · 2 comments
#22 Repetition with Llama3-70b and EETQ · by mjsteele12 · closed 2 months ago · 2 comments
#21 Does it support Vision Transformers? · by PaulaDelgado-Santos · closed 2 months ago · 2 comments
#20 Create LICENSE · by dtlzhuangz · closed 6 months ago · 0 comments
#19 Support CPU quantization · opened by xgal · 6 months ago · 4 comments
#18 License · by AlpinDale · closed 6 months ago · 1 comment
#17 QLoRA with EETQ is quite slow · opened by hjh0119 · 6 months ago · 3 comments
#16 FIX: Use `matmul` instead of `mm` in `backward` · by younesbelkada · closed 6 months ago · 0 comments
#15 PEFT compatible GEMM · by dtlzhuangz · closed 6 months ago · 0 comments
#14 How to dequant an EETQ model? · by mxjmtxrm · closed 6 months ago · 4 comments
#13 Integration with Hugging Face transformers library · by younesbelkada · closed 6 months ago · 2 comments
#12 Supports H100 · by mwbyeon · closed 4 months ago · 1 comment
#11 Modify code to support CUDA Graph · by jacob-crux · closed 9 months ago · 1 comment
#10 Quantization takes a very long time · opened by timohear · 9 months ago · 3 comments
#9 [docs] Update readme · by SidaZh · closed 9 months ago · 0 comments
#8 Add LoRAX to usage options in README. · by arnavgarg1 · closed 9 months ago · 4 comments
#7 rm dist · by dtlzhuangz · closed 10 months ago · 0 comments
#6 gemv optimization · by dtlzhuangz · closed 10 months ago · 0 comments
#5 Understanding EETQ and 8 bit quantization · by RonanKMcGovern · closed 11 months ago · 3 comments
#4 How to handle bfloat16? · by vgoklani · closed 11 months ago · 7 comments
#3 Why does EETQ take up all VRAM · by RonanKMcGovern · closed 11 months ago · 2 comments
#2 Installation error: "ERROR: Could not build wheels for EETQ, which is required to install pyproject.toml-based projects" · by linshuijin · closed 1 year ago · 5 comments
#1 Question on outlier handling · by 0xymoro · closed 11 months ago · 1 comment